Asyncio in Python - Full Tutorial
Summary
TL;DR: Asynchronous programming in Python with asyncio allows tasks to begin without waiting for others to finish, increasing efficiency, especially for operations with long wait times like network or file access. Choosing the right concurrency model is crucial: async I/O for high-wait tasks, threads for I/O-bound tasks that share data, and processes for CPU-intensive tasks. The central structure is the event loop, which manages task execution. Key concepts include coroutines, tasks, futures, and synchronization tools such as locks, semaphores, and events. Coroutine functions, defined with 'async def', return coroutine objects that must be awaited to execute. Tasks enable concurrent execution of coroutines and can be managed with functions like 'gather'. Synchronization primitives ensure orderly task execution and data integrity. Understanding these concepts is crucial for writing efficient asynchronous Python code.
Takeaways
- 🚀 Asynchronous programming allows tasks to run independently, starting before other tasks finish, enhancing efficiency.
- 🕒 Async I/O is most beneficial for tasks with long wait times, such as network or file operations.
- 💼 Threads are ideal for IO-bound tasks that might also share data, running in parallel.
- ⚙️ Processes are suited for CPU-heavy tasks, using multiple cores for maximum performance.
- 🔄 The Event Loop in Python’s asyncio manages task distribution asynchronously, keeping the program efficient.
- 🔧 A coroutine function, defined with 'async def', returns a coroutine object that must be awaited to execute.
- ✍️ Tasks allow scheduling of multiple coroutines, enabling concurrent operations for efficient performance.
- 📊 The gather function runs multiple awaitables concurrently and collects their results in a list, in argument order.
- 🔐 Locks and Semaphores in async programming control task access to shared resources, avoiding conflicts.
- 🛠️ Events serve as synchronization tools for simple task flagging operations.
Timeline
- 00:00:00 - 00:05:00
In traditional synchronous programming, tasks are handled linearly, causing delays if one task is halted. Asynchronous programming allows tasks to be executed concurrently, enhancing efficiency by not waiting unnecessarily; it's particularly useful for operations with inherent waiting times, such as network requests. The speaker outlines how asyncio, threads, and processes should be chosen based on task needs, highlighting the role of the event loop in managing and distributing tasks efficiently in asynchronous programming.
- 00:05:00 - 00:10:00
The creation and management of an event loop in Python's asyncio is discussed. It starts with importing the asyncio module and calling asyncio.run() with a coroutine; calling a coroutine function yields a coroutine object. The importance of awaiting coroutine objects for execution is emphasized. The speaker introduces the 'await' keyword for executing coroutines and uses examples to illustrate differences in execution timing depending on how and when coroutines are awaited.
- 00:10:00 - 00:15:00
The speaker explains tasks in asynchronous programming, highlighting how they schedule and run coroutines as soon as possible without needing to wait for prior tasks to complete. This optimizes efficiency by switching tasks when one is idle. Different methods like create_task and gather are shown to run coroutines concurrently. The concept of a task group is introduced for organizing multiple tasks together, emphasizing error handling and task scheduling.
- 00:15:00 - 00:24:58
Synchronization primitives such as locks, semaphores, and events are introduced to manage access to shared resources in complex programs. Locks ensure only one coroutine accesses a critical section at a time. Semaphores allow limited concurrent access to resources. Events are described as simple boolean flags that block code execution until a condition is met. These tools help maintain organized and error-free asynchronous operations, with tasks managing concurrent executions efficiently.
Video Q&A
What is asynchronous programming?
Asynchronous programming allows multiple tasks to be started and run concurrently, without waiting for each task to finish before moving on to the next.
When should async I/O be used?
Async I/O is ideal for tasks involving long wait times, such as network requests or reading files, without much CPU usage.
What are Python coroutines?
Coroutines in Python are functions defined with 'async' that return coroutine objects, requiring 'await' to execute.
What role does the event loop play in async programming?
The event loop in Python's asyncio handles and distributes tasks efficiently by managing their execution asynchronously.
How do tasks differ from coroutines?
Tasks schedule coroutines to run as soon as possible, allowing multiple coroutines to run concurrently.
- 00:00:00 Imagine programming is a journey from point A to point D. In traditional synchronous programming we travel in a straight line, stopping at each point before moving to the next; this means if there's a delay at any point, everything pauses until we can move on. Asynchronous programming changes the game: it allows us to start tasks at B, C, and D even if the task at A isn't finished yet. It's like sending out scouts to explore multiple paths at once, without waiting for the first scout to return before sending out the next. This way our program can handle multiple tasks at the same time, making it more efficient, especially when dealing with operations that have waiting times, like loading a web page. And that's the essence of asynchronous programming: making our code more efficient by doing multiple things at once, without the unnecessary waiting.
- 00:00:48 So now let's quickly discuss when we should use asyncio, because when we build software, choosing the right concurrency model and picking between asyncio, threads, or processes is crucial for performance and efficiency. Asyncio is your choice for tasks that wait a lot, like network requests or reading files; it excels at handling many tasks concurrently without using much CPU power, making your application more efficient and responsive when you're waiting on a lot of different tasks. Threads are suited for tasks that may need to wait but also share data; they can run in parallel within the same application, making them useful for tasks that are I/O-bound (input/output-bound) but less CPU-intensive. For CPU-heavy tasks, processes are the way to go: each process operates independently, maximizing CPU usage by running in parallel across multiple cores, which is ideal for intensive computations. In summary: choose asyncio for managing many waiting tasks efficiently, threads for parallel tasks that share data with minimal CPU use, and processes for maximizing performance on CPU-intensive tasks.
- 00:01:58 Now that we know when to use asyncio, let's dive into the five key concepts we need to understand. The first concept is the event loop. In Python's asyncio, the event loop is the core that manages and distributes tasks. Think of it as a central hub with tasks circling around it, waiting for their turn to be executed. Each task takes its turn in the center, where it's either executed immediately or paused if it's waiting for something, like data from the internet. When a task awaits, it steps aside, making room for another task to run, ensuring the loop is always efficiently utilized. Once the awaited operation is complete, the task resumes, ensuring a smooth and responsive program flow. That's how asyncio's event loop keeps your Python program running efficiently, handling multiple tasks asynchronously.
- 00:02:47 So, just a quick pause here for any of you who are serious about becoming software developers: if you want to be like Max, who landed a 70k-per-year job in just 4 months of work, consider checking out my program with Course Careers. It teaches you the fundamentals of programming, but also lets you pick a specialization taught by an industry expert in front end, back end, or DevOps. Beyond that, we help you prepare your resume, give you tips to optimize your LinkedIn profile, and show you how to prepare for interviews; we really only succeed if our students actually get jobs, and that's the entire goal of the program. If that's at all of interest to you, we have a free introduction course with a ton of value, no obligation, no strings attached; you can check it out for free from the link in the description.
- 00:03:29 So now that we understand what the event loop is, it's time to look at how we create one, and then talk about the next important concept: coroutines. Whenever we start writing asynchronous code in Python, we begin by importing the asyncio module. This is built into Python; you don't need to install it. For the purpose of this video I'll be referencing all of the features in Python version 3.11 and above, so if you're using an older version of Python, make sure you update it, because some things have changed in the recent versions. We begin by importing the module, then we use the line asyncio.run() and pass it something known as a coroutine; calling a coroutine function returns a coroutine object. asyncio.run() starts our event loop, and it starts it by running a coroutine. In our case there are two types of coroutines we're concerned with: the coroutine function itself, and what's returned when you call a coroutine function. I know it seems a bit strange, but when you call main() like this, when it's defined using the async keyword, it returns something known as a coroutine object. That coroutine object is what we pass to asyncio.run(); it will wait for that to finish, and it starts the event loop for us, where it handles all of our asynchronous programming. So, to recap: import the module, define some asynchronous functions with the async keyword (each is known as a coroutine function), then call the function and pass the result to asyncio.run(); that starts your event loop and allows you to start running asynchronous code from this entry point.
- 00:05:12 To illustrate this a bit further, let's look at the difference between an asynchronous function, something defined with the async keyword, and a normal function. Watch what happens if I go here and simply call this function.
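The call-without-await behavior just described can be sketched as follows (a minimal illustration; `main` returning 42 is my own addition, not the video's exact code):

```python
import asyncio

async def main():
    print("start of main coroutine")
    return 42

# Calling a coroutine function does NOT run its body;
# it only builds a coroutine object.
coro = main()
print(type(coro))   # <class 'coroutine'>
coro.close()        # close it, or Python warns: "coroutine 'main' was never awaited"

# asyncio.run() starts the event loop and awaits the coroutine for us.
result = asyncio.run(main())   # prints "start of main coroutine"
print(result)                  # 42
```

Note that the `print` inside `main` only fires on the `asyncio.run(main())` line, never when the coroutine object is first created.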
- 00:05:24 Some of you may assume it's simply going to print out 'start of main coroutine', but you'll see that's actually not the case. I know my terminal is a little messy here, but it says: coroutine 'main' was never awaited. The reason we get that issue is that when we call the function here, what we're actually doing is generating a coroutine object, and this coroutine object needs to be awaited in order for us to actually get the result of its execution. If we want to see this even more visually, we can print out what we get when we call the main function, and notice that we actually get this coroutine object. So when you call a function defined with the async keyword, it returns a coroutine object, and that coroutine object needs to be awaited in order for it to actually execute. That's why we use the asyncio.run() syntax: it handles awaiting this coroutine and then allows us to write some more asynchronous code.
- 00:06:21 The next thing we need to look at is the await keyword. The await keyword is what we use to await a coroutine, allowing it to execute and giving us its result. The thing is, though, we can only use the await keyword inside of an asynchronous function, that is, inside of a coroutine. So let's write another coroutine and see how we would await it and get its result. I've included a slightly more complex example where we're actually waiting on a different coroutine, just to see how that works. Notice that we have a coroutine up here whose aim is to simulate some input/output-bound operation: that could be going to the network and retrieving some data, or trying to read a file, something that's not controlled by our program and whose result we're going to wait on. In this case you can see that we fetch some data: we delay, sleeping for a certain number of seconds just to simulate that I/O-bound operation, then we get the data and return it. We know this is a coroutine because we've defined it as an asynchronous function. Remember that in order for a coroutine to actually be executed, it needs to be awaited. In this case, what we do is create a variable called task, and this "task" is just the coroutine object. At this point in time the coroutine object is not yet being executed, because it hasn't been awaited. What I'm trying to show you is that when you call an asynchronous function, it returns a coroutine, and that coroutine needs to be awaited before it will actually start executing. So in this case we now await it; when we await it, it starts executing, and we wait for it to finish before we move on to the rest of the code in our program.
- 00:07:57 So let's run the code and see what the output is. You can see it says 'start of main coroutine', 'data fetched', it then receives the result, and it prints the end of the main coroutine. Now let's clear that and look at a slightly different example. Let's take this result code right here, get rid of this comment, and put it at the end of the function. So now what we have is: print 'start of main coroutine', create the coroutine object, print 'end of main coroutine', and only then await the coroutine; I just want to show you the difference in the result we're going to get. Let's run the code: notice we get 'start of main coroutine', 'end of main coroutine', and only then 'fetching data', 'data fetched', and the result. The reason is that we only created the coroutine object here; we didn't yet await it, so it wasn't until we hit this line that we waited for its execution to finish before moving on to the next line.
- 00:08:58 It's really important to understand that fact: a coroutine doesn't start executing until it's awaited, or until we wrap it in something like a task, which we're going to look at later. I've made a slight variation to the last example: you can see that we're now creating two different coroutine objects and then awaiting them. I want you to pause the video and take a guess at what you think the output's going to be, and how long you think it will take to execute. Go ahead, pause the video; I'm going to run the code now and explain what happens. When I run this, you'll see that we get 'fetching data, id=1', 'data fetched, id=1', we then receive the result, and only then do we go ahead and fetch it for id=2. Let's clear this and run it one more time: you can see that it takes 2 seconds to fetch the first result, and another 2 seconds to fetch the second.
- 00:09:48 Now this might seem counterintuitive, because you may have guessed that when we created these two coroutine objects they would start running concurrently, meaning it would take a total of only 2 seconds and we'd immediately get both results. But remember: a coroutine doesn't start running until it's awaited. In this case we actually wait for the first coroutine to finish, and only once it has finished do we even start executing the second coroutine, meaning we haven't really gained any performance benefit here; we've just created a way to wait for a task to finish. That's all we've really learned at this point. Now that we understand this concept, we can move on and talk about tasks, and see how we can actually speed up an operation like this and run both of these coroutines at the same time.
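The sequential-await pitfall above can be sketched like this (a hedged approximation of the video's example; `fetch_data`, the ids, and the 0.2-second delays are my assumptions, shortened so the example runs quickly):

```python
import asyncio
import time

async def fetch_data(id: int, delay: float) -> dict:
    print(f"fetching data, id={id}")
    await asyncio.sleep(delay)      # simulate an I/O-bound wait
    print(f"data fetched, id={id}")
    return {"id": id}

async def main():
    # Creating the coroutine objects does not start them...
    coro1 = fetch_data(1, 0.2)
    coro2 = fetch_data(2, 0.2)
    # ...each one runs only when awaited, so they execute back to back:
    # the second doesn't even start until the first has finished.
    result1 = await coro1
    result2 = await coro2
    return result1, result2

start = time.perf_counter()
results = asyncio.run(main())
elapsed = time.perf_counter() - start
print(results)            # ({'id': 1}, {'id': 2})
print(elapsed >= 0.4)     # True: the two waits added up instead of overlapping
```

With 2-second delays, as in the video, the same shape takes the full 4 seconds.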
- 00:10:34 Now we're moving on to the next important concept: the task. A task is a way to schedule a coroutine to run as soon as possible, and to allow us to run multiple coroutines at once. The issue we saw previously is that we needed to wait for one coroutine to finish before we could start executing the next. With a task we don't have that issue: as soon as a coroutine is sleeping, or waiting on something that's not in the control of our program, we can move on and start executing another task. We're never executing these tasks at the exact same time; we're not using multiple CPU cores. But if one task isn't doing anything, if it's idle, blocked, or waiting on something, we can switch over and start working on another task. The whole goal is that our program optimizes its efficiency: we're always attempting to do something, and when we're waiting on something that's not in the control of our program, we switch over to another task and start working on that.
- 00:11:31 So here's a quick example that shows how we would optimize the previous example. What we do here is use the simple create_task function; there are a few other ways to make tasks, which I'll show you in a second, but this is the simplest. We say task1 = asyncio.create_task(), and we pass in a coroutine object; it's a coroutine object because this is a coroutine function, and calling the function returns a coroutine. In this case we pass an ID and a time delay. If this were running synchronously, so if we had to wait for each of these tasks in turn, it would take 2 seconds plus 3 seconds plus 1 second, a total of 6 seconds, for this code to execute. However, you'll see that we're now able to execute this code in just 3 seconds, because as soon as one of the tasks is idle and we're waiting on its sleep, we can go and start another task.
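A minimal sketch of the create_task pattern just described (the delays are scaled down from the video's 2/3/1 seconds so the example runs fast; the names are approximations of the video's code):

```python
import asyncio
import time

async def fetch_data(id: int, delay: float) -> dict:
    await asyncio.sleep(delay)      # stand-in for a network or file wait
    return {"id": id}

async def main():
    # create_task schedules each coroutine on the event loop immediately,
    # so all three start running before we await any of them.
    task1 = asyncio.create_task(fetch_data(1, 0.2))
    task2 = asyncio.create_task(fetch_data(2, 0.3))
    task3 = asyncio.create_task(fetch_data(3, 0.1))
    # Awaiting just collects the results; the waits overlapped.
    return [await task1, await task2, await task3]

start = time.perf_counter()
results = asyncio.run(main())
elapsed = time.perf_counter() - start
print([r["id"] for r in results])   # [1, 2, 3]
# elapsed is roughly the longest single delay (~0.3 s), not the 0.6 s sum
```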
- 00:12:24 Now, I still need to await these tasks to finish, so I just await them all in line here and collect their different results. Let's bring the terminal up and run this code to make sure it works: notice that it starts all three coroutines pretty much immediately, and then we get all of the data back at once, in about 3 seconds. That differs from using plain coroutines without creating tasks, where we'd have to wait for each one to finish before moving on to the next. So, as a quick recap: when we create a task, we're essentially scheduling a coroutine to run as quickly as possible, and we're allowing multiple coroutines to run at the same time; as soon as one coroutine isn't doing anything and is waiting on some operation, we can switch to another one and start executing that.
- 00:13:10 Now, all of that is handled by the event loop; it's not something we need to manually take care of. However, if we do want to wait for one task to finish before moving on to the next, we can use the await syntax. It would be possible for me to go here and write some code like this, and now, if we execute the code (and we can go ahead and do that), we'll start the first and second coroutines, but we won't start the third one until the first and second are done. So asynchronous programming gives us that control, and allows us to synchronize our code in whatever manner we see fit.
- 00:13:41 Now we move on to a quick example where I'm going to show you something known as the gather function. The gather function is a quick way to concurrently run multiple coroutines, just like we did manually before. Rather than creating a task for every single coroutine with that create_task function, we can simply use gather, and it will automatically run them concurrently for us and collect the results in a list. The way it works is that we pass multiple coroutines in here as arguments; these are automatically scheduled to run concurrently, so we don't need to wait for one to finish before the next starts, and then it gathers all of the results in a list, in the order in which we provided the coroutines: the result of this one will be the first element in the list, then the second element, the third element, and so on. It waits for all of them to finish when we use the await keyword, which simplifies the process for us and gives us all of the results in one place, so we can parse through them using this for loop. So let's go ahead and run this code: you see that it starts all three of our coroutines, we wait 3 seconds, and then we get all of our different results.
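The gather pattern can be sketched as follows (delays shortened; `fetch_data` is my stand-in for the video's coroutine):

```python
import asyncio

async def fetch_data(id: int, delay: float) -> dict:
    await asyncio.sleep(delay)
    return {"id": id}

async def main():
    # gather schedules all the coroutines concurrently and returns their
    # results in the order the coroutines were passed in, not finish order.
    return await asyncio.gather(
        fetch_data(1, 0.2),
        fetch_data(2, 0.05),
        fetch_data(3, 0.1),
    )

results = asyncio.run(main())
print(results)   # [{'id': 1}, {'id': 2}, {'id': 3}], even though id=2 finished first
```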
- 00:14:45 Now, one thing you should know about gather is that it's not that great at error handling: it's not going to automatically cancel the other coroutines if one of them fails. The reason I bring that up is that the next example I'll show you does provide some built-in error handling, which means it's typically preferred over gather. It's just worth noting that if an error occurs in one of these coroutines, it won't cancel the others, which means you could end up with some weird state in your application if you're not manually handling the different exceptions and errors that could occur.
- 00:15:17 So now we're moving on to the last example in the topic of tasks, where we talk about something relatively new known as a task group. This is a slightly more preferred way to create multiple tasks and organize them together, because it provides some built-in error handling: if any of the tasks inside our task group fails, it automatically cancels all of the other tasks, which is typically preferable when we're dealing with advanced errors or larger applications where we want to be a bit more robust. The fetch_data function has not changed at all; all we've done is start using asyncio.TaskGroup(). Notice that I'm using async with here: this is what's known as an asynchronous context manager. You don't have to have seen context managers before; what this does is give us access to this tg variable. We create a task group as tg, and now to create a task we can say tg.create_task(), just like in that first example, to create an individual task. We can then add it to something like our tasks list if we care about its result. Once we get past this async with block, down to where I have the comment, all of these tasks will have already been executed. The idea is that this is a little bit cleaner: it automatically executes all of the tasks we add inside the task group, and once all of those tasks have finished, it stops blocking, meaning we can move down to the next line of code and retrieve all of the different results from our tasks. There are various ways to write this type of code, but the idea is simple: you create a task, and as soon as it's created inside the task group, we need to wait for it and all of the other tasks to finish before we unblock from this block of code; once they're all finished, we move on to the next lines of code. Similarly to the other tasks we looked at before, these all run concurrently, meaning if one task is sleeping, we can go and start another task and work on something else. So those are tasks; obviously there's a lot more you can do here, but understand that you use tasks when you want to execute code concurrently and you want multiple different operations to be happening at the same time.
- 00:17:33 The fourth important concept is the future. Now, it's worth noting that a future is not something you're typically expected to write on your own; it's usually utilized in lower-level libraries, but it's good to be familiar with the concept in case you see it in asynchronous programming. I'll go through this fairly quickly, but really, a future is a promise of a future result: all it's saying is that a result is to come in the future, though you don't know exactly when. In this case you can see that we actually create a future and await its value. We get the event loop (you don't need to do this, and you'll probably never write this type of code), we create our own future, and then we create a new task using asyncio; you can see the task is set_future_result. Inside it we wait for 2 seconds, some blocking operation, and then we set the result of the future and print it out. Here we await the future and then print the result. Notice we didn't actually await the task to finish; we awaited the future object. Inside the task we set the value of the future, and we awaited that, which means that as soon as we get the value of the future, the task may or may not actually be complete. So this is slightly different from using a task: with a future we're just waiting for some value to become available; we're not waiting for an entire task or an entire coroutine to finish. That's all I really want to show you here; I don't want to get into too many details. That's a future: really just a promise of an eventual result.
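The future example can be sketched like this (delay shortened to 0.1 s; the names approximate the video's code):

```python
import asyncio

async def set_future_result(future: asyncio.Future, value: str) -> None:
    await asyncio.sleep(0.1)     # some work happening elsewhere
    future.set_result(value)     # fulfil the promise
    print("result has been set")

async def main():
    loop = asyncio.get_running_loop()
    future = loop.create_future()
    # The task fills in the future; we then await the *value*,
    # not the task itself, which may still be running afterwards.
    task = asyncio.create_task(set_future_result(future, "future result"))
    value = await future
    await task                   # tidy up: let the task finish too
    return value

print(asyncio.run(main()))   # future result
```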
- 00:19:07and talking about synchronization
- 00:19:08Primitives now these are tools that
- 00:19:10allow us to synchronize the execution of
- 00:19:12various co- routines especially when we
- 00:19:14have larger more complicated programs
- 00:19:17now let's look at this example so we can
- 00:19:19understand how we use the first
- 00:19:20synchronization tool which is lock let's
- 00:19:23say that we have some shared resource
- 00:19:25maybe this is a database maybe it's a
- 00:19:27table maybe it's a file doesn't matter
- 00:19:29what it is but the idea is that it might
- 00:19:31take a fair amount of time for us to
- 00:19:33actually modify or do some operation on
- 00:19:35this shared resource and we want to make
- 00:19:37sure that no two coroutines are working
- 00:19:40on this at the same time the reason for
- 00:19:42that is if two coroutines were say
- 00:19:44modifying the same file if they're
- 00:19:45writing something to the database we
- 00:19:47could get some kind of error where we
- 00:19:49get a mutated state or just weird
- 00:19:52results end up occurring because we have
- 00:19:54kind of different operations happening
- 00:19:55at different times and they're
- 00:19:56simultaneously occurring when we
- 00:19:58really want to wait for one entire operation to
- 00:20:00finish before the next one begins
- 00:20:03that might seem a little bit confusing
- 00:20:05but the idea is we have something and we
- 00:20:06want to lock it off and only be using it
- 00:20:09from one coroutine at a time so what
- 00:20:12we can do for that is we can create a
- 00:20:13lock now when we create a lock we have
- 00:20:16the ability to acquire the lock and we
- 00:20:18do that with this code right here which
- 00:20:20is async with lock now this again is an
- 00:20:23asynchronous context manager and what
- 00:20:25this will do is it will check if any
- 00:20:27other coroutine is currently using
- 00:20:29the lock if it is it's going to wait
- 00:20:32until that coroutine is finished if
- 00:20:34it's not it's going to go into this
- 00:20:35block of code now the idea is whatever
- 00:20:38we put inside of this context manager
- 00:20:40needs to finish executing before the
- 00:20:43lock will be released which means we can
- 00:20:45do some critical part of modification we
- 00:20:48can have some kind of code occurring in
- 00:20:49here that we know will happen all at
- 00:20:51once before we move on to a different
- 00:20:53task or to a different coroutine the
- 00:20:56reason that's important is because we
- 00:20:57have something like an await maybe we're
- 00:20:59awaiting a network operation to save
- 00:21:01something and that could trigger a
- 00:21:03different task to start running in this
- 00:21:05case we're saying hey within this lock
- 00:21:08wait for all of this to finish before we
- 00:21:10release the lock which means that even
- 00:21:12though another task could potentially be
- 00:21:14executing when the sleep occurs it can't
- 00:21:16start executing this critical part of
- 00:21:19code until all of this is finished and
- 00:21:21the lock is released so all the lock is
- 00:21:24really doing is it's synchronizing our
- 00:21:26different coroutines so that they
- 00:21:28can't be using this block of code or
- 00:21:30executing this block of code while
- 00:21:32another coroutine is executing it
- 00:21:34that's all it's doing it's locking off
- 00:21:36access to in this case a critical
- 00:21:38resource that we only want to be
- 00:21:39accessed one at a time so in this case
- 00:21:42you can see that we create five
- 00:21:43different instances of this coroutine
- 00:21:46we then are accessing the lock and then
- 00:21:48again once we get down here we're going
- 00:21:49to release it so if we bring up the
- 00:21:52terminal here and we start executing
- 00:21:54this you'll see that we have resource
- 00:21:55before modification resource after
- 00:21:58before after before after and the idea
- 00:22:00is even though we've executed these
- 00:22:01coroutines concurrently we're gating them
- 00:22:04off and we're locking their access to
- 00:22:06this resource so that only one can be
- 00:22:08accessing it at a time moving on the
- 00:22:10next synchronization primitive to cover
- 00:22:12is known as the semaphore now a
- 00:22:14semaphore is something that works very
- 00:22:16similarly to a lock however it allows
- 00:22:19multiple coroutines to have access to
- 00:22:21the same object at the same time but we
- 00:22:23can decide how many we want that to be
- 00:22:26so in this case we create a semaphore
- 00:22:28and we give it a limit of two that
- 00:22:30means only two coroutines can
- 00:22:32access some resource at the exact same
- 00:22:34time and the reason we would do that is
- 00:22:36to make sure that we kind of throttle
- 00:22:38our program and we don't overload some
- 00:22:40kind of resource so it's possible that
- 00:22:42we're going to send a bunch of different
- 00:22:43network requests we can do a few of them
- 00:22:46at the same time but we can't do maybe a
- 00:22:48thousand or 10,000 at the same time so
- 00:22:50in that case we would create a semaphore
- 00:22:52we'd say okay our limit is maybe five at
- 00:22:54a time and this way we have the
- 00:22:56event loop automatically handle this and
- 00:22:58throttle our code intentionally to only
- 00:23:00send a maximum of five requests at a time
- 00:23:03anyways let's bring up our terminal here
- 00:23:05and run this code so python3 semaphore.py
- 00:23:09and you can see that we can access
- 00:23:10the resource kind of two at a time and
- 00:23:13modify it but we can't have any more
- 00:23:15than that now moving on to the last
- 00:23:17primitive we're going to talk about this
- 00:23:19is the event now the event is something
- 00:23:21that's a little bit more basic and
- 00:23:22allows us to do some simpler
- 00:23:24synchronization in this case we can
- 00:23:26create an event and what we can do is we
- 00:23:28can await the event to be set and we can
- 00:23:31set the event and this acts as a simple
- 00:23:33Boolean flag and it allows us to block
- 00:23:36other areas of our code until we've set
- 00:23:39this flag to be true so it's really just
- 00:23:41like setting a variable to true or false
- 00:23:43in this case it's just doing it in the
- 00:23:44asynchronous way so you can see we have
- 00:23:47some Setter function maybe it takes two
- 00:23:49seconds to be able to set some result we
- 00:23:51then set the result and as soon as that
- 00:23:53result has been set we can come up here
- 00:23:55we await that so we wait for this to
- 00:23:57finish and then we can go ahead and
- 00:23:59print the event has been set continue
- 00:24:01execution so we can bring this up here
- 00:24:04and quickly have a look at this so
- 00:24:05python3 if we spell that correctly
- 00:24:08event.py and you'll see it says
- 00:24:10awaiting the event to be set event has
- 00:24:12been set event has been set continuing
- 00:24:14the execution okay pretty
- 00:24:16straightforward it's just a Boolean flag
- 00:24:18that allows us to wait at certain points
- 00:24:20in our program there's lots of different
- 00:24:21times when you would want to use this
- 00:24:23but I just wanted to quickly show you
- 00:24:24that we do have something like this that
- 00:24:25exists now there is another type of
- 00:24:27primitive here that's a bit more
- 00:24:29complicated called the condition I'm not
- 00:24:31going to get into that in this video in
- 00:24:33fact I'm going to leave the video here
- 00:24:35if you guys enjoyed this make sure you
- 00:24:37leave a like subscribe to the channel
- 00:24:39and consider checking out my premium
- 00:24:40software development course with course
- 00:24:42careers if you enjoy this teaching style
- 00:24:44and you're serious about becoming a
- 00:24:46developer anyways I will see you guys in
- 00:24:48another YouTube video
- Asynchronous programming
- Python
- Concurrency
- Event Loop
- Coroutines
- Async I/O
- Threads
- Processes
- Synchronization
- Concurrency models