Coroutines and Asynchronous Programming in Python
This article explains Python coroutines and their advantages over threads, demonstrates the producer-consumer pattern, introduces the asyncio event loop and the async/await syntax, and provides examples of asynchronous network requests and a simple aiohttp server for high-concurrency I/O.
Coroutines (also called micro-threads) are subroutines that can be paused and resumed, allowing execution to switch between them without involving the operating-system thread scheduler. Unlike traditional function calls, which follow a strict stack-based order, a coroutine can interrupt its own execution and later continue from the same point; this makes control flow less linear to follow, but enables very efficient concurrency.
The main advantages of coroutines over threads are:
Extremely low overhead because context switches are controlled by the program itself, not the OS.
No need for locks or other thread‑synchronisation mechanisms, since only one thread runs the coroutine code.
When combined with multiple processes, they can fully exploit multi‑core CPUs while retaining the high efficiency of coroutine scheduling.
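The multi-process combination mentioned above can be sketched with the standard library's ProcessPoolExecutor: the event loop dispatches CPU-bound work to worker processes and awaits the results without blocking. This is a minimal illustration, not code from the original article; cpu_task is a placeholder workload.

```python
import asyncio
from concurrent.futures import ProcessPoolExecutor

def cpu_task(n):
    # Placeholder CPU-bound work: sum of squares below n.
    return sum(i * i for i in range(n))

async def main():
    loop = asyncio.get_running_loop()
    with ProcessPoolExecutor() as pool:
        # run_in_executor offloads each call to a worker process while
        # the single-threaded event loop keeps scheduling coroutines.
        results = await asyncio.gather(
            *(loop.run_in_executor(pool, cpu_task, n) for n in (10, 100, 1000))
        )
    print(results)  # → [285, 328350, 332833500]

if __name__ == '__main__':
    asyncio.run(main())
```

Each worker process can itself run an event loop, so coroutine efficiency and multi-core parallelism compose.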
In Python, coroutines can be implemented with generators. A generator can produce values with yield and also receive values from the caller via send(); the argument passed to send() becomes the value of the yield expression inside the generator. This makes it possible to build producer-consumer pipelines without locks.
<code>def consumer():
    r = ''
    while True:
        # Receive the next item from the producer; hand back the last reply.
        n = yield r
        if not n:
            return
        print('[CONSUMER] Consuming %s...' % n)
        r = '200 OK'

def produce(c):
    c.send(None)  # Prime the generator: advance it to the first yield.
    n = 0
    while n < 5:
        n = n + 1
        print('[PRODUCER] Producing %s...' % n)
        r = c.send(n)  # Pass n to the consumer and collect its reply.
        print('[PRODUCER] Consumer return: %s' % r)
    c.close()

c = consumer()
produce(c)</code>The output shows the producer and consumer alternating in lock step without any locking, illustrating the lock-free nature of coroutine-based pipelines.
Python’s asyncio library provides an event loop that schedules coroutines, enabling asynchronous I/O. A coroutine is marked with the @asyncio.coroutine decorator (Python 3.4) or defined with the async keyword (Python 3.5+); the decorator form was later deprecated and removed in Python 3.11, but is shown here for historical completeness. The yield from (or await) expression pauses the coroutine until the awaited operation completes.
<code>import asyncio

@asyncio.coroutine
def hello():
    print("Hello world!")
    # Suspend hello() for one second; the event loop is free to run
    # other coroutines in the meantime.
    r = yield from asyncio.sleep(1)
    print("Hello again!")

loop = asyncio.get_event_loop()
loop.run_until_complete(hello())
loop.close()</code>Multiple coroutines can run concurrently by passing them to asyncio.wait(), which wraps each one in a Task:
<code>import threading, asyncio

@asyncio.coroutine
def hello():
    # Both coroutines print the same thread, showing that they share
    # a single OS thread.
    print('Hello world! (%s)' % threading.current_thread())
    yield from asyncio.sleep(1)
    print('Hello again! (%s)' % threading.current_thread())

loop = asyncio.get_event_loop()
tasks = [hello(), hello()]
loop.run_until_complete(asyncio.wait(tasks))
loop.close()</code>Asyncio can also be used for network I/O. The following example fetches HTTP headers from three websites concurrently using a single thread:
<code>import asyncio

@asyncio.coroutine
def wget(host):
    print('wget %s...' % host)
    # open_connection yields a (StreamReader, StreamWriter) pair.
    connect = asyncio.open_connection(host, 80)
    reader, writer = yield from connect
    header = 'GET / HTTP/1.0\r\nHost: %s\r\n\r\n' % host
    writer.write(header.encode('utf-8'))
    yield from writer.drain()
    while True:
        line = yield from reader.readline()
        if line == b'\r\n':  # A blank line marks the end of the headers.
            break
        print('%s header > %s' % (host, line.decode('utf-8').rstrip()))
    writer.close()

loop = asyncio.get_event_loop()
tasks = [wget(host) for host in ['www.sina.com.cn', 'www.sohu.com', 'www.163.com']]
loop.run_until_complete(asyncio.wait(tasks))
loop.close()</code>Starting with Python 3.5, the async/await syntax simplifies coroutine definitions: async def replaces the decorator, and await replaces yield from:
<code>async def hello():
    print("Hello world!")
    await asyncio.sleep(1)
    print("Hello again!")</code>Finally, asyncio can be combined with the third-party aiohttp library to build an asynchronous HTTP server that handles many requests in a single thread:
<code>import asyncio
from aiohttp import web

async def index(request):
    await asyncio.sleep(0.5)
    return web.Response(body=b'<h1>Index</h1>')

async def hello(request):
    await asyncio.sleep(0.5)
    text = '<h1>hello, %s!</h1>' % request.match_info['name']
    return web.Response(body=text.encode('utf-8'))

async def init(loop):
    app = web.Application(loop=loop)
    app.router.add_route('GET', '/', index)
    app.router.add_route('GET', '/hello/{name}', hello)
    srv = await loop.create_server(app.make_handler(), '127.0.0.1', 8000)
    print('Server started at http://127.0.0.1:8000...')
    return srv

loop = asyncio.get_event_loop()
loop.run_until_complete(init(loop))
loop.run_forever()</code>This server demonstrates how a single-threaded asyncio event loop can serve multiple HTTP clients concurrently without thread or process pools. (Note that the loop= argument and make_handler() are deprecated in modern aiohttp; current versions start the application with web.run_app(app).)
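aiohttp is a third-party package; the same single-threaded concurrency model is also available in the standard library alone. As a rough sketch (not from the original article), asyncio.start_server (Python 3.7+) runs one coroutine per client connection on a single thread; the port number here is an arbitrary choice:

```python
import asyncio

async def handle_client(reader, writer):
    # Each accepted connection gets its own coroutine on the same thread.
    data = await reader.readline()
    writer.write(b'echo: ' + data)
    await writer.drain()
    writer.close()
    await writer.wait_closed()

async def main():
    server = await asyncio.start_server(handle_client, '127.0.0.1', 8001)
    async with server:
        await server.serve_forever()

# To run the server until interrupted:
# asyncio.run(main())
```

While one client's coroutine is suspended awaiting I/O, the loop services other connections, which is exactly the property the aiohttp example relies on.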