PEP 492 – Coroutines with async and await syntax
- PEP: 492
- Title: Coroutines with async and await syntax
- Author: Yury Selivanov <yury at edgedb.com>
- Discussions-To: python-dev@python.org
- Status: Final
- Type: Standards Track
- Created: 09-Apr-2015
- Python-Version: 3.5
- Post-History: 17-Apr-2015, 21-Apr-2015, 27-Apr-2015, 29-Apr-2015, 05-May-2015
Contents
- Abstract
- API Design and Implementation Revisions
- Rationale and Goals
- Specification
- Glossary
- Transition Plan
- Design Considerations
- PEP 3152
- Coroutine-generators
- Why “async” and “await” keywords
- Why “__aiter__” does not return an awaitable
- Importance of “async” keyword
- Why “async def”
- Why not “await for” and “await with”
- Why “async def” and not “def async”
- Why not a __future__ import
- Why magic methods start with “a”
- Why not reuse existing magic names
- Why not reuse existing “for” and “with” statements
- Comprehensions
- Async lambda functions
- Performance
- Reference Implementation
- Acceptance
- Implementation
- References
- Acknowledgments
- Copyright
Abstract
The growth of Internet and general connectivity has triggered the proportionate need for responsive and scalable code. This proposal aims to answer that need by making writing explicitly asynchronous, concurrent Python code easier and more Pythonic.
It is proposed to make coroutines a proper standalone concept in Python, and introduce new supporting syntax. The ultimate goal is to help establish a common, easily approachable, mental model of asynchronous programming in Python and make it as close to synchronous programming as possible.
This PEP assumes that the asynchronous tasks are scheduled and coordinated by an Event Loop similar to that of the stdlib module asyncio.events.AbstractEventLoop. While the PEP is not tied to any specific Event Loop implementation, it is relevant only to the kind of coroutine that uses yield as a signal to the scheduler, indicating that the coroutine will be waiting until an event (such as IO) is completed.
We believe that the changes proposed here will help keep Python relevant and competitive in a quickly growing area of asynchronous programming, as many other languages have adopted, or are planning to adopt, similar features: [2], [5], [6], [7], [8], [10].
API Design and Implementation Revisions
- Feedback on the initial beta release of Python 3.5 resulted in a redesign of the object model supporting this PEP to more clearly separate native coroutines from generators - rather than being a new kind of generator, native coroutines are now their own completely distinct type (implemented in [17]). This change was made primarily due to problems encountered when attempting to integrate support for native coroutines into the Tornado web server (reported in [18]).
- In CPython 3.5.2, the __aiter__ protocol was updated. Before 3.5.2, __aiter__ was expected to return an awaitable resolving to an asynchronous iterator. Starting with 3.5.2, __aiter__ should return asynchronous iterators directly. If the old protocol is used in 3.5.2, Python will raise a PendingDeprecationWarning. In CPython 3.6, the old __aiter__ protocol will still be supported with a DeprecationWarning being raised. In CPython 3.7, the old __aiter__ protocol will no longer be supported: a RuntimeError will be raised if __aiter__ returns anything but an asynchronous iterator.
Rationale and Goals
Current Python supports implementing coroutines via generators (PEP 342), further enhanced by the yield from syntax introduced in PEP 380. This approach has a number of shortcomings:
- It is easy to confuse coroutines with regular generators, since they share the same syntax; this is especially true for new developers.
- Whether or not a function is a coroutine is determined by the presence of yield or yield from statements in its body, which can lead to unobvious errors when such statements appear in or disappear from the function body during refactoring.
- Support for asynchronous calls is limited to expressions where yield is allowed syntactically, limiting the usefulness of syntactic features such as with and for statements.
This proposal makes coroutines a native Python language feature, and clearly separates them from generators. This removes generator/coroutine ambiguity, and makes it possible to reliably define coroutines without reliance on a specific library. This also enables linters and IDEs to improve static code analysis and refactoring.
Native coroutines and the associated new syntax features make it possible to define context manager and iteration protocols in asynchronous terms. As shown later in this proposal, the new async with statement lets Python programs perform asynchronous calls when entering and exiting a runtime context, and the new async for statement makes it possible to perform asynchronous calls in iterators.
Specification
This proposal introduces new syntax and semantics to enhance coroutine support in Python.
This specification presumes knowledge of the implementation of coroutines in Python (PEP 342 and PEP 380). Motivation for the syntax changes proposed here comes from the asyncio framework (PEP 3156) and the “Cofunctions” proposal (PEP 3152, now rejected in favor of this specification).
From this point on, this document uses the term native coroutine to refer to functions declared using the new syntax. generator-based coroutine is used where necessary to refer to coroutines that are based on generator syntax; coroutine is used in contexts where both definitions are applicable.
New Coroutine Declaration Syntax
The following new syntax is used to declare a native coroutine:
async def read_data(db):
    pass
Key properties of coroutines:
- async def functions are always coroutines, even if they do not contain await expressions.
- It is a SyntaxError to have yield or yield from expressions in an async function.
- Internally, two new code object flags were introduced:
  - CO_COROUTINE is used to mark native coroutines (defined with new syntax).
  - CO_ITERABLE_COROUTINE is used to make generator-based coroutines compatible with native coroutines (set by the types.coroutine() function).
- Regular generators, when called, return a generator object; similarly, coroutines return a coroutine object.
- StopIteration exceptions are not propagated out of coroutines, and are replaced with a RuntimeError. For regular generators such behavior requires a future import (see PEP 479).
- When a native coroutine is garbage collected, a RuntimeWarning is raised if it was never awaited on (see also Debugging Features).
- See also the Coroutine objects section.
types.coroutine()
A new function coroutine(fn) is added to the types module. It allows interoperability between existing generator-based coroutines in asyncio and native coroutines introduced by this PEP:
@types.coroutine
def process_data(db):
    data = yield from read_data(db)
    ...
The function applies the CO_ITERABLE_COROUTINE flag to the generator function's code object, making it return a coroutine object.
If fn is not a generator function, it is wrapped. If it returns a generator, it will be wrapped in an awaitable proxy object (see below for the definition of awaitable objects).
Note that the CO_COROUTINE flag is not applied by types.coroutine(), to make it possible to separate native coroutines defined with the new syntax from generator-based coroutines.
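For illustration, here is a minimal sketch (not from the PEP) of a generator-based coroutine made compatible with the new syntax via types.coroutine():
import types

@types.coroutine
def nop():
    # generator-based coroutine: suspends once, then returns a value
    yield
    return 'done'

async def caller():
    # a native coroutine may await a types.coroutine()-decorated generator
    return await nop()

c = caller()
try:
    c.send(None)    # runs until the generator's 'yield' suspends everything
    c.send(None)    # resumes; completion is signalled with StopIteration
except StopIteration as exc:
    print(exc.value)    # 'done'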
Await Expression
The following new await expression is used to obtain a result of coroutine execution:
async def read_data(db):
    data = await db.fetch('SELECT ...')
    ...
await, similarly to yield from, suspends execution of the read_data coroutine until the db.fetch awaitable completes and returns the result data.
It uses the yield from implementation with an extra step of validating its argument. await only accepts an awaitable, which can be one of:
- A native coroutine object returned from a native coroutine function.
- A generator-based coroutine object returned from a function decorated with types.coroutine().
- An object with an __await__ method returning an iterator.
  Any yield from chain of calls ends with a yield. This is a fundamental mechanism of how Futures are implemented. Since, internally, coroutines are a special kind of generators, every await is suspended by a yield somewhere down the chain of await calls (please refer to PEP 3156 for a detailed explanation).
  To enable this behavior for coroutines, a new magic method called __await__ is added. In asyncio, for instance, to enable Future objects in await statements, the only change is to add an __await__ = __iter__ line to the asyncio.Future class.
  Objects with an __await__ method are called Future-like objects in the rest of this PEP.
  It is a TypeError if __await__ returns anything but an iterator.
- Objects defined with the CPython C API with a tp_as_async.am_await function, returning an iterator (similar to the __await__ method).
It is a SyntaxError to use await outside of an async def function (like it is a SyntaxError to use yield outside of a def function).
It is a TypeError to pass anything other than an awaitable object to an await expression.
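To illustrate the third bullet above, a minimal Future-like object only needs an __await__ method that returns an iterator; a generator-function __await__ is the simplest way to obtain one. A sketch (the Immediate class is hypothetical, not part of the PEP or asyncio):
class Immediate:
    # A Future-like object that completes without suspending.
    def __init__(self, value):
        self.value = value

    def __await__(self):
        return self.value
        yield    # never reached; its presence makes __await__ a generator function

async def compute():
    # parses as (await Immediate(41)) + 1, see the precedence table below
    return await Immediate(41) + 1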
Updated operator precedence table
The await keyword is defined as follows:
power ::= await ["**" u_expr]
await ::= ["await"] primary
where “primary” represents the most tightly bound operations of the language. Its syntax is:
primary ::= atom | attributeref | subscription | slicing | call
See Python Documentation [12] and Grammar Updates section of this proposal for details.
The key difference between await and the yield and yield from operators is that await expressions do not require parentheses around them most of the time.
Also, yield from allows any expression as its argument, including expressions like yield from a() + b(), which would be parsed as yield from (a() + b()) and is almost always a bug. In general, the result of any arithmetic operation is not an awaitable object. To avoid this kind of mistake, it was decided to make the precedence of await lower than [], (), and ., but higher than the ** operator.
Operator | Description |
---|---|
yield x, yield from x | Yield expression |
lambda | Lambda expression |
if – else | Conditional expression |
or | Boolean OR |
and | Boolean AND |
not x | Boolean NOT |
in, not in, is, is not, <, <=, >, >=, !=, == | Comparisons, including membership tests and identity tests |
\| | Bitwise OR |
^ | Bitwise XOR |
& | Bitwise AND |
<<, >> | Shifts |
+, - | Addition and subtraction |
*, @, /, //, % | Multiplication, matrix multiplication, division, remainder |
+x, -x, ~x | Positive, negative, bitwise NOT |
** | Exponentiation |
await x | Await expression |
x[index], x[index:index], x(arguments...), x.attribute | Subscription, slicing, call, attribute reference |
(expressions...), [expressions...], {key: value...}, {expressions...} | Binding or tuple display, list display, dictionary display, set display |
Examples of “await” expressions
Valid syntax examples:
Expression | Will be parsed as |
---|---|
if await fut: pass | if (await fut): pass |
if await fut + 1: pass | if (await fut) + 1: pass |
pair = await fut, 'spam' | pair = (await fut), 'spam' |
with await fut, open(): pass | with (await fut), open(): pass |
await foo()['spam'].baz()() | await ( foo()['spam'].baz()() ) |
return await coro() | return ( await coro() ) |
res = await coro() ** 2 | res = (await coro()) ** 2 |
func(a1=await coro(), a2=0) | func(a1=(await coro()), a2=0) |
await foo() + await bar() | (await foo()) + (await bar()) |
-await foo() | -(await foo()) |
Invalid syntax examples:
Expression | Should be written as |
---|---|
await await coro() | await (await coro()) |
await -coro() | await (-coro()) |
Asynchronous Context Managers and “async with”
An asynchronous context manager is a context manager that is able to suspend execution in its enter and exit methods.
To make this possible, a new protocol for asynchronous context managers is proposed. Two new magic methods are added: __aenter__ and __aexit__. Both must return an awaitable.
An example of an asynchronous context manager:
class AsyncContextManager:
    async def __aenter__(self):
        await log('entering context')

    async def __aexit__(self, exc_type, exc, tb):
        await log('exiting context')
New Syntax
A new statement for asynchronous context managers is proposed:
async with EXPR as VAR:
    BLOCK
which is semantically equivalent to:
mgr = (EXPR)
aexit = type(mgr).__aexit__
aenter = type(mgr).__aenter__

VAR = await aenter(mgr)
try:
    BLOCK
except:
    if not await aexit(mgr, *sys.exc_info()):
        raise
else:
    await aexit(mgr, None, None, None)
As with regular with statements, it is possible to specify multiple context managers in a single async with statement.
It is an error to pass a regular context manager without __aenter__ and __aexit__ methods to async with. It is a SyntaxError to use async with outside of an async def function.
Example
With asynchronous context managers it is easy to implement proper database transaction managers for coroutines:
async def commit(session, data):
    ...

    async with session.transaction():
        ...
        await session.update(data)
        ...
Code that needs locking also looks lighter:
async with lock:
    ...
instead of:
with (yield from lock):
    ...
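A lock can gain such support by implementing the new protocol. The adapter below is an illustrative sketch (not part of asyncio); it wraps any lock whose acquire() method is awaitable:
class AsyncLock:
    def __init__(self, lock):
        self._lock = lock

    async def __aenter__(self):
        await self._lock.acquire()    # may suspend until the lock is free
        return self._lock

    async def __aexit__(self, exc_type, exc, tb):
        self._lock.release()          # runs even if the block raised
With it, async with AsyncLock(lock): ... acquires the lock before entering the block and releases it on exit.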
Asynchronous Iterators and “async for”
An asynchronous iterable is able to call asynchronous code in its iter implementation, and an asynchronous iterator can call asynchronous code in its next method. To support asynchronous iteration:
- An object must implement an __aiter__ method (or, if defined with the CPython C API, a tp_as_async.am_aiter slot) returning an asynchronous iterator object.
- An asynchronous iterator object must implement an __anext__ method (or, if defined with the CPython C API, a tp_as_async.am_anext slot) returning an awaitable.
- To stop iteration __anext__ must raise a StopAsyncIteration exception.
An example of an asynchronous iterable:
class AsyncIterable:
    def __aiter__(self):
        return self

    async def __anext__(self):
        data = await self.fetch_data()
        if data:
            return data
        else:
            raise StopAsyncIteration

    async def fetch_data(self):
        ...
New Syntax
A new statement for iterating through asynchronous iterators is proposed:
async for TARGET in ITER:
    BLOCK
else:
    BLOCK2
which is semantically equivalent to:
iter = (ITER)
iter = type(iter).__aiter__(iter)
running = True
while running:
    try:
        TARGET = await type(iter).__anext__(iter)
    except StopAsyncIteration:
        running = False
    else:
        BLOCK
else:
    BLOCK2
It is a TypeError to pass a regular iterable without an __aiter__ method to async for. It is a SyntaxError to use async for outside of an async def function.
As with the regular for statement, async for has an optional else clause.
Example 1
With asynchronous iteration protocol it is possible to asynchronously buffer data during iteration:
async for data in cursor:
    ...
Where cursor is an asynchronous iterator that prefetches N rows of data from a database after every N iterations.
The following code illustrates the new asynchronous iteration protocol:
class Cursor:
    def __init__(self):
        self.buffer = collections.deque()

    async def _prefetch(self):
        ...

    def __aiter__(self):
        return self

    async def __anext__(self):
        if not self.buffer:
            self.buffer = await self._prefetch()
        if not self.buffer:
            raise StopAsyncIteration
        return self.buffer.popleft()
then the Cursor class can be used as follows:
async for row in Cursor():
    print(row)
which would be equivalent to the following code:
i = Cursor().__aiter__()
while True:
    try:
        row = await i.__anext__()
    except StopAsyncIteration:
        break
    else:
        print(row)
Example 2
The following is a utility class that transforms a regular iterable to an asynchronous one. While this is not a very useful thing to do, the code illustrates the relationship between regular and asynchronous iterators.
class AsyncIteratorWrapper:
    def __init__(self, obj):
        self._it = iter(obj)

    def __aiter__(self):
        return self

    async def __anext__(self):
        try:
            value = next(self._it)
        except StopIteration:
            raise StopAsyncIteration
        return value

async for letter in AsyncIteratorWrapper("abc"):
    print(letter)
Why StopAsyncIteration?
Coroutines are still based on generators internally. So, before PEP 479, there was no fundamental difference between
def g1():
    yield from fut
    return 'spam'
and
def g2():
    yield from fut
    raise StopIteration('spam')
And since PEP 479 is accepted and enabled by default for coroutines, the following example will have its StopIteration wrapped into a RuntimeError:
async def a1():
    await fut
    raise StopIteration('spam')
The only way to tell the outside code that the iteration has ended is to raise something other than StopIteration. Therefore, a new built-in exception class StopAsyncIteration was added.
Moreover, with semantics from PEP 479, all StopIteration exceptions raised in coroutines are wrapped in RuntimeError.
Coroutine objects
Differences from generators
This section applies only to native coroutines with the CO_COROUTINE flag, i.e. those defined with the new async def syntax.
The behavior of existing generator-based coroutines in asyncio remains unchanged.
Great effort has been made to make sure that coroutines and generators are treated as distinct concepts:
- Native coroutine objects do not implement __iter__ and __next__ methods. Therefore, they cannot be iterated over or passed to iter(), list(), tuple() and other built-ins. They also cannot be used in a for..in loop. An attempt to use __iter__ or __next__ on a native coroutine object will result in a TypeError.
- Plain generators cannot yield from native coroutines: doing so will result in a TypeError.
- generator-based coroutines (which for asyncio code must be decorated with @asyncio.coroutine) can yield from native coroutine objects.
- inspect.isgenerator() and inspect.isgeneratorfunction() return False for native coroutine objects and native coroutine functions.
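A short sketch of these rules (illustrative only; each TypeError occurs when the offending code is actually executed):
import types

async def coro():
    return 'spam'

c = coro()
# iter(c), list(c) or "for x in c:" raise TypeError: native coroutine
# objects are not iterable.

def plain():
    # advancing this plain generator raises TypeError at the 'yield from',
    # because coro() returns a native coroutine object
    yield from coro()

@types.coroutine
def based():
    # a generator-based coroutine may delegate to a native coroutine
    return (yield from coro())

c.close()    # avoid the "never awaited" RuntimeWarning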
Coroutine object methods
Coroutines are based on generators internally, thus they share the implementation. Similarly to generator objects, coroutines have throw(), send() and close() methods. StopIteration and GeneratorExit play the same role for coroutines (although PEP 479 is enabled by default for coroutines). See PEP 342, PEP 380, and Python Documentation [11] for details.
The throw() and send() methods for coroutines are used to push values and raise errors into Future-like objects.
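For example, send() can drive a coroutine to completion by hand, with StopIteration carrying the return value back to the code that is driving it (a sketch, no event loop involved):
async def answer():
    return 42

c = answer()
try:
    c.send(None)              # start the coroutine
except StopIteration as exc:
    print(exc.value)          # 42 -- the coroutine's return value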
Debugging Features
A common beginner mistake is forgetting to use yield from on coroutines:
@asyncio.coroutine
def useful():
    asyncio.sleep(1)  # this will do nothing without 'yield from'
For debugging this kind of mistake there is a special debug mode in asyncio, in which the @coroutine decorator wraps all functions with a special object that has a destructor logging a warning. Whenever a wrapped generator gets garbage collected, a detailed logging message is generated with information about where exactly the decorated function was defined, a stack trace of where it was collected, etc. The wrapper object also provides a convenient __repr__ function with detailed information about the generator.
The only problem is how to enable these debug capabilities. Since debug facilities should be a no-op in production mode, the @coroutine decorator makes the decision of whether to wrap or not to wrap based on an OS environment variable PYTHONASYNCIODEBUG. This way it is possible to run asyncio programs with asyncio's own functions instrumented. EventLoop.set_debug, a different debug facility, has no impact on the @coroutine decorator's behavior.
With this proposal, coroutines are a native concept, distinct from generators. In addition to a RuntimeWarning being raised on coroutines that were never awaited, it is proposed to add two new functions to the sys module: set_coroutine_wrapper and get_coroutine_wrapper. This is to enable advanced debugging facilities in asyncio and other frameworks (such as displaying where exactly a coroutine was created, and a more detailed stack trace of where it was garbage collected).
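A sketch of how a framework could use this hook (the TracedCoroutine wrapper is illustrative, not the wrapper asyncio itself uses):
import sys
import traceback

class TracedCoroutine:
    # wraps a native coroutine object and records where it was created
    def __init__(self, coro):
        self.coro = coro
        self.created_at = traceback.format_stack()

    def __await__(self):
        return self.coro.__await__()

    def close(self):
        self.coro.close()

sys.set_coroutine_wrapper(TracedCoroutine)   # wrap every new coroutine object
# ... create and run coroutines ...
sys.set_coroutine_wrapper(None)              # reset the wrapper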
New Standard Library Functions
- types.coroutine(gen). See the types.coroutine() section for details.
- inspect.iscoroutine(obj) returns True if obj is a native coroutine object.
- inspect.iscoroutinefunction(obj) returns True if obj is a native coroutine function.
- inspect.isawaitable(obj) returns True if obj is an awaitable.
- inspect.getcoroutinestate(coro) returns the current state of a native coroutine object (mirrors inspect.getgeneratorstate(gen)).
- inspect.getcoroutinelocals(coro) returns the mapping of a native coroutine object's local variables to their values (mirrors inspect.getgeneratorlocals(gen)).
- sys.set_coroutine_wrapper(wrapper) allows intercepting the creation of native coroutine objects. wrapper must be either a callable that accepts one argument (a coroutine object), or None. None resets the wrapper. If called twice, the new wrapper replaces the previous one. The function is thread-specific. See Debugging Features for more details.
- sys.get_coroutine_wrapper() returns the current wrapper object. Returns None if no wrapper was set. The function is thread-specific. See Debugging Features for more details.
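These predicates can be used to tell the two kinds of coroutines apart; a short sketch (output comments assume CPython 3.5):
import inspect
import types

async def native():
    pass

@types.coroutine
def generator_based():
    yield

c = native()
g = generator_based()

print(inspect.iscoroutinefunction(native))    # True
print(inspect.iscoroutine(c))                 # True
print(inspect.iscoroutine(g))                 # False: not a native coroutine
print(inspect.isawaitable(c), inspect.isawaitable(g))   # True True
print(inspect.getcoroutinestate(c))           # 'CORO_CREATED'

c.close()
g.close()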
New Abstract Base Classes
In order to allow better integration with existing frameworks (such as Tornado, see [13]) and compilers (such as Cython, see [16]), two new Abstract Base Classes (ABC) are added:
- collections.abc.Awaitable ABC for Future-like classes, that implement the __await__ method.
- collections.abc.Coroutine ABC for coroutine objects, that implement send(value), throw(type, exc, tb), close() and __await__() methods.
  Note that generator-based coroutines with the CO_ITERABLE_COROUTINE flag do not implement the __await__ method, and therefore are not instances of the collections.abc.Coroutine and collections.abc.Awaitable ABCs:
  @types.coroutine
  def gencoro():
      yield

  assert not isinstance(gencoro(), collections.abc.Coroutine)
  # however:
  assert inspect.isawaitable(gencoro())
To allow easy testing if objects support asynchronous iteration, two more ABCs are added:
- collections.abc.AsyncIterable – tests for the __aiter__ method.
- collections.abc.AsyncIterator – tests for __aiter__ and __anext__ methods.
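Both ABCs perform structural checks, so isinstance() works without explicit registration; for example (the Ticker class is illustrative):
import collections.abc

class Ticker:
    def __aiter__(self):
        return self

    async def __anext__(self):
        raise StopAsyncIteration

t = Ticker()
print(isinstance(t, collections.abc.AsyncIterable))   # True
print(isinstance(t, collections.abc.AsyncIterator))   # True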
Glossary
- Native coroutine function
  A coroutine function declared with async def. It uses await and return value; see New Coroutine Declaration Syntax for details.
- Native coroutine
  Returned from a native coroutine function. See Await Expression for details.
- Generator-based coroutine function
  Coroutines based on generator syntax. The most common example is functions decorated with @asyncio.coroutine.
- Generator-based coroutine
  Returned from a generator-based coroutine function.
- Coroutine
  Either a native coroutine or a generator-based coroutine.
- Coroutine object
  Either a native coroutine object or a generator-based coroutine object.
- Future-like object
  An object with an __await__ method, or a C object with a tp_as_async->am_await function, returning an iterator. Can be consumed by an await expression in a coroutine. A coroutine waiting for a Future-like object is suspended until the Future-like object's __await__ completes, and returns the result. See Await Expression for details.
- Awaitable
  A Future-like object or a coroutine object. See Await Expression for details.
- Asynchronous context manager
  An asynchronous context manager has __aenter__ and __aexit__ methods and can be used with async with. See Asynchronous Context Managers and “async with” for details.
- Asynchronous iterable
  An object with an __aiter__ method, which must return an asynchronous iterator object. Can be used with async for. See Asynchronous Iterators and “async for” for details.
- Asynchronous iterator
  An asynchronous iterator has an __anext__ method. See Asynchronous Iterators and “async for” for details.
Transition Plan
To avoid backwards compatibility issues with the async and await keywords, it was decided to modify tokenizer.c in such a way that it:
- recognizes the async def NAME token combination;
- while tokenizing an async def block, replaces the 'async' NAME token with ASYNC, and the 'await' NAME token with AWAIT;
- while tokenizing a def block, yields 'async' and 'await' NAME tokens as is.
This approach allows for seamless combination of new syntax features
(all of them available only in async
functions) with any existing
code.
An example of having “async def” and “async” attribute in one piece of code:
class Spam:
    async = 42

async def ham():
    print(getattr(Spam, 'async'))

# The coroutine can be executed and will print '42'
Backwards Compatibility
This proposal preserves 100% backwards compatibility.
asyncio
The asyncio module was adapted and tested to work with coroutines and the new statements. Backwards compatibility is 100% preserved, i.e. all existing code will work as-is.
The required changes are mainly:
- Modify the @asyncio.coroutine decorator to use the new types.coroutine() function.
- Add an __await__ = __iter__ line to the asyncio.Future class.
- Add ensure_future() as an alias for the async() function. Deprecate the async() function.
asyncio migration strategy
Because plain generators cannot yield from native coroutine objects (see the Differences from generators section for more details), it is advised to make sure that all generator-based coroutines are decorated with @asyncio.coroutine before starting to use the new syntax.
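For example (a sketch; db stands for any object with an awaitable fetch() method), a decorated generator-based coroutine keeps working when its callers move to the new syntax:
import asyncio

@asyncio.coroutine
def legacy_read(db):
    # generator-based coroutine: awaitable from native coroutines
    data = yield from db.fetch('SELECT ...')
    return data

async def new_read(db):
    # a plain, undecorated generator could not delegate to this native
    # coroutine, but awaiting the decorated legacy coroutine is fine
    return await legacy_read(db)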
async/await in CPython code base
There is no use of the await name in CPython.
async is mostly used by asyncio. We are addressing this by renaming the async() function to ensure_future() (see the asyncio section for details).
Another use of the async keyword is in Lib/xml/dom/xmlbuilder.py, where it defines an async = False attribute for the DocumentLS class. There is no documentation or tests for it, and it is not used anywhere else in CPython. It is replaced with a getter that raises a DeprecationWarning, advising the use of the async_ attribute instead. The 'async' attribute is not documented and is not used in the CPython code base.
Grammar Updates
Grammar changes are fairly minimal:
decorated: decorators (classdef | funcdef | async_funcdef)
async_funcdef: ASYNC funcdef
compound_stmt: (if_stmt | while_stmt | for_stmt | try_stmt | with_stmt
| funcdef | classdef | decorated | async_stmt)
async_stmt: ASYNC (funcdef | with_stmt | for_stmt)
power: atom_expr ['**' factor]
atom_expr: [AWAIT] atom trailer*
Deprecation Plans
The async and await names will be softly deprecated in CPython 3.5 and 3.6. In 3.7 we will transform them to proper keywords. Making async and await proper keywords before 3.7 might make it harder for people to port their code to Python 3.
Design Considerations
PEP 3152
PEP 3152 by Gregory Ewing proposes a different mechanism for coroutines (called “cofunctions”). Some key points:
- A new keyword codef to declare a cofunction. A cofunction is always a generator, even if there is no cocall expression inside it. Maps to async def in this proposal.
- A new keyword cocall to call a cofunction. Can only be used inside a cofunction. Maps to await in this proposal (with some differences, see below).
- It is not possible to call a cofunction without a cocall keyword.
- cocall grammatically requires parentheses after it:
  atom: cocall | <existing alternatives for atom>
  cocall: 'cocall' atom cotrailer* '(' [arglist] ')'
  cotrailer: '[' subscriptlist ']' | '.' NAME
- cocall f(*args, **kwds) is semantically equivalent to yield from f.__cocall__(*args, **kwds).
Differences from this proposal:
- There is no equivalent of __cocall__ in this PEP, which is called and its result is passed to yield from in the cocall expression. The await keyword expects an awaitable object, validates the type, and executes yield from on it. The __await__ method is similar to __cocall__, but is only used to define Future-like objects.
- await is defined in almost the same way as yield from in the grammar (it is later enforced that await can only be inside async def). It is possible to simply write await future, whereas cocall always requires parentheses.
- To make asyncio work with PEP 3152 it would be required to modify the @asyncio.coroutine decorator to wrap all functions in an object with a __cocall__ method, or to implement __cocall__ on generators. To call cofunctions from existing generator-based coroutines it would be required to use a costart(cofunc, *args, **kwargs) built-in.
- Since it is impossible to call a cofunction without a cocall keyword, it automatically prevents the common mistake of forgetting to use yield from on generator-based coroutines. This proposal addresses this problem with a different approach, see Debugging Features.
- A shortcoming of requiring a cocall keyword to call a coroutine is that if it is decided to implement coroutine-generators (coroutines with yield or async yield expressions), we wouldn't need a cocall keyword to call them. So we'll end up having __cocall__ and no __call__ for regular coroutines, and having __call__ and no __cocall__ for coroutine-generators.
- Requiring parentheses grammatically also introduces a whole lot of new problems.
  The following code:
  await fut
  await function_returning_future()
  await asyncio.gather(coro1(arg1, arg2), coro2(arg1, arg2))
  would look like:
  cocall fut()  # or cocall costart(fut)
  cocall (function_returning_future())()
  cocall asyncio.gather(costart(coro1, arg1, arg2),
                        costart(coro2, arg1, arg2))
- There are no equivalents of async for and async with in PEP 3152.
Coroutine-generators
With the async for keyword it is desirable to have a concept of a coroutine-generator, i.e. a coroutine with yield and yield from expressions. To avoid any ambiguity with regular generators, we would likely require an async keyword before yield, and async yield from would raise a StopAsyncIteration exception.
While it is possible to implement coroutine-generators, we believe that they are out of scope of this proposal. It is an advanced concept that should be carefully considered and balanced, with non-trivial changes in the implementation of current generator objects. This is a matter for a separate PEP.
Why “async” and “await” keywords
async/await is not a new concept in programming languages:
- C# has had it for a long time [5];
- proposal to add async/await in ECMAScript 7 [2]; see also Traceur project [9];
- Facebook’s Hack/HHVM [6];
- Google’s Dart language [7];
- Scala [8];
- proposal to add async/await to C++ [10];
- and many other less popular languages.
This is a huge benefit, as some users already have experience with async/await, and because it makes working with many languages in one project easier (Python with ECMAScript 7 for instance).
Why “__aiter__” does not return an awaitable
PEP 492 was accepted in CPython 3.5.0 with __aiter__ defined as a method that was expected to return an awaitable resolving to an asynchronous iterator.
In 3.5.2 (as PEP 492 was accepted on a provisional basis) the __aiter__ protocol was updated to return asynchronous iterators directly.
The motivation behind this change is to make it possible to implement asynchronous generators in Python. See [19] and [20] for more details.
Importance of “async” keyword
While it is possible to just implement the await expression and treat all functions with at least one await as coroutines, this approach makes API design, code refactoring and long-term support harder.
Let's pretend that Python only has the await keyword:
def useful():
    ...
    await log(...)
    ...

def important():
    await useful()
If the useful() function is refactored and someone removes all await expressions from it, it would become a regular Python function, and all code that depends on it, including important(), would be broken. To mitigate this issue a decorator similar to @asyncio.coroutine has to be introduced.
Why “async def”
For some people the bare async name(): pass syntax might look more appealing than async def name(): pass. It is certainly easier to type. But on the other hand, it breaks the symmetry between async def, async with and async for, where async is a modifier, stating that the statement is asynchronous. It is also more consistent with the existing grammar.
Why not “await for” and “await with”
async is an adjective, and hence it is a better choice for a statement qualifier keyword. await for/with would imply that something is awaiting the completion of a for or with statement.
Why “async def” and not “def async”
The async keyword is a statement qualifier. Good analogies for it are the “static”, “public”, and “unsafe” keywords from other languages. “async for” is an asynchronous “for” statement, “async with” is an asynchronous “with” statement, “async def” is an asynchronous function.
Having “async” after the main statement keyword might introduce some confusion, like “for async item in iterator” can be read as “for each asynchronous item in iterator”.
Having the async keyword before def, with and for also makes the language grammar simpler. And “async def” better separates coroutines from regular functions visually.
Why not a __future__ import
The Transition Plan section explains how the tokenizer is modified to treat async and await as keywords only in async def blocks. Hence async def fills the role that a module level compiler declaration like from __future__ import async_await would otherwise fill.
Why magic methods start with “a”
New asynchronous magic methods __aiter__, __anext__, __aenter__, and __aexit__ all start with the same prefix “a”. An alternative proposal is to use the “async” prefix, so that __anext__ becomes __async_next__. However, to align the new magic methods with existing ones, such as __radd__ and __iadd__, it was decided to use a shorter version.
Why not reuse existing magic names
An alternative idea about new asynchronous iterators and context managers was to reuse existing magic methods, by adding an async keyword to their declarations:
class CM:
    async def __enter__(self):  # instead of __aenter__
        ...
This approach has the following downsides:
- it would not be possible to create an object that works in both with and async with statements;
- it would break backwards compatibility, as nothing prohibits returning a Future-like object from __enter__ and/or __exit__ in Python <= 3.4;
- one of the main points of this proposal is to make native coroutines as simple and foolproof as possible, hence the clear separation of the protocols.
Why not reuse existing “for” and “with” statements
The vision behind existing generator-based coroutines and this proposal is to make it easy for users to see where the code might be suspended. Making the existing “for” and “with” statements recognize asynchronous iterators and context managers would inevitably create implicit suspend points, making it harder to reason about the code.
Comprehensions
Syntax for asynchronous comprehensions could be provided, but this construct is outside of the scope of this PEP.
Async lambda functions
Syntax for asynchronous lambda functions could be provided, but this construct is outside of the scope of this PEP.
Performance
Overall Impact
This proposal introduces no observable performance impact. Here is an output of python’s official set of benchmarks [4]:
python perf.py -r -b default ../cpython/python.exe ../cpython-aw/python.exe
[skipped]
Report on Darwin ysmac 14.3.0 Darwin Kernel Version 14.3.0:
Mon Mar 23 11:59:05 PDT 2015; root:xnu-2782.20.48~5/RELEASE_X86_64
x86_64 i386
Total CPU cores: 8
### etree_iterparse ###
Min: 0.365359 -> 0.349168: 1.05x faster
Avg: 0.396924 -> 0.379735: 1.05x faster
Significant (t=9.71)
Stddev: 0.01225 -> 0.01277: 1.0423x larger
The following not significant results are hidden, use -v to show them:
django_v2, 2to3, etree_generate, etree_parse, etree_process, fastpickle,
fastunpickle, json_dump_v2, json_load, nbody, regex_v8, tornado_http.
Tokenizer modifications
There is no observable slowdown of parsing python files with the
modified tokenizer: parsing of one 12Mb file
(Lib/test/test_binop.py
repeated 1000 times) takes the same amount
of time.
async/await
The following micro-benchmark was used to determine performance difference between “async” functions and generators:
import sys
import time
def binary(n):
    if n <= 0:
        return 1
    l = yield from binary(n - 1)
    r = yield from binary(n - 1)
    return l + 1 + r

async def abinary(n):
    if n <= 0:
        return 1
    l = await abinary(n - 1)
    r = await abinary(n - 1)
    return l + 1 + r

def timeit(func, depth, repeat):
    t0 = time.time()
    for _ in range(repeat):
        o = func(depth)
        try:
            while True:
                o.send(None)
        except StopIteration:
            pass
    t1 = time.time()
    print('{}({}) * {}: total {:.3f}s'.format(
        func.__name__, depth, repeat, t1 - t0))
The result is that there is no observable performance difference:
binary(19) * 30: total 53.321s
abinary(19) * 30: total 55.073s
binary(19) * 30: total 53.361s
abinary(19) * 30: total 51.360s
binary(19) * 30: total 49.438s
abinary(19) * 30: total 51.047s
Note that depth of 19 means 1,048,575 calls.
Reference Implementation
The reference implementation can be found here: [3].
List of high-level changes and new protocols
- New syntax for defining coroutines: async def and the new await keyword.
- New __await__ method for Future-like objects, and new tp_as_async.am_await slot in PyTypeObject.
- New syntax for asynchronous context managers: async with. And associated protocol with __aenter__ and __aexit__ methods.
- New syntax for asynchronous iteration: async for. And associated protocol with __aiter__, __anext__ and the new built-in exception StopAsyncIteration. New tp_as_async.am_aiter and tp_as_async.am_anext slots in PyTypeObject.
- New AST nodes: AsyncFunctionDef, AsyncFor, AsyncWith, Await.
- New functions: sys.set_coroutine_wrapper(callback), sys.get_coroutine_wrapper(), types.coroutine(gen), inspect.iscoroutinefunction(func), inspect.iscoroutine(obj), inspect.isawaitable(obj), inspect.getcoroutinestate(coro), and inspect.getcoroutinelocals(coro).
- New CO_COROUTINE and CO_ITERABLE_COROUTINE bit flags for code objects.
- New ABCs: collections.abc.Awaitable, collections.abc.Coroutine, collections.abc.AsyncIterable, and collections.abc.AsyncIterator.
- C API changes: new PyCoro_Type (exposed to Python as types.CoroutineType) and PyCoroObject. PyCoro_CheckExact(*o) to test if o is a native coroutine.
While the list of changes and new things is not short, it is important to understand that most users will not use these features directly. They are intended to be used in frameworks and libraries to provide users with convenient and unambiguous APIs with async def, await, async for and async with syntax.
Working example
All concepts proposed in this PEP are implemented [3] and can be tested.
import asyncio

async def echo_server():
    print('Serving on localhost:8000')
    await asyncio.start_server(handle_connection,
                               'localhost', 8000)

async def handle_connection(reader, writer):
    print('New connection...')
    while True:
        data = await reader.read(8192)
        if not data:
            break
        print('Sending {:.10}... back'.format(repr(data)))
        writer.write(data)

loop = asyncio.get_event_loop()
loop.run_until_complete(echo_server())
try:
    loop.run_forever()
finally:
    loop.close()
Acceptance
PEP 492 was accepted by Guido van Rossum on May 5, 2015 [14].
Implementation
The implementation is tracked in issue 24017 [15]. It was committed on May 11, 2015.
References
- [1]
- https://docs.python.org/3/library/asyncio-task.html#asyncio.coroutine
- [2] (1, 2)
- http://wiki.ecmascript.org/doku.php?id=strawman:async_functions
- [3] (1, 2)
- https://github.com/1st1/cpython/tree/await
- [4]
- https://hg.python.org/benchmarks
- [5] (1, 2)
- https://msdn.microsoft.com/en-us/library/hh191443.aspx
- [6] (1, 2)
- http://docs.hhvm.com/manual/en/hack.async.php
- [7] (1, 2)
- https://www.dartlang.org/articles/await-async/
- [8] (1, 2)
- http://docs.scala-lang.org/sips/pending/async.html
- [9]
- https://github.com/google/traceur-compiler/wiki/LanguageFeatures#async-functions-experimental
- [10] (1, 2)
- http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2013/n3722.pdf (PDF)
- [11]
- https://docs.python.org/3/reference/expressions.html#generator-iterator-methods
- [12]
- https://docs.python.org/3/reference/expressions.html#primaries
- [13]
- https://mail.python.org/pipermail/python-dev/2015-May/139851.html
- [14]
- https://mail.python.org/pipermail/python-dev/2015-May/139844.html
- [15]
- http://bugs.python.org/issue24017
- [16]
- https://github.com/python/asyncio/issues/233
- [17]
- https://hg.python.org/cpython/rev/7a0a1a4ac639
- [18]
- http://bugs.python.org/issue24400
- [19] (1, 2)
- http://bugs.python.org/issue27243
- [20] (1, 2)
- https://docs.python.org/3/reference/datamodel.html#async-iterators
Acknowledgments
I thank Guido van Rossum, Victor Stinner, Elvis Pranskevichus, Andrew Svetlov, Łukasz Langa, Greg Ewing, Stephen J. Turnbull, Jim J. Jewett, Brett Cannon, Nick Coghlan, Steven D’Aprano, Paul Moore, Nathaniel Smith, Ethan Furman, Stefan Behnel, Paul Sokolovsky, Victor Petrovykh, and many others for their feedback, ideas, edits, criticism, code reviews, and discussions around this PEP.
Copyright
This document has been placed in the public domain.