One small clarification:

On Tue, May 5, 2015 at 12:40 PM, Jim J. Jewett <jimjjewett at gmail.com> wrote:

> [...] but I don't understand how this limitation works with things like a
> per-line file iterator that might need to wait for the file to
> be initially opened.

Note that PEP 492 makes it syntactically impossible to use a coroutine function to implement an iterator using yield; this is because the generator machinery is needed to implement the coroutine machinery.

However, the PEP allows the creation of asynchronous iterators using classes that implement __aiter__ and __anext__. Any blocking you need to do can happen in either of those. You just use `async for` to iterate over such an "asynchronous stream".

(There's an issue with actually implementing an asynchronous stream mapped to a disk file, because I/O multiplexing primitives like select() don't actually support waiting for disk files -- but this is an unrelated problem, and asynchronous streams are useful for handling I/O to/from network connections, subprocesses (pipes), or local RPC connections. Check out the streams <https://docs.python.org/3/library/asyncio-stream.html> and subprocess <https://docs.python.org/3/library/asyncio-subprocess.html> submodules of the asyncio package. These streams would be great candidates for adding __aiter__/__anext__ to support async for-loops, so the idiom for iterating over them can once again closely resemble the idiom for iterating over regular (synchronous) streams using for-loops.)

-- 
--Guido van Rossum (python.org/~guido)
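
[Editor's note: for concreteness, here is a minimal sketch (not part of the original message) of the kind of __aiter__/__anext__ class described above. The AsyncLineIterator class and print_lines coroutine are hypothetical names; the example assumes the underlying stream is an asyncio.StreamReader, such as one obtained from asyncio.open_connection().]

    import asyncio

    class AsyncLineIterator:
        # Hypothetical async iterator over the lines of an asyncio.StreamReader
        # (e.g. a network connection or a subprocess pipe); not part of asyncio.

        def __init__(self, reader):
            self._reader = reader  # an asyncio.StreamReader

        def __aiter__(self):
            # __aiter__ returns the asynchronous iterator object itself.
            return self

        async def __anext__(self):
            # Any blocking (awaiting) can happen here.
            line = await self._reader.readline()
            if not line:
                # An empty read means EOF; end the async for-loop.
                raise StopAsyncIteration
            return line

    async def print_lines(host, port):
        # Hypothetical usage: iterate over lines from a network connection.
        reader, writer = await asyncio.open_connection(host, port)
        async for line in AsyncLineIterator(reader):
            print(line.decode().rstrip())
        writer.close()
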