Greg,

On 2015-04-29 5:12 AM, Greg Ewing wrote:
> Yury Selivanov wrote:
>
>> It's important to at least have 'iscoroutine' -- to check that
>> the object is a coroutine function. A typical use-case would be
>> a web framework that lets you bind coroutines to specific
>> http methods/paths:
>>
>>     @http.get('/spam')
>>     async def handle_spam(request):
>>         ...
>
>
>> The other thing is that it's easy to implement this function
>> for CPython: just check for CO_COROUTINE flag.
>
> But isn't that too restrictive? Any function that returns
> an awaitable object would work in the above case.

It's just an example. All in all, I think that we should have full
coverage of Python objects in the inspect module. There are many
possible use cases besides the one that I used -- runtime
introspection, reflection, debugging, etc. -- where you might need
them.

>
>>>> One of the most frequent mistakes that people make when using
>>>> generators as coroutines is forgetting to use ``yield from``::
>>
>> I think it's a mistake that a lot of beginners may make at some
>> point (and in this sense it's frequent). I really doubt that
>> once you were hit by it more than two times you would make it
>> again.
>
> What about when you change an existing non-suspendable
> function to make it suspendable, and have to deal with
> the ripple effects of that? Seems to me that affects
> everyone, not just beginners.

I've been using coroutines on a daily basis for 6 or 7 years now;
long before asyncio we had a coroutine-based framework at my firm
(yield + trampoline). Neither I nor my colleagues had any problems
with refactoring the code. I really try to speak from my experience
when I say that it's not that big of a problem.

Anyway, the PEP provides set_coroutine_wrapper, which should solve
the problem.

>
>>>> 3. ``yield from`` does not accept coroutine objects from plain Python
>>>> generators (*not* generator-based coroutines.)
>>>>
>>> What exactly are "coroutine objects
>>> from plain Python generators"?
>>
>>     # *Not* decorated with @coroutine
>>     def some_algorithm_impl():
>>         yield 1
>>         yield from native_coroutine()  # <- this is a bug
>
> So what you really mean is "yield-from, when used inside
> a function that doesn't have @coroutine applied to it,
> will not accept a coroutine object", is that right? If
> so, I think this part needs re-wording, because it sounded
> like you meant something quite different.
>
> I'm not sure I like this -- it seems weird that applying
> a decorator to a function should affect the semantics
> of something *inside* the function -- especially a piece
> of built-in syntax such as 'yield from'. It's similar
> to the idea of replacing 'async def' with a decorator,
> which you say you're against.

This is for the transition period. We don't want to break existing
asyncio code, but we do want coroutines to be a separate concept
from generators. It doesn't make any sense to iterate through
coroutines or to yield-from them. We can deprecate the @coroutine
decorator in 3.6 or 3.7 and remove it at some point.

>
> BTW, by "coroutine object", do you mean only objects
> returned by an async def function, or any object having
> an __await__ method? I think a lot of things would be
> clearer if we could replace the term "coroutine object"
> with "awaitable object" everywhere.

The PEP clearly separates awaitable objects from coroutine objects:

- a coroutine object is what a coroutine call returns;
- an awaitable is either a coroutine object or an object with an
  __await__ method.

list(), tuple(), iter(), next(), for..in, etc. won't work on objects
with __await__ (unless they also implement __iter__).
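To make that concrete, here is a minimal sketch (the Ready class and
the values below are purely illustrative, not something from the PEP):
an object that defines __await__ but not __iter__ works with 'await',
while list() and for..in reject it.

    class Ready:
        # A minimal awaitable: defines __await__ but *not* __iter__.
        def __init__(self, value):
            self.value = value

        def __await__(self):
            # __await__ must return an iterator; a generator works, and
            # its return value becomes the result of the 'await' expression.
            yield from ()
            return self.value

    async def demo():
        return await Ready(42)     # fine: Ready is an awaitable

    # Driving the coroutine by hand, trampoline-style:
    coro = demo()
    try:
        coro.send(None)
    except StopIteration as exc:
        print(exc.value)           # -> 42

    # list(Ready(42)) or "for x in Ready(42)" raises TypeError,
    # because Ready has no __iter__.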
The problem I was discussing is specifically about 'yield from' and
coroutine objects.

>
>> ``yield from`` does not accept *native coroutine objects*
>> from regular Python generators
>
> It's the "from" there that's confusing -- it sounds
> like you're talking about where the argument to
> yield-from comes from, rather than where the yield-from
> expression resides. In other words, we thought you were
> proposing to disallow *this*:
>
>     # *Not* decorated with @coroutine
>     def some_algorithm_impl():
>         yield 1
>         yield from iterator_implemented_by_generator()
>
> I hope you agree that this is a perfectly legitimate
> thing to do, and should remain so?
>

Sure, it's perfectly normal ;) I apologize for the poor wording.

Yury