> On Friday, Nov 22, 2002, at 09:46 Europe/Amsterdam, Raymond Hettinger
> wrote:
>
> > def outputCaps(logfile):
> >     while True:
> >         line = __self__.data
> >         logfile.write(line.upper())
> >         yield None
> > outputCaps.data = ""   # optional attribute initialization
> >
> > g = outputCaps(open('logfile.txt', 'w'))
> > for line in open('myfile.txt'):
> >     g.data = line
> >     g.next()

[Jack]
> I don't like it, there's "Magic! Magic!" written all over it.
> Generators have always given me that feeling (you start reading them as
> a function, then 20 lines down you meet a "yield" and suddenly realize
> you have to start reading at the top again, keeping in mind that this
> is a persistent stack frame), but the __self__, plus the fact that
> your local variables may not be what they appear to be, makes it hairy.
> You basically cannot understand the code without knowing the code of
> the caller.  There's also absolutely no way to get encapsulation.  So
> count me in for a -1.

Ditto here.

The PEP is way too thin on rationale.  It has some examples but doesn't
explain why this is better than what you would do in current Python.
Since passing a simple object as an extra argument to the generator is
all that's needed to pass extra values into a generator between next()
calls, I don't see the advantage.  And __self__ is butt-ugly.

> Generators have to me always felt more "class-instance-like" than
> "function-like", and I guess this just goes to show it.

--Guido van Rossum (home page: http://www.python.org/~guido/)
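
For comparison, here is a minimal sketch of the plain-Python idiom Guido
refers to: pass one shared, mutable object into the generator as an
argument and set attributes on it between next() calls, instead of using
the proposed magic __self__.  The Box class and the file names below are
illustrative assumptions, not code from the thread; the style matches the
Python of the time (g.next() rather than today's next(g)).

    class Box:
        pass

    def outputCaps(box, logfile):
        # Read the caller-supplied value from an ordinary argument,
        # so the data flow is visible in the function signature.
        while True:
            logfile.write(box.data.upper())
            yield None

    box = Box()
    g = outputCaps(box, open('logfile.txt', 'w'))
    for line in open('myfile.txt'):
        box.data = line   # pass a value in ...
        g.next()          # ... then resume the generator

Because the shared object is explicit, the generator can be read and
tested on its own, which is exactly the encapsulation Jack says the
__self__ proposal gives up.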