> > > Also, I have a question about the semantic specification of what a copy
> > > is supposed to do.  Does it guarantee that the same data stream will be
> > > reproduced?  For instance, would a generator of random words expect its
> > > copy to generate the same word sequence?  Or, would a copy of a
> > > dictionary iterator change its output if the underlying dictionary got
> > > updated (i.e. should the dict be frozen to changes when a copy exists or
> > > should it mutate)?
> >
> > Every attempt should be made for the two copies to return exactly the
> > same stream of values.  This is the pure tee() semantics.
>
> Yes, but iterators that run on underlying containers don't guarantee,
> in general, what happens if the container is mutated while the iteration
> is going on -- arbitrary items may end up being skipped, repeated, etc.
> So, "every attempt" is, I feel, too strong here.

Maybe.  I agree that for list and dict iterators, if the list is
mutated, this warranty shall be void.

But I strongly believe that cloning a random iterator should produce two
identical streams of numbers, not two different random streams.  If you
want two independent random streams, you should create two independent
iterators.  Most random number generators have a sufficiently small
amount of state that making a copy isn't a big deal.  If one is hooked
up to an external source (e.g. /dev/random), then I'd say you'd have to
treat it as a file and introduce explicit buffering.

> deepcopy exists for those cases where one is ready to pay a hefty
> price for guarantees of "decoupling", after all.

But I don't propose that iterators support __deepcopy__.  The use
case is very different.

--Guido van Rossum (home page: http://www.python.org/~guido/)