[Andreas]
> I find the following two functions useful and general
> enough that I would like to propose them for addition to itertools:

No thanks.  Variants can already be constructed from existing tools.
And, they seem a little too specific to a data model where the first
entry has some special significance depending on whether or not it is
unique.

> def single_valued(iterable):
>     it = iter(iterable)
>     try:
>         first_item = it.next()
>     except StopIteration:
>         raise ValueError, "empty iterable passed to 'single_valued()'"
>     for other_item in it:
>         if other_item != first_item:
>             raise ValueError, "non-single-valued iterable"
>     return first_item

This looks like a set() operation with a couple of odd special cases
for the exceptions:

    []          --> ValueError
    [x]         --> x
    [x x x]     --> x
    [x x y x]   --> ValueError

The two non-exception cases both run the input iterable to exhaustion
and, as such, do not fit in with the lazy-evaluation theme of the
itertools module.

> def one(iterable):
>     it = iter(iterable)
>     try:
>         v = it.next()
>     except StopIteration:
>         raise ValueError, "empty iterable passed to 'one()'"
>     try:
>         v2 = it.next()
>         raise ValueError, "iterable with more than one entry passed to 'one()'"
>     except StopIteration:
>         return v

Looks similar to list(islice(iterable, 2)) followed by regular
list-like handling.

> what_i_am_looking_for = one(item for item in items if predicate(item))

Looks similar to:

    wialf = ifilter(pred, items).next()

> This also encodes and checks the assumption that the sought item is
> unique within the list of candidates.  Again, the assertion part
> could be turned off in optimized mode.

That is an odd assumption given that you're searching for a predicate
match and not a single item match.  Also, it is often a better design
to enforce uniqueness constraints upon insertion, not upon lookup.

Raymond
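[Editorial note: the islice-based construction Raymond alludes to can be sketched in modern Python 3 as below. The helper name `one` follows the proposal; the sketch itself is illustrative and not part of itertools.]

```python
from itertools import islice

def one(iterable):
    # Take at most two items; a length-1 result means exactly one item.
    # This is the "list(islice(iterable, 2)) followed by regular
    # list-like handling" pattern from the reply.
    items = list(islice(iterable, 2))
    if not items:
        raise ValueError("empty iterable passed to 'one()'")
    if len(items) > 1:
        raise ValueError("iterable with more than one entry passed to 'one()'")
    return items[0]

def first_match(pred, items):
    # Modern spelling of ifilter(pred, items).next(): return the first
    # item satisfying pred, raising StopIteration if there is none.
    return next(filter(pred, items))
```

Unlike the proposed one(), first_match() stops at the first hit and never checks uniqueness, which matches Raymond's point that a predicate search need not assume a unique match.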