Martin v. Löwis wrote:
>> For binary representations, we already have the struct module to handle
>> the parsing, but for byte sequences with embedded ASCII digits it's
>> reasonably common practice to use strings along with the respective type
>> constructors.
>
> Sure, but why can't you write
>
> foo = int(bar[start:stop].decode("ascii"))
>
> then? Explicit is better than implicit.

Yeah, this thread has convinced me that it would be better to start
rejecting bytes in int() and float() as well, rather than implicitly
assuming an ASCII encoding.

If we decide the fast path for ASCII is still important (e.g. to solve
3.0's current speed problems in decimal), then it would be better to add
separate methods to int that expose the old 2.x str->int and int->str
optimisations (e.g. an int.from_ascii class method and an int.to_ascii
instance method).

Cheers,
Nick.

--
Nick Coghlan | ncoghlan at gmail.com | Brisbane, Australia
---------------------------------------------------------------
http://www.boredomandlaziness.org
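
For context, a minimal sketch of the two approaches being compared. The
values of bar, start and stop are made up for illustration, and the
from_ascii helper only mimics the *proposed* int.from_ascii class method;
no such method is assumed to exist.

    # Explicit decode, as Martin suggests: turn the bytes slice into a
    # str first, then hand it to the ordinary int() constructor.
    bar = b"count=1234;"
    start, stop = 6, 10
    foo = int(bar[start:stop].decode("ascii"))   # -> 1234

    # A rough stand-in for the proposed int.from_ascii class method,
    # written here as a plain helper function.
    def from_ascii(data):
        """Parse an integer from bytes containing only ASCII digits."""
        return int(data.decode("ascii"))

    assert from_ascii(bar[start:stop]) == 1234
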