Guido van Rossum wrote:
>>> I'm not very concerned about strings or lists with more than 2GB
>>> items, but I am concerned about other memory buffers.
>>
>> Those in the Numeric/numarray community, for one, would also be
>> concerned. Although there aren't many data arrays these days that are
>> larger than 2GB, there are some beginning to appear. I have no doubt
>> that within a few years there will be many more. I'm not sure I
>> understand all the implications of the discussion here, but it sounds
>> like an important issue. Currently strings are frequently used as
>> a common "medium" to pass binary data from one module to another
>> (e.g., from Numeric to PIL); limiting strings to 2GB may prove
>> a problem in this area (though frankly, I suspect few will want
>> to use them as temporary buffers for objects that size until memories
>> have grown a bit more :-).
>
> Sorry, I should have been more exact. I meant 2 billion items, not 2
> gigabytes. That should give you an extra factor of 4-8 to play with. :-)
>
> We'll fix this in Python 3.0 for sure -- the question is, should we
> start fixing it now and binary compatibility be damned, or should we
> honor binary compatibility more?

What binary compatibility? I thought we had given that idea up after
1.5.2 was out the door (which is also why the Windows distutils
installers are very picky about the Python version to install an
extension for).

-- 
Marc-Andre Lemburg
CEO eGenix.com Software GmbH
______________________________________________________________________
Company & Consulting:                  http://www.egenix.com/
Python Software:                       http://www.egenix.com/files/python/
Meet us at EuroPython 2002:            http://www.europython.org/
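[Editorial note, not part of the original thread: the "extra factor of 4-8" refers to the limit being on item *count*, not byte size -- sequence lengths were stored in a C int, capping them at 2**31 - 1 items, but each item can span several bytes. A minimal Python sketch of the arithmetic, for illustration only:]

```python
# The limit under discussion: CPython's length fields were C "int"s,
# so a sequence could hold at most 2**31 - 1 items (~2 billion).
OLD_INT_MAX = 2**31 - 1
print(OLD_INT_MAX)  # 2147483647

# A 1-byte-per-item buffer maxes out near 2 GB, but an array of
# 4-byte or 8-byte items at the same count reaches 8-16 GB --
# hence the "extra factor of 4-8 to play with".
print(OLD_INT_MAX * 4 / 2**30)  # roughly 8 GiB for 4-byte items
print(OLD_INT_MAX * 8 / 2**30)  # roughly 16 GiB for 8-byte items
```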