Evan Jones wrote:
> That is correct. If you look at the implementation for lists, it keeps a
> maximum of 80 free lists around, and immediately frees the memory for
> the containing array. Again, this seems like it is sub-optimal to me: In
> some cases, if a program uses a lot of lists, 80 lists may not be
> enough. For others, 80 may be too much. It seems to me that a more
> dynamic allocation policy could be more efficient.

I knew this discussion sounded familiar. . .

http://mail.python.org/pipermail/python-dev/2004-June/045403.html
(and assorted replies)

I'm not saying I *like* the unbounded lists. . . but there's a reason they're
still like that (i.e. getting the memory usage down tends to take some of
Python's speed with it - and there isn't exactly a lot of that to be spared!).

Still, fresh eyes on the problem may see something new :)

Cheers,
Nick.
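For reference, the bounded free-list pattern Evan describes looks roughly like
the sketch below: a fixed-size cache of recycled list headers, while the element
array is always released immediately on deallocation. This is a simplified
illustration only; the names (MAX_FREE, free_objs, num_free, list_t) are
invented for the example and are not the actual CPython identifiers.

/* Simplified sketch of a bounded free list for list objects.
 * Names are illustrative, not the real CPython source. */
#include <stdlib.h>

#define MAX_FREE 80                  /* cap on cached list headers */

typedef struct {
    void **items;                    /* the containing array (freed eagerly) */
    size_t size;
} list_t;

static list_t *free_objs[MAX_FREE];  /* cache of recycled list headers */
static int num_free = 0;

static list_t *list_new(void)
{
    list_t *op;
    if (num_free > 0) {
        op = free_objs[--num_free];  /* reuse a cached header if available */
    } else {
        op = malloc(sizeof(list_t));
        if (op == NULL)
            return NULL;
    }
    op->items = NULL;
    op->size = 0;
    return op;
}

static void list_dealloc(list_t *op)
{
    free(op->items);                 /* the element array is always freed */
    op->items = NULL;
    if (num_free < MAX_FREE)
        free_objs[num_free++] = op;  /* cache the header, up to the cap */
    else
        free(op);                    /* beyond the cap, free the header too */
}

The trade-off under discussion is the cap itself: a program churning through
many lists may exhaust the 80 cached headers and fall back to malloc/free,
while a program using few lists pays for cache slots it never touches.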