On Feb 16, 2008 3:12 PM, Amaury Forgeot d'Arc <amauryfa at gmail.com> wrote:
> Should we however intensively search and correct all of them?
> Is there a clever way to prevent these problems globally, for example
> by delaying finalizers "just a little"?

A simple way to do this would be to push objects whose refcounts have
reached 0 onto a list instead of finalizing them immediately, and have
PyEval_EvalFrameEx periodically swap in a new to-delete list and delete
the objects on the old one.

A linked list would cost an extra pointer in PyObject_HEAD, but a
growable array would only cost allocations, which would be amortized
over the allocations of the objects you're deleting, so that's probably
the way to go. A fixed-size queue that just delayed finalization by a
constant number of objects would usually work without any allocations,
but there would sometimes be a single finalizer that recursively freed
too many other objects, which would defeat the delay.

Jeffrey
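
P.S. For concreteness, here is a minimal sketch of the growable-array
variant. The names _Py_DelayedDealloc and _Py_DrainDelayedDeallocs are
hypothetical, and the error handling is only illustrative; this is just
to show the shape of the idea, not a patch.

/* Sketch only: hypothetical helpers for delaying finalization.
 * _Py_Dealloc() is the existing CPython entry point that invokes
 * tp_dealloc; everything else here is made up for illustration. */

#include <stdlib.h>
#include "Python.h"

static PyObject **pending = NULL;   /* objects at refcount 0, not yet finalized */
static size_t pending_len = 0;
static size_t pending_cap = 0;

/* Called where Py_DECREF would normally invoke _Py_Dealloc(op). */
void
_Py_DelayedDealloc(PyObject *op)
{
    if (pending_len == pending_cap) {
        size_t newcap = pending_cap ? pending_cap * 2 : 64;
        PyObject **p = realloc(pending, newcap * sizeof(PyObject *));
        if (p == NULL) {
            /* Out of memory: fall back to immediate finalization. */
            _Py_Dealloc(op);
            return;
        }
        pending = p;
        pending_cap = newcap;
    }
    pending[pending_len++] = op;
}

/* Called periodically from PyEval_EvalFrameEx (say, on the same ticks
 * that check for pending signals).  The pending list is swapped out
 * first, so any objects whose refcounts hit 0 *during* these
 * finalizers land on the fresh list and are delayed in turn. */
void
_Py_DrainDelayedDeallocs(void)
{
    PyObject **batch = pending;
    size_t n = pending_len;

    pending = NULL;
    pending_len = pending_cap = 0;

    for (size_t i = 0; i < n; i++) {
        _Py_Dealloc(batch[i]);
    }
    free(batch);
}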