I'm looking at preventing core dumps due to recursive calls. With simple
nested call counters for every function in object.c, limited to 500 levels
of recursion, I think this works okay for repr, str and print. It solves
most of the complaints, like:

    class Crasher:
        def __str__(self):
            print self

    print Crasher()

With such protection, instead of a core dump, we'll get an exception:

    RuntimeError: Recursion too deep

So far, so good. 500 nested calls to repr, str or print are likely to be
programming bugs.

Now I wonder whether it's a good idea to do the same thing for getattr
and setattr, to avoid crashes like:

    class Crasher:
        def __getattr__(self, x):
            return self.x

    Crasher().bonk

Solving this the same way is likely to slow things down a bit, but it
would prevent the crash. OTOH, in a complex object hierarchy with tons of
delegation and/or lookup dispatching, 500 nested calls is probably not
enough. Or am I wondering too much? Opinions?

-- 
       Vladimir MARANGOZOV          | Vladimir.Marangozov@inrialpes.fr
http://sirac.inrialpes.fr/~marangoz | tel:(+33-4)76615277 fax:76615252
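[For illustration, here is a minimal Python-level sketch of the counter idea
described above. The real protection would live in C inside object.c; the
names `MAX_DEPTH`, `_depth` and `guarded` are made up for this sketch and
are not part of any actual API. Modern Python 3 syntax is used so the
snippet actually runs.]

    import sys

    # Give the interpreter enough headroom so that our own counter,
    # not Python's frame limit, is what stops the recursion.
    sys.setrecursionlimit(5000)

    MAX_DEPTH = 500   # the limit proposed in the post
    _depth = 0        # shared nesting counter, like the C-level one

    def guarded(func):
        """Raise RuntimeError after MAX_DEPTH nested calls instead of
        letting the recursion run away and overflow the C stack."""
        def wrapper(*args, **kwargs):
            global _depth
            if _depth >= MAX_DEPTH:
                raise RuntimeError("Recursion too deep")
            _depth += 1
            try:
                return func(*args, **kwargs)
            finally:
                _depth -= 1   # always unwind the counter
        return wrapper

    class Crasher:
        @guarded
        def __getattr__(self, name):
            return self.x   # infinite self-lookup, as in the example

    try:
        Crasher().bonk
        err = None
    except RuntimeError as e:
        err = e

    print(err)

Instead of a segfault, the runaway lookup is cut off with the proposed
RuntimeError, and the try/finally guarantees the counter is back to zero
afterwards, so one caught error doesn't poison later calls.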