I tried to understand the various memory allocation functions and macros in Python, and found that there is probably a misunderstanding of what PyObject_NEW does. For example, PyRange_New says

    rangeobject *obj = PyObject_NEW(rangeobject, &PyRange_Type);
    if (obj == NULL)
        return NULL;

The assumption apparently is that somebody will raise a MemoryError and return NULL when allocation fails. However, this code expands to

    rangeobject *obj = ((rangeobject*)PyObject_Init(
        (PyObject *) malloc((&PyRange_Type)->tp_basicsize),
        (&PyRange_Type)));
    if (obj == ((void *)0))
        return ((void *)0);

malloc will just return NULL in case of failure, and PyObject_Init starts with

    if (op == NULL) {
        PyErr_SetString(PyExc_SystemError,
                        "NULL object passed to PyObject_Init");
        return op;
    }

So instead of a MemoryError, you will get a SystemError if the system runs out of memory. Is that intentional?

The documentation says

    Macro version of \cfunction{PyObject_New()}, to gain performance at
    the expense of safety. This does not check \var{type} for a \NULL{}
    value.

This is incorrect: it does check for NULL. It also does not help to gain performance - PyObject_New makes three calls (_PyObject_New, malloc, _Py_NewReference), and so does PyObject_NEW (malloc, PyObject_Init, _Py_NewReference).

I recommend deprecating PyObject_NEW (and correspondingly PyObject_NEW_VAR and PyObject_DEL).

Regards,
Martin
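To make the control flow easy to see without building against the Python headers, here is a minimal self-contained C sketch of the failure path. The functions simulated_PyObject_NEW and simulated_PyObject_Init are hypothetical stand-ins that mirror only the NULL-handling logic quoted above; the real code lives in CPython's Objects/object.c:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    /* Stand-in for the interpreter's "current exception" slot. */
    static const char *last_error = NULL;

    /* Mirrors the first check in PyObject_Init: a NULL argument is
     * reported as a SystemError, not a MemoryError. */
    static void *simulated_PyObject_Init(void *op)
    {
        if (op == NULL) {
            last_error = "SystemError: NULL object passed to PyObject_Init";
            return op;
        }
        return op;
    }

    /* Mirrors the expansion of PyObject_NEW: a bare malloc whose result
     * is passed straight to PyObject_Init.  force_oom simulates an
     * allocation failure. */
    static void *simulated_PyObject_NEW(size_t basicsize, int force_oom)
    {
        void *mem = force_oom ? NULL : malloc(basicsize);
        return simulated_PyObject_Init(mem);
    }

    int main(void)
    {
        void *obj = simulated_PyObject_NEW(32, 1 /* simulate OOM */);
        if (obj == NULL)
            printf("%s\n", last_error);
        return 0;
    }

Running this prints the SystemError message, which is the point of the complaint: the caller in PyRange_New checks for NULL expecting a MemoryError to have been set, but the macro expansion has already set a SystemError instead.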