On 10/1/2017 7:34 PM, Nathaniel Smith wrote:
>> Another major slowness comes from compiling regular expressions.
>> I think we can increase the cache size of `re.compile` and use on-demand
>> cached compiling (e.g. `re.match()`), instead of "compile at import time"
>> in many modules.
> In principle re.compile() itself could be made lazy -- return a regular
> expression object that just holds the string, and then compiles and caches
> it the first time it's used. Might be tricky to do in a backwards-compatible
> way if it moves detection of invalid regexes from compile time to use time,
> but it could be an opt-in flag.

It would be interesting to know how many of the in-module, import-time
re.compile calls use dynamic values versus string constants. String-constant
arguments to re.compile could seemingly be moved to on-first-use compiling
without significant backwards-incompatibility impact if there is an adequate
test suite... and if there isn't an adequate test suite, should we care about
the deferred detection?
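
For illustration, here is a minimal sketch of what an opt-in lazy re.compile
could look like: a plain wrapper that just stores the pattern string and only
compiles it the first time a method is called. The names (lazy_compile,
_LazyPattern) are hypothetical, nothing like this exists in the stdlib today,
and note that error reporting for invalid patterns does move to first use.

import re


class _LazyPattern:
    """Holds the pattern string; re.compile() only runs on first use."""

    def __init__(self, pattern, flags=0):
        self._pattern = pattern
        self._flags = flags
        self._compiled = None

    def _compile(self):
        if self._compiled is None:
            # Invalid patterns now raise here, at first use, not at import.
            self._compiled = re.compile(self._pattern, self._flags)
        return self._compiled

    def __getattr__(self, name):
        # Delegate match(), search(), sub(), findall(), ... to the real
        # compiled pattern object.
        return getattr(self._compile(), name)


def lazy_compile(pattern, flags=0):
    """Stand-in for re.compile() that defers the actual compilation."""
    return _LazyPattern(pattern, flags)


# Module-level "compilation" is now nearly free at import time:
WORD = lazy_compile(r"\w+")
print(WORD.findall("lazy compilation example"))  # compiles here, on first use

One wrinkle this sketch ignores: isinstance(p, re.Pattern) checks on the
returned object would break, which is part of why an opt-in flag (or an
explicit helper like the above) seems safer than changing re.compile() itself.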