You don't define what you mean by "a lot". Python can read a tremendous amount of information from files in a very short amount of time, so I wouldn't try to optimize this prematurely. Just read the information and see how long it takes. If it really is a long time, then look for alternatives. It will take some time to read the dictionary and to unpickle it.

-Larry

Enrique Palomo Jiménez wrote:
> Hi all,
>
> I'm writing an application that needs to handle a lot of information from several files.
> So I think the better way is to design a batch process to collect that information into a dictionary and write it to a file.
> Then, when a user wants to retrieve something, a single execfile will make all the information available.
>
> Creating the file with Python syntax by hand would be a hard job, so:
>
> Is there a module to do that?
> Is there a better way?
>
> The information will be used no more than 3-4 days a month, and installing databases is not allowed.
>
> Thanks
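For what it's worth, the pickle approach Larry alludes to replaces the whole execfile idea: the batch job dumps the dictionary once, and later runs just load it back. A minimal sketch (the filename `info.pkl` and the sample dictionary are made up for illustration):

```python
import pickle

# Batch step: collect the information from the source files into a dict
# (here just a toy example), then pickle it to disk once.
data = {"key1": "value1", "key2": "value2"}

with open("info.pkl", "wb") as f:
    pickle.dump(data, f, pickle.HIGHEST_PROTOCOL)

# Later, when a user wants to retrieve something, unpickle the file
# instead of execfile-ing generated Python source.
with open("info.pkl", "rb") as f:
    restored = pickle.load(f)

print(restored["key1"])  # -> value1
```

This avoids hand-generating Python syntax entirely, and pickle.load is both safer and usually faster than executing a large generated source file.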