I have a file with over 100 million rows. When I do
pd.read_csv(filename, skiprows=100000000, iterator=True)
Python crashes with a memory error. I have 32 GB of RAM, and Python eats up all of it!
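For reference, this is roughly how I expected the iterator to be consumed (the file name and chunk size below are just placeholders):

import pandas as pd

filename = "big_file.csv"  # placeholder path

# iterator=True should return a TextFileReader rather than a DataFrame,
# so rows are only read when get_chunk() is called.
reader = pd.read_csv(filename, skiprows=100000000, iterator=True)
chunk = reader.get_chunk(10000)  # read the next 10,000 rows
print(chunk.shape)

Even just constructing the reader, before calling get_chunk(), is enough to exhaust memory.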