David Abrahams wrote:

>> sourceforge is known to have reliability problems when downloading
>> large files. here's a wget session on my machine (okay, it's not a
>> silent truncation, but SF is clearly not well):
>
> Okay, that tells me something... but shouldn't urllib throw an
> exception in case of a problem... or shouldn't it do something to
> retry?

from the HTTP specification:

    HTTP/1.1 user agents MUST notify the user when an invalid length
    is received and detected.

is urllib a user agent? or is that better left to your application?

    file, headers = urllib.urlretrieve(...)
    if os.path.getsize(file) != int(headers.get("content-length", 0)):
        print "oops!"

</F>
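[editor's note: a minimal sketch of how an application might combine the
size check above with a couple of retries, as the thread suggests. The
fetch() helper, the retry count, and the placeholder URL are illustrative
additions, not part of the original post; it targets Python 2-era urllib
to match the snippet above.]

    import os
    import urllib

    def fetch(url, retries=3):
        for attempt in range(retries):
            # urlretrieve saves to a temporary file and returns its name
            # plus the response headers
            filename, headers = urllib.urlretrieve(url)
            expected = int(headers.get("content-length", 0))
            # accept the file if no length was advertised, or if the
            # on-disk size matches the advertised length
            if expected == 0 or os.path.getsize(filename) == expected:
                return filename
            os.remove(filename)  # discard the truncated copy and retry
        raise IOError("still truncated after %d attempts" % retries)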