Python 2.7 Ioerror: [errno 24] Too Many Open Files:
At work I had the bad luck of having to fix a badly written URL-validator script in Python that someone else wrote. The code is a real mess, and while trying to fix one of the bugs I found som
Solution 1:
With liNew = "http://" + li, file descriptors don't go over the default limit of 1024, but changing that line to liNew = li makes them climb past 8000. Why?
- before: broken URL, so nothing gets downloaded (no files are opened)
- after: correct URL, so the responses are saved to files (and there are 10K URLs)
It probably doesn't make sense to download more than a few hundred URLs concurrently (bandwidth, disk I/O). Make sure every file (sockets as well as disk files) is properly disposed of after the download, i.e., that its close() method is called in time.
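The two fixes above can be sketched together: bound the number of in-flight downloads with a worker pool, and guarantee close() runs via a context manager. This is a minimal illustration, not the original script; the opener parameter is a hypothetical hook so the fetcher can be exercised without the network (the example is written for Python 3's stdlib; on Python 2.7 you would use the futures backport and urllib2.urlopen instead).

```python
import contextlib
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

MAX_WORKERS = 100  # keep concurrent sockets well below the 1024 FD limit

def fetch(url, opener=urlopen):
    # contextlib.closing guarantees close() is called on the response
    # even if read() raises, so the descriptor is released promptly.
    with contextlib.closing(opener(url)) as resp:
        return resp.read()

def fetch_all(urls, opener=urlopen):
    # At most MAX_WORKERS sockets are open at any given moment.
    with ThreadPoolExecutor(max_workers=MAX_WORKERS) as pool:
        return list(pool.map(lambda u: fetch(u, opener), urls))
```

With 10K URLs this keeps the open-descriptor count bounded by the pool size instead of letting every pending download hold a socket.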
The default limit (1024) is low, but don't increase it unless you understand what the code does.
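For reference, on Unix the per-process descriptor limit that produces errno 24 can be inspected (and, up to the hard limit, raised) from Python with the stdlib resource module:

```python
import resource

# Soft limit is what the process hits; hard limit is the ceiling
# a non-root process may raise the soft limit to.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print("soft:", soft, "hard:", hard)

# Raising the soft limit (only works up to `hard`):
# resource.setrlimit(resource.RLIMIT_NOFILE, (4096, hard))
```

Raising the limit only hides a descriptor leak; fixing the missing close() calls is the real solution.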