GelosSnake closed this issue 9 years ago.
Do you have any repro steps? Try as I might, I can't make this happen.
I did try to reproduce it and failed. I think we should close this and then maybe reopen it later if it returns.
Sounds good.
Error just came back:
Traceback (most recent call last):
File "maltrieve.py", line 336, in
I also just got this error today. This was from a fresh install, and it occurred on the first and second attempts to execute the program. The first run quit at ~644 files and the second around ~1300. The third try seemed to work fine. I looked through the code and could not find anything that looked like it could cause this error.
You can scope the limit to the process level within Python:
import resource

# Raise both the soft and hard file descriptor limits for this process.
resource.setrlimit(resource.RLIMIT_NOFILE, (65536, 65536))
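One caveat with setrlimit: an unprivileged process can raise its soft limit only up to the current hard limit, and asking for more raises a ValueError. A slightly safer sketch reads the hard limit first:

import resource

# Query the current (soft, hard) descriptor limits, then lift the
# soft limit as far as the existing hard limit allows.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
resource.setrlimit(resource.RLIMIT_NOFILE, (hard, hard))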
I was having the same issues on a fresh install. I was able to remedy the issue:
$ ulimit -a
open files                      (-n) 1024
$ ulimit -n 2048
Maltrieve has now been running for 30 minutes without an issue. Before, I was erroring out after about 5-10 minutes of running.
Can one of you who's experienced the issue test the rlimit branch? It implements the fix suggested by @alienone with a limit of 2048 descriptors, as @dray0n is using.
I am going to load the rlimit branch right now... but as a note, I received the error again, roughly 45 minutes in.
This is so odd because the Python context manager (with) should close the file when exiting that block. It makes me wonder if grequests is holding something open.
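For the record, the pattern in question looks roughly like this (a sketch with hypothetical names, not maltrieve's actual code):

import json

def save_urls(urls, path="urls.json"):
    # The with block closes the file descriptor as soon as the block
    # exits, even if json.dump raises, so this handle should not leak.
    with open(path, "w") as f:
        json.dump(list(urls), f)

If descriptors are still leaking, that points at sockets rather than this file handle.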
$ netstat -an
Hundreds of CLOSE_WAITs out there.
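To put a number on it, a quick count with standard tools:

$ netstat -an | grep CLOSE_WAIT | wc -l

A socket in CLOSE_WAIT still holds its file descriptor until the local process closes it, so these count against the ulimit exactly like open files.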
Been a while... I was going through the dependencies and found that, for some reason, I did not have all of the requirements installed. After working through a few missing packages, I was finally able to get all requirements met.
The open socket issue is no longer a problem.
cheers
:+1:
Just got the same issue reported above on a fresh install today.
jeff@ubuntu64:~/maltrieve$ python /home/jeff/maltrieve/maltrieve.py -l logfile.txt -p http://192.168.11.122:9090 -d /home/jeff/maltrieve/malware/
External sites see 68.40.253.188
Processing source URLs
Completed source processing
Downloading samples, check log for details
Traceback (most recent call last):
File "/home/jeff/maltrieve/maltrieve.py", line 514, in
IOError: [Errno 24] Too many open files: 'urls.json'
It seems that the process is running too many simultaneous file downloads? This can be worked around by raising the open-files limit in /etc/security/limits.conf to 5000; the current value here is 1000 ($ ulimit -n). Maybe a queuing system, or making sure files are closed after each download, would be the proper fix?
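On the queuing idea: grequests.map takes a size argument that caps the gevent pool, which bounds how many sockets (and thus descriptors) are open at once without touching system limits. A minimal sketch, with an assumed function name and an illustrative default, not maltrieve's actual code:

import grequests

def fetch_all(urls, max_concurrent=100):
    # Build unsent request objects; no connections are opened yet.
    reqs = (grequests.get(u) for u in urls)
    # size limits concurrent requests, so at most max_concurrent
    # sockets are open at any one time.
    return grequests.map(reqs, size=max_concurrent)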