Open baalgar opened 4 years ago
For some reason I've stopped getting post notifications, but anyway, I heard about this today. There's a new version up, and this problem should be fixed. Note that the default number of results returned is 25; if you want more, call it like:
pulses = otx.search_pulses("Russian", max_results=50)
Looks like the update is working, but there's a limit of 250 results from what I can tell. The reason seems to be that paging is capped at 10 pages (at 25 results per page, that's 250).
Is it possible to remove all paging / max_results limitations to retrieve all related pulses? Maybe by passing an asterisk such as max_results=* ?
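Until the SDK lifts the cap, one workaround is to page through the search endpoint yourself and stop when a page comes back empty. The sketch below is hypothetical: `fetch_page` stands in for whatever performs the HTTP call (e.g. the SDK's internal `get`), and the `page`/`limit` parameter names are taken from the URL visible in the stack trace further down, not from documented SDK behavior.

```python
from typing import Callable, Dict, List


def collect_all_pulses(fetch_page: Callable[[int, int], Dict],
                       limit: int = 25) -> List[Dict]:
    """Keep requesting pages until the API returns an empty page.

    fetch_page(page, limit) is assumed to return a dict with a
    "results" list, mirroring the /api/v1/search/pulses response shape.
    """
    pulses: List[Dict] = []
    page = 1
    while True:
        body = fetch_page(page, limit)
        results = body.get("results", [])
        if not results:
            break  # no more pages
        pulses.extend(results)
        page += 1
    return pulses


# Demo with a fake backend serving 288 results in pages of 25,
# matching the pulse count seen on the website.
fake_data = [{"id": i} for i in range(288)]

def fake_fetch(page: int, limit: int) -> Dict:
    start = (page - 1) * limit
    return {"results": fake_data[start:start + limit]}

all_pulses = collect_all_pulses(fake_fetch)
print(len(all_pulses))  # 288
```

This sidesteps any hard-coded `max_results` ceiling, though each extra page is still a separate request the server can refuse.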
When searching for "Russian" through the site at https://otx.alienvault.com/browse/pulses?q=russian, there are 288 pulse results. Below is a stack trace from setting max_results to 288, i.e. pulses = otx.search_pulses("Russian", max_results=288):
Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/requests/adapters.py", line 439, in send
    resp = conn.urlopen(
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/urllib3/connectionpool.py", line 833, in urlopen
    return self.urlopen(
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/urllib3/connectionpool.py", line 833, in urlopen
    return self.urlopen(
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/urllib3/connectionpool.py", line 833, in urlopen
    return self.urlopen(
  [Previous line repeated 2 more times]
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/urllib3/connectionpool.py", line 819, in urlopen
    retries = retries.increment(method, url, response=response, _pool=self)
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/urllib3/util/retry.py", line 436, in increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='otx.alienvault.com', port=443): Max retries exceeded with url: /api/v1/search/pulses?q=Russian&limit=25&page=11 (Caused by ResponseError('too many 502 error responses'))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/OTXv2.py", line 168, in get
    response = self.session().get(
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/requests/sessions.py", line 543, in get
    return self.request('GET', url, **kwargs)
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/requests/sessions.py", line 530, in request
    resp = self.send(prep, **send_kwargs)
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/requests/sessions.py", line 643, in send
    r = adapter.send(request, **kwargs)
  File "/Library/Frameworks/Python.framework/Versions/3.8/lib/python3.8/site-packages/requests/adapters.py", line 507, in send
    raise RetryError(e, request=request)
requests.exceptions.RetryError: HTTPSConnectionPool(host='otx.alienvault.com', port=443): Max retries exceeded with url: /api/v1/search/pulses?q=Russian&limit=25&page=11 (Caused by ResponseError('too many 502 error responses'))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "test.py", line 165, in
The search_pulses function only returns 5 results. Please add functionality to remove this limit when pulling large datasets.