SwissKid opened this issue 10 years ago
Lolz,
On Wed, Jan 31, 2018 at 6:34 AM, netpacket notifications@github.com wrote:
> @doomedraven I totally brainfarted. Oops, the tasks live on a different db than mongodb; mongodb is for the web stuff.
SparkyNZL, Lol indeed. I migrated off the default SQLite db to a PostgreSQL db.
Hmm, are you talking about uploading large files to Cuckoo over the WebUI? There are a number of places which might trip you up:

- In the conf files there are file size limits; make sure these are correct (see the conf sketch at the end of this comment).
- If the logging and output from the files is too big, it will not be able to be ingested into MongoDB, causing a 404 error in the WebUI. Most of the time it is the (process) memory dumps which blow these BSON sizes out; if you submit a big file and take a memory dump of it, importing that information into MongoDB will fail.
You should see an error if you are running the processing in debug mode; read the errors, they are actually really helpful!

Cheers, and I hope this is helpful.
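For reference, the main processing size limit in a Cuckoo 2.x install lives in cuckoo.conf. The option name and default below are from memory, so treat this as a pointer rather than gospel, and note the WebUI's Django settings may impose a separate upload cap of their own:

```ini
# conf/cuckoo.conf -- stock default as I remember it; verify on your install.
[processing]
# Files larger than this (in bytes, here 128MB) are skipped by the
# processing stage entirely.
analysis_size_limit = 134217728
```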
On Wed, Jan 31, 2018 at 6:49 AM, netpacket notifications@github.com wrote:
> I did. Tasks are on SQLite by default and the WebGUI on mongodb. The issue lies with mongodb and the upload size. I think I get it. I upgraded the SQLite db to a PostgreSQL db. Thanks.
Yeah, it does its job if you only have one VM, which is really good if you have an IR box running it.
On Wed, Jan 31, 2018 at 7:52 AM, netpacket notifications@github.com wrote:
> SparkyNZL, Lol indeed. I migrated off the default SQLite db to a PostgreSQL db.
SparkyNZL, I definitely run with the debug component on. What you mentioned makes sense as a way to look for bottlenecks caused by misconfiguration.
@wroersma We have a couple of major Cuckoo Core changes upcoming that will, among many other things, mitigate this issue. It hasn't been properly addressed yet because it takes months of hard work with a non-backwards compatible result. Please remember this is an open source project and we do our best with the resources that we have ;-) If no other questions arise I'll be closing this issue - in the end we're aware of it.
I'd be happy to help write whatever code is needed, as long as it's something that will be accepted into the project. It's just hard when we don't know what you guys are working on, or even whether you'll take the PR.
I am having major trouble with mongodb issues; when will it be solved?
Can you elaborate on your problem?
Just ran into this with Cuckoo 2.0.6. The processing module crashed with the same error message and I have no idea what caused this.
I like the idea given by @KillerInstinct: get an incomplete report into mongo rather than no report at all. The only problem is that his fix no longer applies cleanly to the latest version of cuckoo.
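For anyone who wants to experiment in the meantime, the general shape of that idea is easy to sketch. The following is a hypothetical stand-in, not KillerInstinct's actual patch: it stubs out the largest top-level sections of the report until the whole thing encodes under MongoDB's 16MB cap (bson is the package bundled with pymongo; `bson.BSON.encode` is the pymongo 2.x/3.x spelling):

```python
import json

import bson

MAX_BSON = 16 * 1024 * 1024  # MongoDB's hard per-document limit


def shrink_report(report):
    """Stub out the largest top-level sections until the report fits in
    a single BSON document. Crude, but an incomplete report in mongo
    beats no report at all."""
    while len(bson.BSON.encode(report)) >= MAX_BSON:
        # Rank sections by serialized size and sacrifice the biggest one.
        biggest = max(report, key=lambda k: len(json.dumps(report[k], default=str)))
        report[biggest] = "<trimmed: section pushed the report over 16MB>"
    return report
```

A reporting module would call something like this right before `db.analysis.save(report)`, so the WebUI still gets everything except the offending sections.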
> error message and I have no idea what caused this.
If you don't post it, how do you expect someone to help you? ;)
@doomedraven Well, cuckoo only logs the error that the report is larger than 16MB. As I can't access the report through the WebGUI, how can I tell what pushed the report size over 16MB? Most of the time the reports get written to Mongo fine, so why this particular report is too large is what I meant by "I have no idea what caused this".
Access it on the server, but do a search in the issues; I just recently posted a fixed mongodb.py file which fixes that.
It would be great if you could link the fixed file here so that maybe we can test it.
I did search the issues, and that is how I landed on this thread. In another one you say yara may be to blame, but I am not using any additional yara rules.
On the server, the report.json is a whopping 370MB. My limited understanding (and mucking around with that large file in a text editor) tells me either the process count or file count is too large, but I can't confirm this.
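If anyone else lands here: one way to see which section is inflating the report is to measure each top-level key of report.json. A quick sketch, where the path is just an example; point it at your own analysis directory:

```python
import json

# Cuckoo writes the JSON report under storage/analyses/<task_id>/reports/.
with open("storage/analyses/1234/reports/report.json") as f:
    report = json.load(f)

# Print the top-level sections from largest to smallest serialized size.
sizes = {key: len(json.dumps(value, default=str)) for key, value in report.items()}
for key, size in sorted(sizes.items(), key=lambda kv: kv[1], reverse=True):
    print("%-24s %12d bytes" % (key, size))
```

Whichever key dominates is the one to rein in via the conf limits discussed earlier in the thread.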
I doubt that it will be merged, so you need to replace that file yourself: https://github.com/cuckoosandbox/cuckoo/pull/2570
Can I just ask something about this, please? I get the limitation, but I'm testing my setup using files from the cuckoo cert site (cuckoo.cert.ee): I download samples from there and upload them to my instance. How is it that these samples work on that Cuckoo installation, yet mine gives this MongoDB error? I clearly have something configured differently.

Can someone help, please? Almost every file I upload errors out, making my instance unusable.
```
2014-07-28 10:23:16,041 [lib.cuckoo.core.plugins] ERROR: Failed to run the reporting module "MongoDB":
Traceback (most recent call last):
  File "/home/cuckoo/cuckoo/lib/cuckoo/core/plugins.py", line 499, in process
    current.run(self.results)
  File "/home/cuckoo/cuckoo/modules/reporting/mongodb.py", line 195, in run
    self.db.analysis.save(report)
  File "/usr/lib/python2.7/dist-packages/pymongo/collection.py", line 228, in save
    return self.insert(to_save, manipulate, safe, **kwargs)
  File "/usr/lib/python2.7/dist-packages/pymongo/collection.py", line 306, in insert
    continue_on_error, self.__uuid_subtype), safe)
  File "/usr/lib/python2.7/dist-packages/pymongo/connection.py", line 732, in _send_message
    (request_id, data) = self.__check_bson_size(message)
  File "/usr/lib/python2.7/dist-packages/pymongo/connection.py", line 709, in __check_bson_size
    (max_doc_size, self.__max_bson_size))
InvalidDocument: BSON document too large (17837322 bytes) - the connected server supports BSON document sizes up to 16777216 bytes.
```
Suggested Fix: It might be possible to fix this using gridfs (maybe? I just saw it in a stackexchange post) or some other limiter, or by recompiling mongodb with a patch to increase the limit. Either way, a check should be put in place to prevent this error from occurring; see the sketch below.
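If gridfs pans out, a minimal sketch of the idea could look like the following, written against a modern pymongo. The database name, collection, and stub document are my assumptions, not what Cuckoo's reporting module actually does. GridFS chunks the payload across many small documents, so it sidesteps the 16MB per-document cap, at the cost of the report no longer being queryable as a single document:

```python
import json

import gridfs
from pymongo import MongoClient

client = MongoClient("localhost", 27017)
db = client["cuckoo"]  # database name is an assumption
fs = gridfs.GridFS(db)


def store_oversized_report(report, task_id):
    # Serialize the full report and hand it to GridFS, which splits it
    # into chunks that each stay well under the 16MB BSON ceiling.
    file_id = fs.put(json.dumps(report, default=str).encode("utf-8"),
                     filename="report-%d.json" % task_id)
    # Keep only a small stub in the analysis collection so the WebUI
    # still has something to look up.
    db.analysis.insert_one({"info": {"id": task_id}, "report_file": file_id})
    return file_id
```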