Closed keithjjones closed 8 years ago
I'm changing your PR as it broke distributed Cuckoo, so I will check. In the meantime you can specify the option procmemdump=no
to skip processing the memory dump; to control taking the memory dump itself, you need to configure it in cuckoo.conf.
I'm sorry, I don't understand - the PR about the remote submit from this morning broke distributed cuckoo? And you're fixing it?
Also, where do you mean I can specify the procmemdump=no option? Isn't that for process memory analysis rather than Volatility analysis?
I'm testing the hotfix :)
procmemdump is the same as the process-memory Volatility processing. I'm using my own Python script to submit, but you can try submitting like this: submit.py --options "procmemdump=no" hash/url
Awesome. I will try that.
Does the hotfix address not forcing a memory dump without an option, or is it just for the remote portion?
The hotfix affects only the memory and enforce_timeout options.
Another option that will probably be interesting for you is: submit.py --options "procmemdump=no,memory=no" hash/url
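For reference, the --options argument is a single comma-separated string of key=value pairs that Cuckoo splits into a dictionary. Below is a minimal sketch of that parsing; the function name and details are an approximation, not the actual Cuckoo helper:

```python
def parse_options(options):
    """Split a submit.py-style options string, e.g.
    "procmemdump=no,memory=no", into a dict of raw strings."""
    result = {}
    for field in options.split(","):
        if "=" not in field:
            continue  # silently skip malformed entries
        key, value = field.split("=", 1)
        result[key.strip()] = value.strip()
    return result

print(parse_options("procmemdump=no,memory=no"))
# {'procmemdump': 'no', 'memory': 'no'}
```

Note that the values stay plain strings; nothing here turns "no" into False.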
Also, can you try this fix? Just for double verification :)
I'm trying to test it for you, but another problem (likely unrelated) popped up...
Analysis failed: The package "modules.packages.exe" start function encountered an unhandled exception: [Errno 13] Permission denied: 'C:\\1064.ini'
As soon as I figure this out I'll continue testing.
Is your agent.py perhaps not started as SYSTEM?
Somehow UAC turned back on.
The changes you made did not break anything for submitting remotely; however, the memory dump and analysis still take place for remote submissions but not local ones. I didn't try the options above; this was a standard submit with --remote.
The following options on submit remotely did not turn off memory processing:
--options "procmemdump=no,memory=no"
Well, at least now api.py works for dist.py and submit.py.
About the options: check the options stored in the DB for that task to see whether they were interpreted correctly.
They should probably just both be omitted as options, as the code is going to interpret anything other than =0 as wanting to enable the option, and the default is not to enable either.
-Brad
How do you leave it as an option that can be turned on when wanted, but not enabled by default? The behavior of submit.py locally and remotely differs in that respect.
He means that if you don't specify that option, it is interpreted as disabled. Just in case, try procmemdump=0,memory=0
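The =0 vs =no distinction matters if the consuming code only checks the raw option string's truthiness or compares it against "0". A hypothetical illustration of that pattern (not the actual Cuckoo code):

```python
def option_enabled(value):
    # Hypothetical check mirroring code that treats anything other
    # than an absent option or "0" as "enabled". The string "no" is
    # non-empty, hence truthy, so it does NOT disable the option.
    if value is None:
        return False
    return value != "0" and bool(value)

print(option_enabled("no"))   # True  -- "no" fails to disable
print(option_enabled("0"))    # False
print(option_enabled(None))   # False
```

Under this pattern, only omitting the option or passing =0 would actually disable it, which is what Brad is pointing out above.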
I did not specify any options for a remote submission and memory is being analyzed. I also passed the options as off, as you suggested above, and the memory is still being analyzed remotely. The only way I can find to turn it off is to hard-code "no" for memory into the processing.conf file, but that prevents me from ever analyzing memory, which is not what I want either. Is there a way I can make submit.py behave the same for a remote submission and a local submission? I don't know what else I can change, or whether this is a bug in the submit and/or API code.
Can you post the database row for that analysis ID, so we can see all the options and values?
Are you asking for a dump of the mongo database?
No, Mongo is only for the web report. I'm asking for the basic task info data, from the DB you define in cuckoo.conf.
Here's the row from the SQLite database for the task I submitted remotely with the options:
5|/Dirty/tmp/cuckoo-tmp/upload_NCqXo2/001AEE395F92B8C1B8150BE3BBAC9D4D6541AFDA450FA561CC29FF5F4865133F|file|0|1||||procmemdump=no,memory=no||1|1|2016-05-22 19:38:20.116128|2016-09-19 15:38:15.773882|2016-09-19 15:38:16.605671|2016-09-19 15:41:50.794761|reported|||||||||||||||||||0|1||||||
Your case:
|procmemdump=no,memory=no||1|1|
My DB:
|procmemdump=no,memory=no||t|f|
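The 1|1 vs t|f difference is just how each database backend renders booleans: SQLite stores them as the integers 0/1, while Postgres displays t/f. A quick illustration using a hypothetical table (not the real Cuckoo schema):

```python
import sqlite3

# In-memory SQLite DB with a toy "tasks" table mimicking the two
# boolean columns seen in the dumped row above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tasks (id INTEGER PRIMARY KEY, options TEXT,"
             " memory BOOLEAN, enforce_timeout BOOLEAN)")
conn.execute("INSERT INTO tasks (options, memory, enforce_timeout)"
             " VALUES (?, ?, ?)", ("procmemdump=no,memory=no", True, True))
row = conn.execute("SELECT options, memory, enforce_timeout FROM tasks").fetchone()
print(row)  # ('procmemdump=no,memory=no', 1, 1) -- booleans come back as 1/0
```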
Did you try uploading the task using curl and the web GUI API, just as a test?
I did not. I've only been using the submit.py script since we started this morning.
It's better to use the integrated /api/ than api.py, but both should work. I can't verify that for you, as I have disabled memory processing, but as you can see, your DB and mine store the options in different ways.
Did you use submit.py remotely? If you disable memory processing in your processing.conf file it will allow you to do memory processing when you want to, but by default it does not do it.
I don't understand how we could have different SQLite databases. I'm running the most current "master" pull.
Using curl and the integrated /api works just like submit on the command line: no memory processing by default. Can you pass arguments through curl to turn memory on when you want to? I've not used it much.
Given that, I'd say there is something in api.py that isn't working as expected, which is the purpose of this issue.
I don't use SQLite, I use Postgres; that's probably why the values look different.
I never used submit.py; I have my own scripted uploader which already specifies everything for me :)
I don't use the option to activate memory analysis, as I'm not interested in the Volatility results, so you will probably need to figure out where the issue is :)
Ok. Thank you.
I found the culprit(s) in the code and fixed them. Once merged I will close this issue.
Running submit.py remotely (--remote) will always create memory dumps and analyze them with Volatility (even without the --memory flag). Running submit on the local Cuckoo host does not do this. This might be an issue with api.py? Submitting through the Django web interface does not trigger the dump and analysis either.
The only way to prevent this is to turn memory processing off completely, but then you have no option to use it through the Django web interface.