👋 @baptx Issues are only for reporting a bug or a feature request. For limited support, questions, and discussions, please join the MobSF Slack channel. Please include all the requested and relevant information when opening a bug report. Improper reports will be closed without any response.
This problem occurs because MobSF does not have a queue mechanism. I don't know if one will be added later.
We do not have queuing support at this time, but this is something we can work on as an enhancement. If you use the mass static analysis script (https://github.com/MobSF/Mobile-Security-Framework-MobSF/blob/master/scripts/mass_static_analysis.py), it scans one file at a time and can work in low-RAM situations.
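For illustration only, here is a rough sketch of the same one-at-a-time idea using the MobSF REST API instead of the script. The /api/v1/upload and /api/v1/scan endpoints and the Authorization header come from the MobSF REST API docs, but the exact request and response fields can differ between versions, so treat this as an assumption rather than a drop-in replacement:

```python
import os
import requests

SERVER = 'http://127.0.0.1:8000'   # local MobSF instance (assumed)
API_KEY = 'REPLACE_WITH_API_KEY'   # REST API key shown on the MobSF API docs page
APK_DIR = '/path/to/apks'          # hypothetical folder with the APKs to scan

headers = {'Authorization': API_KEY}

for name in sorted(os.listdir(APK_DIR)):
    if not name.endswith('.apk'):
        continue
    path = os.path.join(APK_DIR, name)
    # Upload a single file.
    with open(path, 'rb') as f:
        resp = requests.post(
            f'{SERVER}/api/v1/upload',
            headers=headers,
            files={'file': (name, f, 'application/octet-stream')})
    resp.raise_for_status()
    upload = resp.json()
    # Trigger the scan and wait for it to finish before moving to the
    # next file, so only one analysis is using RAM at any given time.
    scan = requests.post(f'{SERVER}/api/v1/scan', headers=headers, data=upload)
    scan.raise_for_status()
    print(f'Finished scanning {name}')
```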
@ajinabraham Good to know, but I used run.sh to start the web server and would prefer to use the web interface. It can also be an issue on public web servers, such as a VPS, which often have only 1 GB or 2 GB of RAM.
Although MobSF has a web interface, it is designed to be hosted locally with 8+ GB of RAM. The script I pointed to still requires the MobSF web server to be running; it just scans one file after another, like a queue, instead of overloading RAM with multiple scans at a time.
I'd recommend making this configurable and letting the user decide the number of concurrent tasks based on their actual situation. After all, everyone's memory size is different.
I saw in the README of this project that the MobSF Static Analyzer is hosted online (https://mobsf.live/), so people can probably trigger a RAM overflow on that server.
mobsf.live is a sponsored demo instance for people to quickly try out MobSF that we recycle regularly. It doesn't support large files, a couple of scanner options, and dynamic analysis. Most of the third-party tooling we use will fail or get skipped on a box with less than 4 GB of RAM anyway. The intended deployment model is a local stack with sufficient CPU and RAM resources. It is also not recommended to set up MobSF on an internet-facing server, as we do not support any authentication. Architecturally it is not designed with a scalable cloud application model in mind, but rather as a desktop application available within a web UI.
I'd recommend making this configurable and letting the user decide the number of concurrent tasks based on their actual situation. After all, everyone's memory size is different.
The user could then update the resources available to MobSF, or scan one file at a time, based on what those resources allow.
I can add an enhancement for queuing support, but it's not a top priority at this time.
Closing this and tracking this separately.
I thought of a simple implementation approach: use Celery to implement a task queue and run N tasks at a time (with N configurable to match the available resources). If more than N tasks are submitted, the extra ones wait in the queue until earlier tasks finish, and then the task at the head of the queue starts its analysis. That's pretty much it.
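A minimal sketch of that idea with Celery, assuming a local Redis broker; `run_static_analysis` is a hypothetical placeholder for the real analysis entry point, not MobSF code:

```python
# tasks.py - bounded scan queue sketch (illustration only)
from celery import Celery

app = Celery('scan_queue', broker='redis://localhost:6379/0')
# Run at most N scans in parallel; additional submissions wait in the queue.
app.conf.worker_concurrency = 2  # N, sized to the RAM that is available


def run_static_analysis(file_hash):
    # Stand-in for the real static analysis routine.
    print(f'analyzing {file_hash}')


@app.task
def scan_apk(file_hash):
    run_static_analysis(file_hash)

# Enqueue from the upload handler with: scan_apk.delay(file_hash)
# Start the workers with:              celery -A tasks worker --concurrency=2
```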
A Django Q2 based scan task queue is now supported starting with MobSF v4.2.0: https://mobsf.github.io/docs/#/docker_options?id=configuring-asynchronous-scan-queues and https://mobsf.github.io/docs/#/develop?id=asynchronous-scan-queues
This is a great update
I just read the new code and have some thoughts on it. I think it would be better if the asynchronous analysis task were enqueued right after the Upload function (https://github.com/MobSF/Mobile-Security-Framework-MobSF/blob/master/mobsf/MobSF/views/home.py#L100), so that multiple users uploading APK files for analysis at the same time are not blocked. It would also improve the robustness of the service and prevent an obfuscated APK that cannot be analyzed from crashing the service.
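Roughly what I have in mind, using Django Q's async_task; the view, helper, and dotted task path below are hypothetical simplifications for illustration, not the actual MobSF code:

```python
# Sketch only: enqueue the analysis right after a successful upload so the
# upload request returns immediately instead of blocking on the scan.
from django.http import JsonResponse
from django_q.tasks import async_task


def save_uploaded_file(request):
    # Stand-in for MobSF's real upload handling in mobsf/MobSF/views/home.py;
    # returns the file hash used as the scan identifier.
    ...


def upload_and_enqueue(request):
    checksum = save_uploaded_file(request)
    # The heavy analysis runs in a background worker; a crash while analyzing
    # a malformed or obfuscated APK then only kills that worker task, not the
    # web process serving other users. The dotted path is a placeholder.
    async_task('mobsf.StaticAnalyzer.run_scan', checksum)
    return JsonResponse({'status': 'queued', 'hash': checksum})
```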
You could do that today with Docker Compose by scaling up the MobSF instances for multiple users: https://mobsf.github.io/docs/#/docker_options?id=architecture
For example:
docker compose up --scale mobsf=4
An nginx proxy does the load balancing and distributes traffic to one of the 4 instances of MobSF.
Django Q runs 3 worker processes by default, so 3 scan tasks can run at a time. These numbers can all be scaled up based on the CPU cores and RAM available.
Thanks for the reply, this is a good solution
It seems that the queue is stuck; it could be a bug.
This looks normal to me; the queue is still processing with 3 workers at a time. Each task should run for a maximum of 1 hour before being retried one more time (https://github.com/MobSF/Mobile-Security-Framework-MobSF/blob/b5da756615b355fb2179ad6493075d8981f0b70e/mobsf/MobSF/settings.py#L348-L353), and then the queue proceeds to the next one.
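For reference, the cluster configuration being discussed looks roughly like this; only the workers, timeout, and retry values come from this thread and the linked settings.py, while the remaining keys are illustrative assumptions based on the Django Q documentation:

```python
# Sketch of the Django Q cluster settings discussed above (not verbatim MobSF config).
Q_CLUSTER = {
    'name': 'scan_queue',  # illustrative name
    'workers': 3,          # up to 3 scan tasks run at a time
    'timeout': 3600,       # a running task is killed after 1 hour
    'retry': 3700,         # an unacknowledged task is re-queued after 3700 s
    'max_attempts': 2,     # original run plus one retry (assumption)
    'orm': 'default',      # use the Django database as the broker
}
```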
So this configuration waits 3600 seconds for the task to time out and then tries it again, and after another 3700-second timeout the task is terminated and the next one is executed. Am I understanding this correctly?
Is your feature request related to a problem? Please describe.
When uploading multiple files at the same time on a computer with low RAM, it takes all the RAM and freezes the computer. I tested with 3 APK files (each around 90 MB in size) on a machine with 4 GB of RAM. My main laptop stopped working, so I am temporarily using an older laptop before switching to a new one with more RAM. I had to execute
killall -9 java
to kill jadx, which was taking all the RAM, but I also had to restart the computer because it was not responding correctly. Computers with more RAM are probably affected by the issue too, for example if you scan more than 10 large files at the same time.

Describe the solution you'd like
There should be an option to add files to a queue instead of scanning them at the same time. This way we can start a scan of multiple files without worrying about the RAM limit. The option should be available next to the upload button so the user will not miss it (with an explicit description saying that it is there to save RAM). It should also be available as a default option that cannot be changed by unauthorized users, so that if someone is running the software on a public server, other people cannot overload the server by using all its RAM.