infobyte / faraday

Open Source Vulnerability Management Platform
https://www.faradaysec.com
GNU General Public License v3.0

Error when uploading a template or running commands #494

Closed abdoqaidhaidar closed 1 month ago

abdoqaidhaidar commented 1 month ago

Error when uploading a report in any format, for Burp, Nmap, Nikto, or any other tool. The response is "An error occurred while process report was running" for the request to this URL: http://localhost:5985/_api/v3/ws/mhsinj/upload_report

Another error when running tools:

    Faraday> tool run "nmap 127.0.0.1"
    No active Workspace
    Faraday> workspace select mhsinj
    EXCEPTION of type 'Exception' occurred with message: Unknown error: <class 'Exception'> - Unknown error: <class 'TypeError'> - string indices must be integers, not 'str'
    Faraday>

ezk06eer commented 1 month ago

Hello, please share your version of faraday-cli. Run: `pip freeze | grep -i fara`

In case the version is not 2.1.11, please upgrade: https://pypi.org/project/faraday-cli/

Also, please share the logs in /home/faraday/.faraday/logs, and keep the format of the template before submitting the issue.
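For reference, something along these lines should gather both (paths are from a standard install; the log file names can vary with your setup):

```sh
# installed Faraday-related package versions
pip freeze | grep -i fara

# most recent server-side errors, if any were written
tail -n 100 /home/faraday/.faraday/logs/faraday-server.log | grep -i error
```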

Cheers!

abdoqaidhaidar commented 1 month ago

Thank you for the reply!

    faraday-agent-dispatcher==3.2.1
    faraday-agent-parameters-types==1.5.1
    faraday-cli==2.1.8
    faraday-plugins==1.17.0
    faradaysec==5.2.2

These are the results, and the logs are empty.

abdoqaidhaidar commented 1 month ago

And none of the formats are accepted. For example, `cat nmap.xml`:

    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE nmaprun>

This is the header of the Nmap vulnerability output; CSV and JSON are not accepted either. Screenshot from 2024-05-28 15-07-56

abdoqaidhaidar commented 1 month ago

Logs when uploading a file:

    ==> audit.log <==
        return self.impl.get(instance_state(instance), dict_)
      File "/usr/lib/python3/dist-packages/faraday/vendor/sqlalchemy/orm/attributes.py", line 725, in get
        value = state._load_expired(state, passive)
      File "/usr/lib/python3/dist-packages/faraday/vendor/sqlalchemy/orm/state.py", line 652, in _load_expired
        self.manager.deferred_scalar_loader(self, toload)
      File "/usr/lib/python3/dist-packages/faraday/vendor/sqlalchemy/orm/loading.py", line 942, in load_scalar_attributes
        raise orm_exc.DetachedInstanceError(
    sqlalchemy.orm.exc.DetachedInstanceError: Instance <Command at 0x7fd5dad0afd0> is not bound to a Session; attribute refresh operation cannot proceed (Background on this error at: http://sqlalche.me/e/13/bhk3)

    ==> faraday-dispatcher.log <==

    ==> faraday-server.log <==
      File "/usr/lib/python3/dist-packages/celery/app/base.py", line 795, in send_task
        with P.connection._reraise_as_library_errors():
      File "/usr/lib/python3.11/contextlib.py", line 158, in __exit__
        self.gen.throw(typ, value, traceback)
      File "/usr/lib/python3/dist-packages/kombu/connection.py", line 476, in _reraise_as_library_errors
        raise ConnectionError(str(exc)) from exc
    kombu.exceptions.OperationalError: Error 111 connecting to 127.0.0.1:6379. Connection refused.
    2024-05-28T15:16:38-0400 - geventwebsocket.handler - INFO {Dummy-11875} [pid:23325] [handler.py:242 - log_request()] 127.0.0.1 - - [2024-05-28 15:16:38] "POST /_api/v3/ws/mhsinj/upload_report HTTP/1.1" 500 192 19.634543
    2024-05-28T15:16:41-0400 - geventwebsocket.handler - INFO {Dummy-11875} [pid:23325] [handler.py:242 - log_request()] 127.0.0.1 - - [2024-05-28 15:16:41] "GET /_api/v3/ws/mhsinj/vulns/filter?q=%7B%22offset%22:0,%22limit%22:50,%22order_by%22:%5B%7B%22field%22:%22confirmed%22,%22direction%22:%22desc%22%7D,%7B%22field%22:%22severity%22,%22direction%22:%22desc%22%7D%5D%7D HTTP/1.1" 200 14095 0.596449

abdoqaidhaidar commented 1 month ago

Other logs:

    redis.exceptions.ConnectionError: Error 111 connecting to 127.0.0.1:6379. Connection refused.

    The above exception was the direct cause of the following exception:

    Traceback (most recent call last):
      File "/usr/lib/python3/dist-packages/faraday/server/api/modules/upload_reports.py", line 128, in file_upload
        pre_process_report_task.delay(
      File "/usr/lib/python3/dist-packages/celery/app/task.py", line 444, in delay
        return self.apply_async(args, kwargs)
      File "/usr/lib/python3/dist-packages/celery/app/task.py", line 594, in apply_async
        return app.send_task(
      File "/usr/lib/python3/dist-packages/celery/app/base.py", line 795, in send_task
        with P.connection._reraise_as_library_errors():
      File "/usr/lib/python3.11/contextlib.py", line 158, in __exit__
        self.gen.throw(typ, value, traceback)
      File "/usr/lib/python3/dist-packages/kombu/connection.py", line 476, in _reraise_as_library_errors
        raise ConnectionError(str(exc)) from exc
    kombu.exceptions.OperationalError: Error 111 connecting to 127.0.0.1:6379. Connection refused.
    2024-05-28T15:16:38-0400 - geventwebsocket.handler - INFO {Dummy-11875} [pid:23325] [handler.py:242 - log_request()] 127.0.0.1 - - [2024-05-28 15:16:38] "POST /_api/v3/ws/mhsinj/upload_report HTTP/1.1" 500 192 19.634543
    2024-05-28T15:16:41-0400 - geventwebsocket.handler - INFO {Dummy-11875} [pid:23325] [handler.py:242 - log_request()] 127.0.0.1 - - [2024-05-28 15:16:41] "GET /_api/v3/ws/mhsinj/vulns/filter?q=%7B%22offset%22:0,%22limit%22:50,%22order_by%22:%5B%7B%22field%22:%22confirmed%22,%22direction%22:%22desc%22%7D,%7B%22field%22:%22severity%22,%22direction%22:%22desc%22%7D%5D%7D HTTP/1.1" 200 14095 0.596449
    2024-05-28T15:18:45-0400 - geventwebsocket.handler - INFO {Dummy-11876} [pid:23325] [handler.py:242 - log_request()] 127.0.0.1 - - [2024-05-28 15:18:45] "GET /_api/v3/vulnerability_template HTTP/1.1" 200 14918 0.181799
    2024-05-28T15:18:56-0400 - geventwebsocket.handler - INFO {Dummy-11876} [pid:23325] [handler.py:242 - log_request()] 127.0.0.1 - - [2024-05-28 15:18:56] "GET /_api/session HTTP/1.1" 200 3810 0.064018
    2024-05-28T15:18:56-0400 - faraday.server.api.modules.upload_reports - INFO {Dummy-11876} [pid:23325] [upload_reports.py:73 - file_upload()] Importing new plugin report in server...
    2024-05-28T15:18:56-0400 - celery.backends.redis - ERROR {Dummy-11876} [pid:23325] [redis.py:391 - on_connection_error()] Connection to Redis lost: Retry (0/20) now.
    2024-05-28T15:18:56-0400 - celery.backends.redis - ERROR {Dummy-11876} [pid:23325] [redis.py:391 - on_connection_error()] Connection to Redis lost: Retry (1/20) in 1.00 second.
    2024-05-28T15:18:57-0400 - celery.backends.redis - ERROR {Dummy-11876} [pid:23325] [redis.py:391 - on_connection_error()] Connection to Redis lost: Retry (2/20) in 1.00 second.
    2024-05-28T15:18:58-0400 - celery.backends.redis - ERROR {Dummy-11876} [pid:23325] [redis.py:391 - on_connection_error()] Connection to Redis lost: Retry (3/20) in 1.00 second.
    2024-05-28T15:18:59-0400 - celery.backends.redis - ERROR {Dummy-11876} [pid:23325] [redis.py:391 - on_connection_error()] Connection to Redis lost: Retry (4/20) in 1.00 second.
    2024-05-28T15:19:00-0400 - celery.backends.redis - ERROR {Dummy-11876} [pid:23325] [redis.py:391 - on_connection_error()] Connection to Redis lost: Retry (5/20) in 1.00 second.
    2024-05-28T15:19:01-0400 - celery.backends.redis - ERROR {Dummy-11876} [pid:23325] [redis.py:391 - on_connection_error()] Connection to Redis lost: Retry (6/20) in 1.00 second.
    2024-05-28T15:19:02-0400 - celery.backends.redis - ERROR {Dummy-11876} [pid:23325] [redis.py:391 - on_connection_error()] Connection to Redis lost: Retry (7/20) in 1.00 second.
    2024-05-28T15:19:03-0400 - celery.backends.redis - ERROR {Dummy-11876} [pid:23325] [redis.py:391 - on_connection_error()] Connection to Redis lost: Retry (8/20) in 1.00 second.
    2024-05-28T15:19:04-0400 - celery.backends.redis - ERROR {Dummy-11876} [pid:23325] [redis.py:391 - on_connection_error()] Connection to Redis lost: Retry (9/20) in 1.00 second.
    2024-05-28T15:19:05-0400 - celery.backends.redis - ERROR {Dummy-11876} [pid:23325] [redis.py:391 - on_connection_error()] Connection to Redis lost: Retry (10/20) in 1.00 second.
    2024-05-28T15:19:06-0400 - celery.backends.redis - ERROR {Dummy-11876} [pid:23325] [redis.py:391 - on_connection_error()] Connection to Redis lost: Retry (11/20) in 1.00 second.
    2024-05-28T15:19:07-0400 - celery.backends.redis - ERROR {Dummy-11876} [pid:23325] [redis.py:391 - on_connection_error()] Connection to Redis lost: Retry (12/20) in 1.00 second.
    2024-05-28T15:19:08-0400 - celery.backends.redis - ERROR {Dummy-11876} [pid:23325] [redis.py:391 - on_connection_error()] Connection to Redis lost: Retry (13/20) in 1.00 second.
    2024-05-28T15:19:09-0400 - celery.backends.redis - ERROR {Dummy-11876} [pid:23325] [redis.py:391 - on_connection_error()] Connection to Redis lost: Retry (14/20) in 1.00 second.
    2024-05-28T15:19:10-0400 - celery.backends.redis - ERROR {Dummy-11876} [pid:23325] [redis.py:391 - on_connection_error()] Connection to Redis lost: Retry (15/20) in 1.00 second.
    2024-05-28T15:19:11-0400 - celery.backends.redis - ERROR {Dummy-11876} [pid:23325] [redis.py:391 - on_connection_error()] Connection to Redis lost: Retry (16/20) in 1.00 second.
    2024-05-28T15:19:12-0400 - celery.backends.redis - ERROR {Dummy-11876} [pid:23325] [redis.py:391 - on_connection_error()] Connection to Redis lost: Retry (17/20) in 1.00 second.
    2024-05-28T15:19:13-0400 - celery.backends.redis - ERROR {Dummy-11876} [pid:23325] [redis.py:391 - on_connection_error()] Connection to Redis lost: Retry (18/20) in 1.00 second.
    2024-05-28T15:19:14-0400 - celery.backends.redis - ERROR {Dummy-11876} [pid:23325] [redis.py:391 - on_connection_error()] Connection to Redis lost: Retry (19/20) in 1.00 second.
    2024-05-28T15:19:15-0400 - celery.backends.redis - CRITICAL {Dummy-11876} [pid:23325] [redis.py:132 - reconnect_on_error()]
    Retry limit exceeded while trying to reconnect to the Celery redis result store backend. The Celery application must be restarted.

    2024-05-28T15:19:15-0400 - faraday.server.api.modules.upload_reports - ERROR {Dummy-11876} [pid:23325] [upload_reports.py:141 - file_upload()] An error occurred while process report was running %s
    Traceback (most recent call last):
      File "/usr/lib/python3/dist-packages/redis/connection.py", line 611, in connect
        sock = self.retry.call_with_retry(
      File "/usr/lib/python3/dist-packages/redis/retry.py", line 46, in call_with_retry
        return do()
      File "/usr/lib/python3/dist-packages/redis/connection.py", line 612, in <lambda>
        lambda: self._connect(), lambda error: self.disconnect(error)
      File "/usr/lib/python3/dist-packages/redis/connection.py", line 677, in _connect
        raise err
      File "/usr/lib/python3/dist-packages/redis/connection.py", line 665, in _connect
        sock.connect(socket_address)
      File "/usr/lib/python3/dist-packages/gevent/_socketcommon.py", line 590, in connect
        self._internal_connect(address)
      File "/usr/lib/python3/dist-packages/gevent/_socketcommon.py", line 634, in _internal_connect
        raise _SocketError(err, strerror(err))
    ConnectionRefusedError: [Errno 111] Connection refused

    During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
      File "/usr/lib/python3/dist-packages/celery/backends/redis.py", line 127, in reconnect_on_error
        yield
      File "/usr/lib/python3/dist-packages/celery/backends/redis.py", line 177, in _consume_from
        self._pubsub.subscribe(key)
      File "/usr/lib/python3/dist-packages/redis/client.py", line 1592, in subscribe
        ret_val = self.execute_command("SUBSCRIBE", *new_channels.keys())
      File "/usr/lib/python3/dist-packages/redis/client.py", line 1433, in execute_command
        self.connection = self.connection_pool.get_connection(
      File "/usr/lib/python3/dist-packages/redis/connection.py", line 1387, in get_connection
        connection.connect()
      File "/usr/lib/python3/dist-packages/redis/connection.py", line 617, in connect
        raise ConnectionError(self._error_message(e))
    redis.exceptions.ConnectionError: Error 111 connecting to 127.0.0.1:6379. Connection refused.

    During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
      File "/usr/lib/python3/dist-packages/redis/connection.py", line 611, in connect
        sock = self.retry.call_with_retry(
      File "/usr/lib/python3/dist-packages/redis/retry.py", line 46, in call_with_retry
        return do()
      File "/usr/lib/python3/dist-packages/redis/connection.py", line 612, in <lambda>
        lambda: self._connect(), lambda error: self.disconnect(error)
      File "/usr/lib/python3/dist-packages/redis/connection.py", line 677, in _connect
        raise err
      File "/usr/lib/python3/dist-packages/redis/connection.py", line 665, in _connect
        sock.connect(socket_address)
      File "/usr/lib/python3/dist-packages/gevent/_socketcommon.py", line 590, in connect
        self._internal_connect(address)
      File "/usr/lib/python3/dist-packages/gevent/_socketcommon.py", line 634, in _internal_connect
        raise _SocketError(err, strerror(err))
    ConnectionRefusedError: [Errno 111] Connection refused

    During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
      File "/usr/lib/python3/dist-packages/kombu/connection.py", line 472, in _reraise_as_library_errors
        yield
      File "/usr/lib/python3/dist-packages/celery/app/base.py", line 797, in send_task
        self.backend.on_task_call(P, task_id)
      File "/usr/lib/python3/dist-packages/celery/backends/redis.py", line 373, in on_task_call
        self.result_consumer.consume_from(task_id)
      File "/usr/lib/python3/dist-packages/celery/backends/redis.py", line 169, in consume_from
        return self.start(task_id)
      File "/usr/lib/python3/dist-packages/celery/backends/redis.py", line 147, in start
        self._consume_from(initial_task_id)
      File "/usr/lib/python3/dist-packages/celery/backends/redis.py", line 176, in _consume_from
        with self.reconnect_on_error():
      File "/usr/lib/python3.11/contextlib.py", line 158, in __exit__
        self.gen.throw(typ, value, traceback)
      File "/usr/lib/python3/dist-packages/celery/backends/redis.py", line 130, in reconnect_on_error
        self._ensure(self._reconnect_pubsub, ())
      File "/usr/lib/python3/dist-packages/celery/backends/redis.py", line 384, in _ensure
        return retry_over_time(
      File "/usr/lib/python3/dist-packages/kombu/utils/functional.py", line 318, in retry_over_time
        return fun(*args, **kwargs)
      File "/usr/lib/python3/dist-packages/celery/backends/redis.py", line 106, in _reconnect_pubsub
        metas = self.backend.client.mget(self.subscribed_to)
      File "/usr/lib/python3/dist-packages/redis/commands/core.py", line 1893, in mget
        return self.execute_command("MGET", *args, **options)
      File "/usr/lib/python3/dist-packages/redis/client.py", line 1235, in execute_command
        conn = self.connection or pool.get_connection(command_name, **options)
      File "/usr/lib/python3/dist-packages/redis/connection.py", line 1387, in get_connection
        connection.connect()
      File "/usr/lib/python3/dist-packages/redis/connection.py", line 617, in connect
        raise ConnectionError(self._error_message(e))
    redis.exceptions.ConnectionError: Error 111 connecting to 127.0.0.1:6379. Connection refused.

    The above exception was the direct cause of the following exception:

    Traceback (most recent call last):
      File "/usr/lib/python3/dist-packages/faraday/server/api/modules/upload_reports.py", line 128, in file_upload
        pre_process_report_task.delay(
      File "/usr/lib/python3/dist-packages/celery/app/task.py", line 444, in delay
        return self.apply_async(args, kwargs)
      File "/usr/lib/python3/dist-packages/celery/app/task.py", line 594, in apply_async
        return app.send_task(
      File "/usr/lib/python3/dist-packages/celery/app/base.py", line 795, in send_task
        with P.connection._reraise_as_library_errors():
      File "/usr/lib/python3.11/contextlib.py", line 158, in __exit__
        self.gen.throw(typ, value, traceback)
      File "/usr/lib/python3/dist-packages/kombu/connection.py", line 476, in _reraise_as_library_errors
        raise ConnectionError(str(exc)) from exc
    kombu.exceptions.OperationalError: Error 111 connecting to 127.0.0.1:6379. Connection refused.
    2024-05-28T15:19:15-0400 - geventwebsocket.handler - INFO {Dummy-11876} [pid:23325] [handler.py:242 - log_request()] 127.0.0.1 - - [2024-05-28 15:19:15] "POST /_api/v3/ws/mhsinj/upload_report HTTP/1.1" 500 192 19.633376
    2024-05-28T15:19:18-0400 - geventwebsocket.handler - INFO {Dummy-11876} [pid:23325] [handler.py:242 - log_request()] 127.0.0.1 - - [2024-05-28 15:19:18] "GET /_api/v3/ws/mhsinj/vulns/filter?q=%7B%22offset%22:0,%22limit%22:50,%22order_by%22:%5B%7B%22field%22:%22confirmed%22,%22direction%22:%22desc%22%7D,%7B%22field%22:%22severity%22,%22direction%22:%22desc%22%7D%5D%7D HTTP/1.1" 200 14095 0.386380

abdoqaidhaidar commented 1 month ago

hey mother fu**r why you don't answer

kar33m01 commented 1 month ago

Not in this way.

kar33m01 commented 1 month ago

abdoqaidhaidar, your language is bad... so keep it within limits.

abdoqaidhaidar commented 1 month ago

STFU

ezk06eer commented 1 month ago

> Thank you for the reply!
>
> faraday-agent-dispatcher==3.2.1 faraday-agent-parameters-types==1.5.1 faraday-cli==2.1.8 faraday-plugins==1.17.0 faradaysec==5.2.2
>
> These are the results, and the logs are empty.

Hi @abdoqaidhaidar, let's keep it professional and with the highest respect for all contributors.

Your faraday-cli version is outdated; please uninstall it and reinstall:

    pip install --upgrade --force-reinstall faraday-cli

docs: Faraday-Cli
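Once reinstalled, a quick sanity check (assuming a standard pip environment) looks like this:

```sh
# confirm the upgrade took effect; the version should be 2.1.11 or newer
pip show faraday-cli | grep -i '^version'
```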

Also, please try to share all the info you have about your deployment. Are you using Kubernetes/Docker? Are you running your database locally?
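For instance (just a sketch; adjust to your setup), the output of something like this already tells us a lot:

```sh
docker ps                  # empty if nothing is containerized
pg_isready -h 127.0.0.1    # whether the local PostgreSQL accepts connections
```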

Keep in mind we do have a bug report template when you create an issue; choose it and follow the template as it is.

It seems you have not installed Redis, or it is shut down.

Requirements are: Redis and PostgreSQL 9+.
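The logs above point at Redis refusing connections on 127.0.0.1:6379, so a rough check on a systemd-based install would be something like this (service and package names vary by distribution, so treat it as a sketch):

```sh
# Redis should answer PONG; a refused connection matches the upload_report 500 errors
redis-cli ping

# start the services if they are down ("redis-server" on Debian/Ubuntu, "redis" on others)
sudo systemctl enable --now redis-server
sudo systemctl status postgresql
```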

Template:

Please search the [Wiki](https://github.com/infobyte/faraday/wiki) for a solution before posting a ticket. Use the **"New Support Request"** button to the right of the screen to submit a ticket for technical support.

## Issue Type
 - Bug Report
 - Feature Idea
 - Documentation Report

## Faraday version

Paste the output of the *./faraday.py --version* command

## Component Name

If you know where the problem lies, indicate it:
WebGui/GTKGui/Plugin/Console/Continuous Scanning/Etc.

## Steps to reproduce

Provide detailed steps on how the issue happened so we can try to reproduce it. If the issue is random, please provide as much information as possible.

## Expected results

What did you expect to happen when following the steps above?

### Debugging tracebacks (current results)

Try to reproduce the bug with the server and/or gtk client in debug mode and check the logs for the ERROR string.
Add here any errors you find while running in debug mode or, if possible, Faraday’s log files (located at *$HOME/.faraday/logs/*).

If you need help on how to execute in debug mode [click here for more information](https://github.com/infobyte/faraday/wiki/troubleshooting).

Please attach the result of:

pip freeze > requirements_freeze.txt

### Screenshots

If you don't find anything on the logs, please provide screenshots of the error.

## Environment information

### Configuration files

Mention any settings you have changed/added/removed.

### Reports/Extra data

If you are having issues with plugins, please attach relevant files if possible.
(strip your reports of all sensitive information beforehand).

### OS

Provide information on your operating system. Example:

    $ cat /etc/lsb-release
    DISTRIB_ID=Ubuntu
    DISTRIB_RELEASE=16.10
    DISTRIB_CODENAME=yakkety
    DISTRIB_DESCRIPTION="Ubuntu 16.10"

If you want to try it locally, the easiest way to install Faraday is docker-compose up, right after downloading the source.

If you are going to reinstall Faraday, do a fresh install and don't forget to delete the .faraday folder in your $HOME.
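Roughly, the local trial and the cleanup look like this (assuming the docker-compose file included with the source, as mentioned above):

```sh
# quickest local trial: bring the stack up with Docker Compose
git clone https://github.com/infobyte/faraday.git
cd faraday
docker-compose up -d

# before a fresh reinstall, remove the old local state
rm -rf ~/.faraday
```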

Have an awesome day.