yogeshojha / rengine

reNgine is an automated reconnaissance framework for web applications, focused on a highly configurable, streamlined recon process via Engines, recon data correlation and organization, continuous monitoring, database-backed storage, and a simple yet intuitive user interface. reNgine makes it easy for penetration testers to gather reconnaissance with minimal configuration, and its correlation features make recon effortless.
https://yogeshojha.github.io/rengine/
GNU General Public License v3.0

Bug - Scan failed #247

Closed GabrielMioranza closed 3 years ago

GabrielMioranza commented 4 years ago

Issue Summary

The scan fails on every task.

image

Steps to Reproduce

  1. Start a scan on any domain; every task fails.

Any other relevant information. For example, why do you consider this a bug and what did you expect to happen instead?

Technical details

Please list out any technical details such as operating environment.

Ubuntu 20.04

issue-label-bot[bot] commented 4 years ago

Issue-Label Bot is automatically applying the label bug to this issue, with a confidence of 0.97. Please mark this comment with :thumbsup: or :thumbsdown: to give our bot feedback!

Links: app homepage, dashboard and code for this bot.

ocmcc commented 3 years ago

Issue Summary

The scan fails on every task.

image

Steps to Reproduce

  1. Start a scan on any domain; every task fails.

Any other relevant information. For example, why do you consider this a bug and what did you expect to happen instead?

  • I have confirmed that this issue can be reproduced as described on the latest version/pull of reNgine: (yes / no) - Yes. How can I get the logs and see what is happening?

Technical details

Please list out any technical details such as operating environment.

Ubuntu 20.04

I hit the same problem on Ubuntu 20.04. Have you solved it?

yogeshojha commented 3 years ago

Can you please attach the logs?

cd rengine
make logs

ocmcc commented 3 years ago

I used "oppo.com" as the target, scan type Full Scan, on an Ubuntu 20.04 virtual machine. The logs:

COMPOSE_DOCKER_CLI_BUILD=1 docker-compose -f docker-compose.yml logs --follow --tail=1000 db web proxy redis celery celery-beat
Attaching to rengine_proxy_1, rengine_celery_1, rengine_web_1, rengine_celery-beat_1, rengine_db_1, rengine_redis_1
celery-beat_1 | Waiting for postgres...
celery-beat_1 | PostgreSQL started
celery-beat_1 | Operations to perform:
celery-beat_1 |   Apply all migrations: admin, auth, contenttypes, django_celery_beat, notification, scanEngine, sessions, startScan, targetApp
celery-beat_1 | Running migrations:
celery-beat_1 | Traceback (most recent call last):
celery-beat_1 |   File "/usr/local/lib/python3.8/site-packages/django/db/backends/utils.py", line 84, in _execute
celery-beat_1 |     return self.cursor.execute(sql)
celery-beat_1 | psycopg2.errors.UniqueViolation: duplicate key value violates unique constraint "pg_type_typname_nsp_index"
celery-beat_1 | DETAIL:  Key (typname, typnamespace)=(auth_permission_id_seq, 2200) already exists.
celery-beat_1 |
celery-beat_1 | The above exception was the direct cause of the following exception:
celery-beat_1 |
celery-beat_1 | Traceback (most recent call last):
celery-beat_1 |   File "manage.py", line 21, in <module>
celery-beat_1 |     main()
celery-beat_1 |   File "manage.py", line 17, in main
celery-beat_1 |     execute_from_command_line(sys.argv)
celery-beat_1 |   File "/usr/local/lib/python3.8/site-packages/django/core/management/__init__.py", line 401, in execute_from_command_line
celery-beat_1 |     utility.execute()
celery-beat_1 |   File "/usr/local/lib/python3.8/site-packages/django/core/management/__init__.py", line 395, in execute
celery-beat_1 |     self.fetch_command(subcommand).run_from_argv(self.argv)
celery-beat_1 |   File "/usr/local/lib/python3.8/site-packages/django/core/management/base.py", line 328, in run_from_argv
celery-beat_1 |     self.execute(*args, **cmd_options)
celery-beat_1 |   File "/usr/local/lib/python3.8/site-packages/django/core/management/base.py", line 369, in execute
celery-beat_1 |     output = self.handle(*args, **options)
celery-beat_1 |   File "/usr/local/lib/python3.8/site-packages/django/core/management/base.py", line 83, in wrapped
celery-beat_1 |     res = handle_func(*args, **kwargs)
celery-beat_1 |   File "/usr/local/lib/python3.8/site-packages/django/core/management/commands/migrate.py", line 231, in handle
celery-beat_1 |     post_migrate_state = executor.migrate(
celery-beat_1 |   File "/usr/local/lib/python3.8/site-packages/django/db/migrations/executor.py", line 117, in migrate
celery-beat_1 |     state = self._migrate_all_forwards(state, plan, full_plan, fake=fake, fake_initial=fake_initial)
celery-beat_1 |   File "/usr/local/lib/python3.8/site-packages/django/db/migrations/executor.py", line 147, in _migrate_all_forwards
celery-beat_1 |     state = self.apply_migration(state, migration, fake=fake, fake_initial=fake_initial)
celery-beat_1 |   File "/usr/local/lib/python3.8/site-packages/django/db/migrations/executor.py", line 245, in apply_migration
celery-beat_1 |     state = migration.apply(state, schema_editor)
celery-beat_1 |   File "/usr/local/lib/python3.8/site-packages/django/db/migrations/migration.py", line 124, in apply
celery-beat_1 |     operation.database_forwards(self.app_label, schema_editor, old_state, project_state)
celery-beat_1 |   File "/usr/local/lib/python3.8/site-packages/django/db/migrations/operations/models.py", line 92, in database_forwards
celery-beat_1 |     schema_editor.create_model(model)
celery-beat_1 |   File "/usr/local/lib/python3.8/site-packages/django/db/backends/base/schema.py", line 324, in create_model
celery-beat_1 |     self.execute(sql, params or None)
celery-beat_1 |   File "/usr/local/lib/python3.8/site-packages/django/db/backends/base/schema.py", line 142, in execute
celery-beat_1 |     cursor.execute(sql, params)
celery-beat_1 |   File "/usr/local/lib/python3.8/site-packages/django/db/backends/utils.py", line 100, in execute
celery-beat_1 |     return super().execute(sql, params)
celery-beat_1 |   File "/usr/local/lib/python3.8/site-packages/django/db/backends/utils.py", line 68, in execute
celery-beat_1 |     return self._execute_with_wrappers(sql, params, many=False, executor=self._execute)
celery-beat_1 |   File "/usr/local/lib/python3.8/site-packages/django/db/backends/utils.py", line 77, in _execute_with_wrappers
celery-beat_1 |     return executor(sql, params, many, context)
celery-beat_1 |   File "/usr/local/lib/python3.8/site-packages/django/db/backends/utils.py", line 86, in _execute
celery-beat_1 |     return self.cursor.execute(sql, params)
celery-beat_1 |   File "/usr/local/lib/python3.8/site-packages/django/db/utils.py", line 90, in __exit__
celery-beat_1 |     raise dj_exc_value.with_traceback(traceback) from exc_value
celery-beat_1 |   File "/usr/local/lib/python3.8/site-packages/django/db/backends/utils.py", line 84, in _execute
celery-beat_1 |     return self.cursor.execute(sql)
celery-beat_1 | django.db.utils.IntegrityError: duplicate key value violates unique constraint "pg_type_typname_nsp_index"
celery-beat_1 | DETAIL:  Key (typname, typnamespace)=(auth_permission_id_seq, 2200) already exists.
celery-beat_1 |
celery-beat_1 |   Applying auth.0001_initial...Installed 3 object(s) from 1 fixture(s)
celery-beat_1 | celery beat v4.4.7 (cliffs) is starting.
celery-beat_1 | (celery beat ASCII banner)
celery-beat_1 | LocalTime -> 2020-11-23 10:57:16
celery-beat_1 | Configuration ->
celery-beat_1 |     . broker -> redis://redis:6379//
celery-beat_1 |     . loader -> celery.loaders.app.AppLoader
celery-beat_1 |     . scheduler -> django_celery_beat.schedulers.DatabaseScheduler
celery-beat_1 |     . logfile -> [stderr]@%INFO
celery-beat_1 |     . maxinterval -> 5.00 seconds (5s)
celery-beat_1 | [2020-11-23 10:57:16,374: INFO/MainProcess] beat: Starting...
celery-beat_1 | [2020-11-23 10:57:16,374: INFO/MainProcess] Writing entries...
celery-beat_1 | [2020-11-23 10:57:21,431: INFO/MainProcess] Writing entries...
db_1 | The files belonging to this database system will be owned by user "postgres".
db_1 | This user must also own the server process.
db_1 |
db_1 | The database cluster will be initialized with locale "en_US.utf8".
db_1 | The default database encoding has accordingly been set to "UTF8".
db_1 | The default text search configuration will be set to "english".
db_1 |
db_1 | Data page checksums are disabled.
db_1 |
db_1 | fixing permissions on existing directory /var/lib/postgresql/data ... ok
db_1 | creating subdirectories ... ok
db_1 | selecting dynamic shared memory implementation ... posix
db_1 | selecting default max_connections ... 100
db_1 | selecting default shared_buffers ... 128MB
db_1 | selecting default time zone ... UTC
db_1 | creating configuration files ... ok
db_1 | running bootstrap script ... ok
db_1 | sh: locale: not found
db_1 | 2020-11-23 10:57:10.135 UTC [29] WARNING:  no usable system locales were found
db_1 | performing post-bootstrap initialization ... ok
db_1 | syncing data to disk ... ok
db_1 |
db_1 | Success. You can now start the database server using:
db_1 |
db_1 |     pg_ctl -D /var/lib/postgresql/data -l logfile start
db_1 |
db_1 | initdb: warning: enabling "trust" authentication for local connections
db_1 | You can change this by editing pg_hba.conf or using the option -A, or
db_1 | --auth-local and --auth-host, the next time you run initdb.
db_1 | waiting for server to start....2020-11-23 10:57:10.705 UTC [34] LOG:  starting PostgreSQL 12.3 on x86_64-pc-linux-musl, compiled by gcc (Alpine 9.3.0) 9.3.0, 64-bit
db_1 | 2020-11-23 10:57:10.707 UTC [34] LOG:  listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"
db_1 | 2020-11-23 10:57:10.726 UTC [35] LOG:  database system was shut down at 2020-11-23 10:57:10 UTC
db_1 | 2020-11-23 10:57:10.729 UTC [34] LOG:  database system is ready to accept connections
db_1 | done
db_1 | server started
db_1 |
db_1 | /usr/local/bin/docker-entrypoint.sh: ignoring /docker-entrypoint-initdb.d/
db_1 |
db_1 | waiting for server to shut down....2020-11-23 10:57:10.792 UTC [34] LOG:  received fast shutdown request
db_1 | 2020-11-23 10:57:10.794 UTC [34] LOG:  aborting any active transactions
db_1 | 2020-11-23 10:57:10.795 UTC [34] LOG:  background worker "logical replication launcher" (PID 41) exited with exit code 1
db_1 | 2020-11-23 10:57:10.795 UTC [36] LOG:  shutting down
db_1 | 2020-11-23 10:57:10.805 UTC [34] LOG:  database system is shut down
db_1 | done
db_1 | server stopped
db_1 |
db_1 | PostgreSQL init process complete; ready for start up.
db_1 |
db_1 | 2020-11-23 10:57:10.901 UTC [1] LOG:  starting PostgreSQL 12.3 on x86_64-pc-linux-musl, compiled by gcc (Alpine 9.3.0) 9.3.0, 64-bit
db_1 | 2020-11-23 10:57:10.902 UTC [1] LOG:  listening on IPv4 address "0.0.0.0", port 5432
db_1 | 2020-11-23 10:57:10.902 UTC [1] LOG:  listening on IPv6 address "::", port 5432
db_1 | 2020-11-23 10:57:10.903 UTC [1] LOG:  listening on Unix socket "/var/run/postgresql/.s.PGSQL.5432"
db_1 | 2020-11-23 10:57:10.919 UTC [43] LOG:  database system was shut down at 2020-11-23 10:57:10 UTC
db_1 | 2020-11-23 10:57:10.921 UTC [1] LOG:  database system is ready to accept connections
db_1 | 2020-11-23 10:57:12.627 UTC [54] ERROR:  relation "django_migrations" already exists
db_1 | 2020-11-23 10:57:12.627 UTC [54] STATEMENT:  CREATE TABLE "django_migrations" ("id" serial NOT NULL PRIMARY KEY, "app" varchar(255) NOT NULL, "name" varchar(255) NOT NULL, "applied" timestamp with time zone NOT NULL)
db_1 | 2020-11-23 10:57:12.726 UTC [55] ERROR:  duplicate key value violates unique constraint "pg_type_typname_nsp_index"
db_1 | 2020-11-23 10:57:12.726 UTC [55] DETAIL:  Key (typname, typnamespace)=(auth_permission_id_seq, 2200) already exists.
db_1 | 2020-11-23 10:57:12.726 UTC [55] STATEMENT:  CREATE TABLE "auth_permission" ("id" serial NOT NULL PRIMARY KEY, "name" varchar(50) NOT NULL, "content_type_id" integer NOT NULL, "codename" varchar(100) NOT NULL)
redis_1 | 1:C 23 Nov 2020 10:57:09.569 # oO0OoO0OoO0Oo Redis is starting oO0OoO0OoO0Oo
redis_1 | 1:C 23 Nov 2020 10:57:09.569 # Redis version=6.0.9, bits=64, commit=00000000, modified=0, pid=1, just started
redis_1 | 1:C 23 Nov 2020 10:57:09.569 # Warning: no config file specified, using the default config. In order to specify a config file use redis-server /path/to/redis.conf
redis_1 | 1:M 23 Nov 2020 10:57:09.571 * Running mode=standalone, port=6379.
redis_1 | 1:M 23 Nov 2020 10:57:09.571 # Server initialized
redis_1 | 1:M 23 Nov 2020 10:57:09.571 # WARNING overcommit_memory is set to 0! Background save may fail under low memory condition. To fix this issue add 'vm.overcommit_memory = 1' to /etc/sysctl.conf and then reboot or run the command 'sysctl vm.overcommit_memory=1' for this to take effect.
redis_1 | 1:M 23 Nov 2020 10:57:09.571 * Ready to accept connections
celery_1 | Waiting for postgres...
celery_1 | PostgreSQL started
celery_1 | Operations to perform:
celery_1 |   Apply all migrations: admin, auth, contenttypes, django_celery_beat, notification, scanEngine, sessions, startScan, targetApp
celery_1 | Running migrations:
celery_1 |   Applying contenttypes.0001_initial... OK
celery_1 |   Applying auth.0001_initial... OK
celery_1 |   Applying admin.0001_initial... OK
celery_1 |   Applying admin.0002_logentry_remove_auto_add... OK
celery_1 |   Applying admin.0003_logentry_add_action_flag_choices... OK
celery_1 |   Applying contenttypes.0002_remove_content_type_name... OK
celery_1 |   Applying auth.0002_alter_permission_name_max_length... OK
celery_1 |   Applying auth.0003_alter_user_email_max_length... OK
celery_1 |   Applying auth.0004_alter_user_username_opts... OK
celery_1 |   Applying auth.0005_alter_user_last_login_null... OK
celery_1 |   Applying auth.0006_require_contenttypes_0002... OK
celery_1 |   Applying auth.0007_alter_validators_add_error_messages... OK
celery_1 |   Applying auth.0008_alter_user_username_max_length... OK
celery_1 |   Applying auth.0009_alter_user_last_name_max_length... OK
celery_1 |   Applying auth.0010_alter_group_name_max_length... OK
celery_1 |   Applying auth.0011_update_proxy_permissions... OK
celery_1 |   Applying django_celery_beat.0001_initial... OK
celery_1 |   Applying django_celery_beat.0002_auto_20161118_0346... OK
celery_1 |   Applying django_celery_beat.0003_auto_20161209_0049... OK
celery_1 |   Applying django_celery_beat.0004_auto_20170221_0000... OK
celery_1 |   Applying django_celery_beat.0005_add_solarschedule_events_choices... OK
celery_1 |   Applying django_celery_beat.0006_auto_20180322_0932... OK
celery_1 |   Applying django_celery_beat.0007_auto_20180521_0826... OK
celery_1 |   Applying django_celery_beat.0008_auto_20180914_1922... OK
celery_1 |   Applying django_celery_beat.0006_auto_20180210_1226... OK
celery_1 |   Applying django_celery_beat.0006_periodictask_priority... OK
celery_1 |   Applying django_celery_beat.0009_periodictask_headers... OK
celery_1 |   Applying django_celery_beat.0010_auto_20190429_0326... OK
celery_1 |   Applying django_celery_beat.0011_auto_20190508_0153... OK
celery_1 |   Applying django_celery_beat.0012_periodictask_expire_seconds... OK
celery_1 |   Applying notification.0001_initial... OK
celery_1 |   Applying scanEngine.0001_initial... OK
celery_1 |   Applying scanEngine.0002_auto_20200717_1414... OK
celery_1 |   Applying scanEngine.0003_wordlist... OK
celery_1 |   Applying scanEngine.0004_wordlist_short_name... OK
celery_1 |   Applying scanEngine.0005_auto_20200718_0407... OK
celery_1 |   Applying scanEngine.0006_auto_20200718_0429... OK
celery_1 |   Applying scanEngine.0007_remove_wordlist_path... OK
celery_1 |   Applying scanEngine.0008_configuration... OK
celery_1 |   Applying scanEngine.0009_auto_20200725_1822... OK
celery_1 |   Applying sessions.0001_initial... OK
celery_1 |   Applying targetApp.0001_initial... OK
celery_1 |   Applying startScan.0001_initial... OK
celery_1 |   Applying startScan.0002_scanhistory_celery_id... OK
celery_1 |   Applying startScan.0003_waybackendpoint_content_type... OK
celery_1 |   Applying startScan.0004_scannedhost_is_ip_cdn... OK
celery_1 |   Applying startScan.0005_scannedhost_cname... OK
celery_1 | Installed 3 object(s) from 1 fixture(s)
celery_1 | /usr/local/lib/python3.8/site-packages/celery/platforms.py:800: RuntimeWarning: You're running the worker with superuser privileges: this is
celery_1 | absolutely not recommended!
celery_1 |
celery_1 | Please specify a different user using the --uid option.
celery_1 |
celery_1 | User information: uid=0 euid=0 gid=0 egid=0
celery_1 |
celery_1 |   warnings.warn(RuntimeWarning(ROOT_DISCOURAGED.format(
celery_1 |
celery_1 |  -------------- celery@03d3c9c207bb v4.4.7 (cliffs)
celery_1 | --- ***** -----
celery_1 | -- ******* ---- Linux-5.4.0-53-generic-x86_64-with 2020-11-23 10:57:17
celery_1 | - *** --- * ---
celery_1 | - ** ---------- [config]
celery_1 | - ** ---------- .> app:         reNgine:0x7f1a6da56ac0
celery_1 | - ** ---------- .> transport:   redis://redis:6379//
celery_1 | - ** ---------- .> results:     redis://redis:6379/
celery_1 | - *** --- * --- .> concurrency: 4 (prefork)
celery_1 | -- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
celery_1 | --- ***** -----
celery_1 |  -------------- [queues]
celery_1 |                 .> celery           exchange=celery(direct) key=celery
celery_1 |
celery_1 |
celery_1 | [tasks]
celery_1 |   . reNgine.tasks.doScan
celery_1 |   . reNgine.tasks.test_task
celery_1 |
celery_1 | [2020-11-23 10:57:17,427: INFO/MainProcess] Connected to redis://redis:6379//
celery_1 | [2020-11-23 10:57:17,435: INFO/MainProcess] mingle: searching for neighbors
celery_1 | [2020-11-23 10:57:18,456: INFO/MainProcess] mingle: all alone
celery_1 | [2020-11-23 10:57:18,473: WARNING/MainProcess] /usr/local/lib/python3.8/site-packages/celery/fixups/django.py:205: UserWarning: Using settings.DEBUG leads to a memory
celery_1 | leak, never use this setting in production environments!
celery_1 |   warnings.warn('''Using settings.DEBUG leads to a memory
celery_1 | [2020-11-23 10:57:18,473: INFO/MainProcess] celery@03d3c9c207bb ready.
web_1 | Waiting for postgres...
web_1 | PostgreSQL started
web_1 | Operations to perform:
web_1 |   Apply all migrations: admin, auth, contenttypes, django_celery_beat, notification, scanEngine, sessions, startScan, targetApp
web_1 | Running migrations:
web_1 | Traceback (most recent call last):
web_1 |   File "/usr/local/lib/python3.8/site-packages/django/db/backends/utils.py", line 84, in _execute
web_1 |     return self.cursor.execute(sql)
web_1 | psycopg2.errors.DuplicateTable: relation "django_migrations" already exists
web_1 |
web_1 | The above exception was the direct cause of the following exception:
web_1 |
web_1 | Traceback (most recent call last):
web_1 |   File "/usr/local/lib/python3.8/site-packages/django/db/migrations/recorder.py", line 67, in ensure_schema
web_1 |     editor.create_model(self.Migration)
web_1 |   File "/usr/local/lib/python3.8/site-packages/django/db/backends/base/schema.py", line 324, in create_model
web_1 |     self.execute(sql, params or None)
web_1 |   File "/usr/local/lib/python3.8/site-packages/django/db/backends/base/schema.py", line 142, in execute
web_1 |     cursor.execute(sql, params)
web_1 |   File "/usr/local/lib/python3.8/site-packages/django/db/backends/utils.py", line 68, in execute
web_1 |     return self._execute_with_wrappers(sql, params, many=False, executor=self._execute)
web_1 |   File "/usr/local/lib/python3.8/site-packages/django/db/backends/utils.py", line 77, in _execute_with_wrappers
web_1 |     return executor(sql, params, many, context)
web_1 |   File "/usr/local/lib/python3.8/site-packages/django/db/backends/utils.py", line 86, in _execute
web_1 |     return self.cursor.execute(sql, params)
web_1 |   File "/usr/local/lib/python3.8/site-packages/django/db/utils.py", line 90, in __exit__
web_1 |     raise dj_exc_value.with_traceback(traceback) from exc_value
web_1 |   File "/usr/local/lib/python3.8/site-packages/django/db/backends/utils.py", line 84, in _execute
web_1 |     return self.cursor.execute(sql)
web_1 | django.db.utils.ProgrammingError: relation "django_migrations" already exists
web_1 |
web_1 | During handling of the above exception, another exception occurred:
web_1 |
web_1 | Traceback (most recent call last):
web_1 |   File "manage.py", line 21, in <module>
web_1 |     main()
web_1 |   File "manage.py", line 17, in main
web_1 |     execute_from_command_line(sys.argv)
web_1 |   File "/usr/local/lib/python3.8/site-packages/django/core/management/__init__.py", line 401, in execute_from_command_line
web_1 |     utility.execute()
web_1 |   File "/usr/local/lib/python3.8/site-packages/django/core/management/__init__.py", line 395, in execute
web_1 |     self.fetch_command(subcommand).run_from_argv(self.argv)
web_1 |   File "/usr/local/lib/python3.8/site-packages/django/core/management/base.py", line 328, in run_from_argv
web_1 |     self.execute(*args, **cmd_options)
web_1 |   File "/usr/local/lib/python3.8/site-packages/django/core/management/base.py", line 369, in execute
web_1 |     output = self.handle(*args, **options)
web_1 |   File "/usr/local/lib/python3.8/site-packages/django/core/management/base.py", line 83, in wrapped
web_1 |     res = handle_func(*args, **kwargs)
web_1 |   File "/usr/local/lib/python3.8/site-packages/django/core/management/commands/migrate.py", line 231, in handle
web_1 |     post_migrate_state = executor.migrate(
web_1 |   File "/usr/local/lib/python3.8/site-packages/django/db/migrations/executor.py", line 91, in migrate
web_1 |     self.recorder.ensure_schema()
web_1 |   File "/usr/local/lib/python3.8/site-packages/django/db/migrations/recorder.py", line 69, in ensure_schema
web_1 |     raise MigrationSchemaMissing("Unable to create the django_migrations table (%s)" % exc)
web_1 | django.db.migrations.exceptions.MigrationSchemaMissing: Unable to create the django_migrations table (relation "django_migrations" already exists
web_1 | )
web_1 | Installed 3 object(s) from 1 fixture(s)
web_1 | [2020-11-23 10:57:15 +0000] [1] [INFO] Starting gunicorn 20.0.4
web_1 | [2020-11-23 10:57:15 +0000] [1] [INFO] Listening at: http://0.0.0.0:8000 (1)
web_1 | [2020-11-23 10:57:15 +0000] [1] [INFO] Using worker: sync
web_1 | [2020-11-23 10:57:15 +0000] [20] [INFO] Booting worker with pid: 20
proxy_1 | /docker-entrypoint.sh: /docker-entrypoint.d/ is not empty, will attempt to perform configuration
proxy_1 | /docker-entrypoint.sh: Looking for shell scripts in /docker-entrypoint.d/
proxy_1 | /docker-entrypoint.sh: Launching /docker-entrypoint.d/10-listen-on-ipv6-by-default.sh
proxy_1 | 10-listen-on-ipv6-by-default.sh: Getting the checksum of /etc/nginx/conf.d/default.conf
proxy_1 | 10-listen-on-ipv6-by-default.sh: Enabled listen on IPv6 in /etc/nginx/conf.d/default.conf
proxy_1 | /docker-entrypoint.sh: Launching /docker-entrypoint.d/20-envsubst-on-templates.sh
proxy_1 | /docker-entrypoint.sh: Configuration complete; ready for start up
proxy_1 | 172.21.0.1 - - [23/Nov/2020:10:57:34 +0000] "GET / HTTP/2.0" 302 0 "https://127.0.0.1/start_scan/detail/1" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:83.0) Gecko/20100101 Firefox/83.0" "-"
proxy_1 | 172.21.0.1 - - [23/Nov/2020:10:57:34 +0000] "GET /login/?next=/ HTTP/2.0" 200 4288 "https://127.0.0.1/start_scan/detail/1" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:83.0) Gecko/20100101 Firefox/83.0" "-"
proxy_1 | 172.21.0.1 - - [23/Nov/2020:10:57:34 +0000] "GET /staticfiles/assets/js/libs/jquery-3.1.1.min.js HTTP/2.0" 304 0 "https://127.0.0.1/login/?next=/" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:83.0) Gecko/20100101 Firefox/83.0" "-"
proxy_1 | 172.21.0.1 - - [23/Nov/2020:10:57:34 +0000] "GET /staticfiles/bootstrap/js/popper.min.js HTTP/2.0" 304 0 "https://127.0.0.1/login/?next=/" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:83.0) Gecko/20100101 Firefox/83.0" "-"
proxy_1 | 172.21.0.1 - - [23/Nov/2020:10:57:34 +0000] "GET /staticfiles/bootstrap/js/bootstrap.min.js HTTP/2.0" 304 0 "https://127.0.0.1/login/?next=/" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:83.0) Gecko/20100101 Firefox/83.0" "-"
proxy_1 | 172.21.0.1 - - [23/Nov/2020:10:57:34 +0000] "GET /staticfiles/img/logo.png HTTP/2.0" 304 0 "https://127.0.0.1/login/?next=/" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:83.0) Gecko/20100101 Firefox/83.0" "-"
proxy_1 | 172.21.0.1 - - [23/Nov/2020:10:57:34 +0000] "GET /staticfiles/assets/css/authentication/form-2.css HTTP/2.0" 304 0 "https://127.0.0.1/login/?next=/" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:83.0) Gecko/20100101 Firefox/83.0" "-"
proxy_1 | 172.21.0.1 - - [23/Nov/2020:10:57:34 +0000] "GET /staticfiles/assets/css/forms/switches.css HTTP/2.0" 304 0 "https://127.0.0.1/login/?next=/" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:83.0) Gecko/20100101 Firefox/83.0" "-"
proxy_1 | 172.21.0.1 - - [23/Nov/2020:10:57:34 +0000] "GET /staticfiles/assets/js/authentication/form-2.js HTTP/2.0" 304 0 "https://127.0.0.1/login/?next=/" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:83.0) Gecko/20100101 Firefox/83.0" "-"
proxy_1 | 172.21.0.1 - - [23/Nov/2020:10:57:34 +0000] "GET /favicon.ico HTTP/2.0" 404 179 "https://127.0.0.1/login/?next=/" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:83.0) Gecko/20100101 Firefox/83.0" "-"
proxy_1 | 172.21.0.1 - - [23/Nov/2020:10:57:39 +0000] "POST /login/?next=/ HTTP/2.0" 302 0 "https://127.0.0.1/login/?next=/" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:83.0) Gecko/20100101 Firefox/83.0" "-"
proxy_1 | 172.21.0.1 - - [23/Nov/2020:10:57:39 +0000] "GET / HTTP/2.0" 200 27869 "https://127.0.0.1/login/?next=/" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:83.0) Gecko/20100101 Firefox/83.0" "-"
proxy_1 | 172.21.0.1 - - [23/Nov/2020:10:57:39 +0000] "GET /favicon.ico HTTP/2.0" 404 179 "https://127.0.0.1/" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:83.0) Gecko/20100101 Firefox/83.0" "-"
proxy_1 | 172.21.0.1 - - [23/Nov/2020:10:58:13 +0000] "GET /target/add/ HTTP/2.0" 200 21696 "https://127.0.0.1/" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:83.0) Gecko/20100101 Firefox/83.0" "-"
proxy_1 | 172.21.0.1 - - [23/Nov/2020:10:58:22 +0000] "POST /target/add/ HTTP/2.0" 302 0 "https://127.0.0.1/target/add/" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:83.0) Gecko/20100101 Firefox/83.0" "-"
proxy_1 | 172.21.0.1 - - [23/Nov/2020:10:58:22 +0000] "GET /target/list/ HTTP/2.0" 200 30484 "https://127.0.0.1/target/add/" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:83.0) Gecko/20100101 Firefox/83.0" "-"
proxy_1 | 172.21.0.1 - - [23/Nov/2020:10:58:24 +0000] "GET /start_scan/start/1 HTTP/2.0" 200 42480 "https://127.0.0.1/target/list/" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:83.0) Gecko/20100101 Firefox/83.0" "-"
celery_1 | [2020-11-23 10:58:29,916: INFO/MainProcess] Received task: reNgine.tasks.doScan[7660a41c-6d3c-4faf-b780-bdac4259f6a7]
proxy_1 | 172.21.0.1 - - [23/Nov/2020:10:58:29 +0000] "POST /start_scan/start/1 HTTP/2.0" 302 0 "https://127.0.0.1/start_scan/start/1" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:83.0) Gecko/20100101 Firefox/83.0" "-"
proxy_1 | 172.21.0.1 - - [23/Nov/2020:10:58:29 +0000] "GET /start_scan/history/ HTTP/2.0" 200 28385 "https://127.0.0.1/start_scan/start/1" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:83.0) Gecko/20100101 Firefox/83.0" "-"
celery_1 | (subfinder ASCII-art banner) v2.4.5
celery_1 | projectdiscovery.io
celery_1 | [WRN] Use with caution. You are responsible for your actions
celery_1 | [WRN] Developers assume no liability and are not responsible for any misuse or damage.
celery_1 | [WRN] By using subfinder, you also agree to the terms of the APIs used.
celery_1 | [INF] Configuration file saved to /root/.config/subfinder/config.yaml
celery_1 | [INF] Enumerating subdomains for oppo.com
celery_1 | [INF] Found 83 subdomains for oppo.com in 30 seconds 5 milliseconds
celery_1 | (Sublist3r ASCII-art banner)
celery_1 | # Coded By Ahmed Aboul-Ela - @aboul3la
celery_1 | [-] Enumerating subdomains now for oppo.com
celery_1 | [-] Searching now in Baidu..
celery_1 | [-] Searching now in Yahoo..
celery_1 | [-] Searching now in Google..
celery_1 | [-] Searching now in Bing..
celery_1 | [-] Searching now in Ask..
celery_1 | [-] Searching now in Netcraft..
celery_1 | [-] Searching now in DNSdumpster..
celery_1 | [-] Searching now in Virustotal..
celery_1 | [-] Searching now in ThreatCrowd..
celery_1 | [-] Searching now in SSL Certificates..
celery_1 | [-] Searching now in PassiveDNS..
celery_1 | HTTPSConnectionPool(host='dnsdumpster.com', port=443): Max retries exceeded with url: / (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7fcff54d1e80>: Failed to establish a new connection: [Errno 111] Connection refused'))
celery_1 | Process DNSdumpster-8:
celery_1 | Traceback (most recent call last):
celery_1 |   File "/usr/local/lib/python3.8/multiprocessing/process.py", line 315, in _bootstrap
celery_1 |     self.run()
celery_1 |   File "/app/tools/Sublist3r/sublist3r.py", line 269, in run
celery_1 |     domain_list = self.enumerate()
celery_1 |   File "/app/tools/Sublist3r/sublist3r.py", line 649, in enumerate
celery_1 |     token = self.get_csrftoken(resp)
celery_1 |   File "/app/tools/Sublist3r/sublist3r.py", line 644, in get_csrftoken
celery_1 |     token = csrf_regex.findall(resp)[0]
celery_1 | TypeError: expected string or bytes-like object
celery_1 | Process GoogleEnum-4:
celery_1 | Traceback (most recent call last):
celery_1 |   File "/usr/local/lib/python3.8/multiprocessing/process.py", line 315, in _bootstrap
celery_1 |     self.run()
celery_1 |   File "/app/tools/Sublist3r/sublist3r.py", line 269, in run
celery_1 |     domain_list = self.enumerate()
celery_1 |   File "/app/tools/Sublist3r/sublist3r.py", line 240, in enumerate
celery_1 |     if not self.check_response_errors(resp):
celery_1 |   File "/app/tools/Sublist3r/sublist3r.py", line 304, in check_response_errors
celery_1 |     if (type(resp) is str or type(resp) is unicode) and 'Our systems have detected unusual traffic' in resp:
celery_1 | NameError: name 'unicode' is not defined
celery_1 | HTTPSConnectionPool(host='www.virustotal.com', port=443): Max retries exceeded with url: /ui/domains/oppo.com/subdomains (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7fcff54db490>: Failed to establish a new connection: [Errno 111] Connection refused'))
celery_1 | Process Virustotal-9:
celery_1 | Traceback (most recent call last):
celery_1 |   File "/usr/local/lib/python3.8/multiprocessing/process.py", line 315, in _bootstrap
celery_1 |     self.run()
celery_1 |   File "/app/tools/Sublist3r/sublist3r.py", line 269, in run
celery_1 |     domain_list = self.enumerate()
celery_1 |   File "/app/tools/Sublist3r/sublist3r.py", line 703, in enumerate
celery_1 |     resp = json.loads(resp)
celery_1 |   File "/usr/local/lib/python3.8/json/__init__.py", line 341, in loads
celery_1 |     raise TypeError(f'the JSON object must be str, bytes or bytearray, '
celery_1 | TypeError: the JSON object must be str, bytes or bytearray, not int
celery_1 | HTTPSConnectionPool(host='searchdns.netcraft.com', port=443): Max retries exceeded with url: /?restriction=site+ends+with&host=example.com (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x7fcff54dc640>: Failed to establish a new connection: [Errno 111] Connection refused'))
celery_1 | Process NetcraftEnum-7:
celery_1 | Traceback (most recent call last):
celery_1 |   File "/usr/local/lib/python3.8/multiprocessing/process.py", line 315, in _bootstrap
celery_1 |     self.run()
celery_1 |   File "/app/tools/Sublist3r/sublist3r.py", line 269, in run
celery_1 |     domain_list = self.enumerate()
celery_1 |   File "/app/tools/Sublist3r/sublist3r.py", line 570, in enumerate
celery_1 |     cookies = self.get_cookies(resp.headers)
celery_1 | AttributeError: 'NoneType' object has no attribute 'headers'
celery_1 | [-] Saving results to file: oppo.com_2020_11_23_10_58_29/from_sublister.txt
celery_1 | [-] Total Unique Subdomains Found: 19
celery_1 | myoppo.com
celery_1 | www.oppo.com
celery_1 | account.oppo.com
celery_1 | www.account.oppo.com
celery_1 | assorted.downloads.oppo.com
celery_1 | en.oppo.com
celery_1 | d.theme.exapi.oppo.com
celery_1 | expe1.oppo.com
celery_1 | www.expe1.oppo.com
celery_1 | mmevents-hd.oppo.com
celery_1 | mmmail.oppo.com
celery_1 | mx.oppo.com
celery_1 | www.mx.oppo.com
celery_1 | my.oppo.com
celery_1 | myevents-hd.oppo.com
celery_1 | o-doctor.oppo.com
celery_1 | oldcms.oppo.com
celery_1 | ruevents-hd.oppo.com
celery_1 | webexexpe.oppo.com
celery_1 |

celery_1 | / / celery1 | / \/ \/ \/ _ \/ // / celery1 | ////_,/_,/./_,_/ v1
celery_1 | celery_1 | projectdiscovery.io celery_1 | celery_1 | [WRN] Use with caution. You are responsible for your actions celery_1 | [WRN] Developers assume no liability and are not responsible for any misuse or damage. celery_1 | [INF] Using host business.oppo.com for enumeration celery_1 | [INF] Starting scan on host business.oppo.com (106.3.18.159) celery_1 | [INF] Using host bdev.oppo.com for enumeration celery_1 | [INF] Starting scan on host bdev.oppo.com (119.147.175.93) celery_1 | [INF] Using host career.oppo.com for enumeration celery_1 | [INF] Starting scan on host career.oppo.com (119.147.98.14) celery_1 | [INF] Using host assorted.downloads.oppo.com for enumeration celery_1 | [INF] Starting scan on host assorted.downloads.oppo.com (96.7.129.163) celery_1 | [INF] No ports found on business.oppo.com (106.3.18.159). Host seems down celery_1 | [INF] Found 1 ports on host bdev.oppo.com (119.147.175.93) celery_1 | {"host":"bdev.oppo.com","port":25} celery_1 | [INF] Found 1 ports on host career.oppo.com (119.147.98.14) celery_1 | {"host":"career.oppo.com","port":110} celery-beat_1 | [2020-11-23 11:00:21,814: INFO/MainProcess] Writing entries... celery_1 | [INF] No ports found on assorted.downloads.oppo.com (96.7.129.163). Host seems down celery_1 | [INF] Using host myoppo.com for enumeration celery_1 | [INF] Starting scan on host myoppo.com (106.3.18.183) celery_1 | {"host":"myoppo.com","port":110} celery_1 | {"host":"myoppo.com","port":25} celery_1 | [INF] Found 2 ports on host myoppo.com (106.3.18.183) celery_1 | [INF] Using host prehd.oppo.com for enumeration celery_1 | [INF] Starting scan on host prehd.oppo.com (106.3.18.243) celery_1 | [INF] Using host push.oppo.com for enumeration celery_1 | [INF] Starting scan on host push.oppo.com (36.110.222.129) celery_1 | {"host":"prehd.oppo.com","port":110} celery_1 | [INF] No ports found on push.oppo.com (36.110.222.129). 
Host seems down celery_1 | [INF] Found 1 ports on host prehd.oppo.com (106.3.18.243) celery_1 | [INF] Using host yihuan.oppo.com for enumeration celery_1 | [INF] Starting scan on host yihuan.oppo.com (36.110.222.91) celery_1 | [INF] Using host yun.oppo.com for enumeration celery_1 | [INF] Starting scan on host yun.oppo.com (106.3.18.171) celery_1 | [INF] Found 1 ports on host yihuan.oppo.com (36.110.222.91) celery_1 | {"host":"yihuan.oppo.com","port":110} celery_1 | [INF] No ports found on yun.oppo.com (106.3.18.171). Host seems down celery_1 | flag provided but not defined: -cdn celery_1 | Usage of httpx: celery_1 | -H value celery_1 | Custom Header celery_1 | -content-length celery_1 | Content Length celery_1 | -follow-host-redirects celery_1 | Only follow redirects on the same host celery_1 | -follow-redirects celery_1 | Follow Redirects celery_1 | -http-proxy string celery_1 | Http Proxy celery_1 | -json celery_1 | JSON Output celery_1 | -l string celery_1 | File containing domains celery_1 | -no-color celery_1 | No Color celery_1 | -o string celery_1 | File to write output to (optional) celery_1 | -ports value celery_1 | ports range (nmap syntax: eg 1,2-10,11) celery_1 | -response-in-json celery_1 | Server response directly in the tool output (-json only) celery_1 | -retries int celery_1 | Number of retries celery_1 | -silent celery_1 | Silent mode celery_1 | -status-code celery_1 | Extracts Status Code celery_1 | -store-response celery_1 | Store Response as domain.txt celery_1 | -store-response-dir string celery_1 | Store Response Directory (default current directory) (default ".") celery_1 | -threads int celery_1 | Number of threads (default 50) celery_1 | -timeout int celery_1 | Timeout in seconds (default 5) celery_1 | -title celery_1 | Extracts title celery_1 | -verbose celery_1 | Verbose Mode celery_1 | -version celery_1 | Show version of httpx celery_1 | -vhost celery_1 | Check for VHOSTs celery_1 | -web-server celery_1 | Prints out the Server header 
content celery_1 | -x string celery_1 | Request Method (default "GET") celery_1 | [2020-11-23 11:02:59,567: WARNING/ForkPoolWorker-4] ------------------------------ celery_1 | [2020-11-23 11:02:59,567: WARNING/ForkPoolWorker-4] [Errno 2] No such file or directory: '/app/tools/scan_results/oppo.com_2020_11_23_10_58_29/httpx.json' celery_1 | [2020-11-23 11:02:59,567: WARNING/ForkPoolWorker-4] ------------------------------ celery_1 | [2020-11-23 11:02:59,578: INFO/ForkPoolWorker-4] Task reNgine.tasks.doScan[7660a41c-6d3c-4faf-b780-bdac4259f6a7] succeeded in 269.65978124699905s: {'status': True} celery-beat_1 | [2020-11-23 11:03:22,157: INFO/MainProcess] Writing entries... celery-beat_1 | [2020-11-23 11:06:22,562: INFO/MainProcess] Writing entries... celery-beat_1 | [2020-11-23 11:09:22,981: INFO/MainProcess] Writing entries... celery-beat_1 | [2020-11-23 11:12:23,338: INFO/MainProcess] Writing entries...
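The line `flag provided but not defined: -cdn` in the log suggests the installed httpx binary is older than the version reNgine invokes it with, so httpx exits with its usage text, never writes `httpx.json`, and the subsequent step fails with `[Errno 2] No such file or directory`. A minimal sketch of checking a tool's usage text for a required flag before invoking it (hypothetical helper names, not reNgine's actual code):

```python
import subprocess


def supports_flag(usage_text, flag):
    """Return True if `flag` appears at the start of a line in a tool's
    usage output (the format httpx prints when given an unknown flag)."""
    return any(line.strip().startswith(flag) for line in usage_text.splitlines())


def httpx_supports(flag="-cdn"):
    # Requires httpx on PATH; httpx prints its usage on -h.
    out = subprocess.run(["httpx", "-h"], capture_output=True, text=True)
    return supports_flag(out.stdout + out.stderr, flag)
```

If `httpx_supports("-cdn")` is False on an affected install, updating httpx (or rebuilding the Docker images) should resolve this particular failure.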

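The `TypeError: the JSON object must be str, bytes or bytearray, not int` traceback comes from Sublist3r passing whatever its request helper returned straight to `json.loads()`; on a failed request that value is evidently an integer rather than response text. A defensive sketch of the failing call (hypothetical helper name, not a patch to Sublist3r itself):

```python
import json


def safe_json_loads(resp):
    """Parse JSON, returning None when `resp` is not text (e.g. an integer
    error code returned by a failed request helper) or is not valid JSON."""
    if not isinstance(resp, (str, bytes, bytearray)):
        return None
    try:
        return json.loads(resp)
    except json.JSONDecodeError:
        return None
```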
tbela99 commented 3 years ago

I have the same issue with an invalid (misconfigured) TLS certificate. There is no way to ignore TLS errors.
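For reference, "ignoring TLS errors" in a Python-based fetch would look something like the sketch below, using only the standard library. This is hypothetical, not reNgine's actual code path; it illustrates the opt-in the tools would need to expose, since disabling verification should never be the default:

```python
import ssl
import urllib.request


def insecure_context():
    """Build an SSL context that skips hostname and certificate checks.
    Only for targets whose certificate is known to be misconfigured."""
    ctx = ssl.create_default_context()
    ctx.check_hostname = False       # must be disabled before verify_mode
    ctx.verify_mode = ssl.CERT_NONE  # accept invalid/self-signed certs
    return ctx


def fetch_ignoring_tls(url, timeout=5):
    # Hypothetical helper: fetch a URL even when its TLS cert is invalid.
    return urllib.request.urlopen(url, timeout=timeout, context=insecure_context())
```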