yogeshojha / rengine

reNgine is an automated reconnaissance framework for web applications, focused on a highly configurable, streamlined recon process via Engines, recon data correlation and organization, continuous monitoring, a database-backed store, and a simple yet intuitive user interface. reNgine makes it easy for penetration testers to gather reconnaissance with minimal configuration, and its correlation features make recon effortless.
https://yogeshojha.github.io/rengine/
GNU General Public License v3.0

bug: <0 vuln found and keep in progress forever> #1446

Open · lintianyuan666 opened 2 months ago

lintianyuan666 commented 2 months ago

Is there an existing issue for this?

Current Behavior

I scanned many targets but no vulnerabilities were found, while ZAP finds many. Also, the scan has stayed in progress for 2 days, and it is only 1 target.

Expected Behavior

I would expect at least 1 vulnerability to be found.

Steps To Reproduce

1. Version is 2.2.0
2. Target is perceptyx.com

Environment

- reNgine: 2.2.0
- OS: Ubuntu 22
- Python: 3.10
- Docker Engine: 27.2.1
- Docker Compose: none.
- Browser: chrome 128.0.6613.138

Anything else?

No response

github-actions[bot] commented 2 months ago

Hey @lintianyuan666! 👋 Thanks for flagging this bug! 🐛🔍

You're our superhero bug hunter! 🦸‍♂️🦸‍♀️ Before we suit up to squash this bug, could you please:

📚 Double-check our documentation: https://rengine.wiki
🕵️ Make sure it's not a known issue
📝 Provide all the juicy details about this sneaky bug

Once again - thanks for your vigilance! 🛠️🚀

yogeshojha commented 2 months ago

@lintianyuan666 does the recon find any http URLs at least?

lintianyuan666 commented 2 months ago

> @lintianyuan666 does the recon find any http URLs at least?

Nothing found.

[screenshot: scan results, 2024-09-23]

I restarted the scan an hour ago, but I have been scanning this domain for 3 days and it still shows the result in this picture.

yogeshojha commented 2 months ago

Strange. I checked it just with subfinder, and the target looks fine. I am now checking in reNgine. Do you have a proxy setup or VPN? What does your yaml config look like? I would like to see it.
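For reference, a quick standalone check along these lines can confirm the target resolves outside reNgine (a minimal sketch, assuming the ProjectDiscovery subfinder and httpx binaries are installed and on PATH):

```sh
# Passively enumerate subdomains, then probe which ones answer over HTTP(S).
subfinder -d perceptyx.com -silent | httpx -silent -status-code
```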

lintianyuan666 commented 2 months ago

> Strange. I checked it just with subfinder, and the target looks fine. I am now checking in reNgine. Do you have a proxy setup or VPN? What does your yaml config look like? I would like to see it.

Thanks for the attention. I didn't use a proxy or VPN. Here is my yaml config (`root@adfcc:/soft/rengine# cat default_yaml_config.yaml`):

```yaml
# Global vars for all tools
#
# custom_headers: ['Foo: bar', 'User-Agent: Anything']  # FFUF, Nuclei, Dalfox, CRL Fuzz, HTTP Crawl, Fetch URL, etc
# enable_http_crawl: true     # All tools
# timeout: 10                 # Subdomain discovery, Screenshot, Port scan, FFUF, Nuclei
# threads: 30                 # All tools
# rate_limit: 150             # Port scan, FFUF, Nuclei
# intensity: 'normal'         # Screenshot (grab only the root endpoints of each subdomain), Nuclei (reduce number of endpoints to scan), OSINT (not implemented yet)
# retries: 1                  # Nuclei

subdomain_discovery: {
  'uses_tools': ['subfinder', 'ctfr', 'sublist3r', 'tlsx', 'oneforall', 'netlas'],  # amass-passive, amass-active, All
  'enable_http_crawl': true,
  'threads': 30,
  'timeout': 5,
  # 'use_subfinder_config': false,
  # 'use_amass_config': false,
  # 'amass_wordlist': 'deepmagic.com-prefixes-top50000'
}
http_crawl: {
  # 'threads': 30,
  # 'follow_redirect': true
}
port_scan: {
  'enable_http_crawl': true,
  'timeout': 5,
  # 'exclude_ports': [],
  # 'exclude_subdomains': [],
  'ports': ['top-100'],
  'rate_limit': 150,
  'threads': 30,
  'passive': false,
  # 'use_naabu_config': false,
  # 'enable_nmap': true,
  # 'nmap_cmd': '',
  # 'nmap_script': '',
  # 'nmap_script_args': ''
}
osint: {
  'discover': ['emails', 'metainfo', 'employees'],
  'dorks': ['login_pages', 'admin_panels', 'dashboard_pages', 'stackoverflow', 'social_media', 'project_management', 'code_sharing', 'config_files', 'jenkins', 'wordpress_files', 'php_error', 'exposed_documents', 'db_files', 'git_exposed'],
  # 'custom_dorks': [],
  'intensity': 'normal',
  'documents_limit': 50
}
dir_file_fuzz: {
  'auto_calibration': true,
  'enable_http_crawl': true,
  'rate_limit': 150,
  'extensions': ['html', 'php', 'git', 'yaml', 'conf', 'cnf', 'config', 'gz', 'env', 'log', 'db', 'mysql', 'bak', 'asp', 'aspx', 'txt', 'conf', 'sql', 'json', 'yml', 'pdf'],
  'follow_redirect': false,
  'max_time': 0,
  'match_http_status': [200, 204],
  'recursive_level': 2,
  'stop_on_error': false,
  'timeout': 5,
  'threads': 30,
  'wordlist_name': 'dicc'
}
fetch_url: {
  'uses_tools': ['gospider', 'hakrawler', 'waybackurls', 'katana', 'gau'],
  'remove_duplicate_endpoints': true,
  'duplicate_fields': ['content_length', 'page_title'],
  'enable_http_crawl': true,
  'gf_patterns': ['debug_logic', 'idor', 'interestingEXT', 'interestingparams', 'interestingsubs', 'lfi', 'rce', 'redirect', 'sqli', 'ssrf', 'ssti', 'xss'],
  'ignore_file_extensions': ['png', 'jpg', 'jpeg', 'gif', 'mp4', 'mpeg', 'mp3'],
  'threads': 30,
  # 'exclude_subdomains': false
}
vulnerability_scan: {
  'run_nuclei': true,
  'run_dalfox': false,
  'run_crlfuzz': false,
  'run_s3scanner': false,
  'enable_http_crawl': true,
  'concurrency': 50,
  'intensity': 'normal',
  'rate_limit': 150,
  'retries': 1,
  'timeout': 5,
  'fetch_gpt_report': true,
  'nuclei': {
    'use_nuclei_config': false,
    'severities': ['unknown', 'info', 'low', 'medium', 'high', 'critical'],
    # 'tags': [],                 # Nuclei tags (https://github.com/projectdiscovery/nuclei-templates)
    # 'templates': [],            # Nuclei templates (https://github.com/projectdiscovery/nuclei-templates)
    # 'custom_templates': []      # Nuclei custom templates uploaded in reNgine
  }
}
waf_detection: {
  'enable_http_crawl': true
}
screenshot: {
  'enable_http_crawl': true,
  'intensity': 'normal',
  'timeout': 10,
  'threads': 40
}
```
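One quick thing worth ruling out is a YAML syntax error in this file, since a config that fails to parse could explain a scan that never produces results. A minimal parse check, assuming Python 3 and PyYAML are available on the host:

```sh
# Parse the engine config with PyYAML; any syntax error raises immediately.
python3 -c "import yaml, sys; yaml.safe_load(open(sys.argv[1])); print('YAML parses OK')" default_yaml_config.yaml
```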

This is the yaml config shown when I select Full Scan while initializing the scan on the web page:

```yaml
subdomain_discovery: {
  'uses_tools': ['subfinder', 'chaos', 'ctfr', 'sublist3r', 'tlsx', 'oneforall', 'netlas'],
  'enable_http_crawl': true,
  'threads': 30,
  'timeout': 5,
}
http_crawl: {}
port_scan: {
  'enable_http_crawl': true,
  'timeout': 5,
  # 'exclude_ports': [],
  # 'exclude_subdomains': [],
  'ports': ['top-100'],
  'rate_limit': 150,
  'threads': 30,
  'passive': false,
  # 'use_naabu_config': false,
  # 'enable_nmap': true,
  # 'nmap_cmd': '',
  # 'nmap_script': '',
  # 'nmap_script_args': ''
}
osint: {
  'discover': ['emails', 'metainfo', 'employees'],
  'dorks': ['login_pages', 'admin_panels', 'dashboard_pages', 'stackoverflow', 'social_media', 'project_management', 'code_sharing', 'config_files', 'jenkins', 'wordpress_files', 'php_error', 'exposed_documents', 'db_files', 'git_exposed'],
  'intensity': 'normal',
  'documents_limit': 50
}
dir_file_fuzz: {
  'auto_calibration': true,
  'enable_http_crawl': true,
  'rate_limit': 150,
  'extensions': ['html', 'php', 'git', 'yaml', 'conf', 'cnf', 'config', 'gz', 'env', 'log', 'db', 'mysql', 'bak', 'asp', 'aspx', 'txt', 'conf', 'sql', 'json', 'yml', 'pdf'],
  'follow_redirect': false,
  'max_time': 0,
  'match_http_status': [200, 204],
  'recursive_level': 2,
  'stop_on_error': false,
  'timeout': 5,
  'threads': 30,
  'wordlist_name': 'dicc'
}
fetch_url: {
  'uses_tools': ['gospider', 'hakrawler', 'waybackurls', 'katana', 'gau'],
  'remove_duplicate_endpoints': true,
  'duplicate_fields': ['content_length', 'page_title'],
  'enable_http_crawl': true,
  'gf_patterns': ['debug_logic', 'idor', 'interestingEXT', 'interestingparams', 'interestingsubs', 'lfi', 'rce', 'redirect', 'sqli', 'ssrf', 'ssti', 'xss'],
  'ignore_file_extensions': ['png', 'jpg', 'jpeg', 'gif', 'mp4', 'mpeg', 'mp3'],
  'threads': 30
}
vulnerability_scan: {
  'run_nuclei': true,
  'run_dalfox': true,
  'run_crlfuzz': true,
  'enable_http_crawl': true,
  'concurrency': 50,
  'intensity': 'normal',
  'rate_limit': 150,
  'retries': 1,
  'timeout': 5,
  'fetch_gpt_report': false,
  'nuclei': {
    'use_nuclei_config': false,
    'severities': ['unknown', 'info', 'low', 'medium', 'high', 'critical']
  }
}
waf_detection: {

}
screenshot: {
  'enable_http_crawl': true,
  'intensity': 'normal',
  'timeout': 10,
  'threads': 40
}

custom_headers: ["Cookie: Test"]
```

ncharron commented 1 month ago

Can you post your celery-entrypoint.sh file? Only lines roughly 160 to 205.
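If it helps, a one-liner to grab exactly that range (assuming the reNgine checkout lives at /soft/rengine, as shown earlier in the thread):

```sh
# Locate celery-entrypoint.sh in the checkout and print lines 160-205.
find /soft/rengine -name 'celery-entrypoint.sh' -exec sed -n '160,205p' {} \;
```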

lintianyuan666 commented 1 month ago

Maybe it's because my full scan was misconfigured. Vulnerabilities are found when I use the recommended configuration.

rezytijo commented 3 weeks ago

I have the same issue. It happens on all my servers. I use an ARM VPS on Oracle and a Kali VM on my home PC. [screenshot]

I don't use any proxy or VPN. Any recommendations?

rezytijo commented 3 weeks ago

I found something here: [screenshot]

There is a file called urls_unfurled.txt. I didn't find this file when opening the docker volume, so I think this is the source of the problem. [screenshot]
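For anyone verifying the same thing, a way to check whether that file ever gets written into any volume (assuming root access on the Docker host and the default /var/lib/docker volume location):

```sh
# List volumes, then search the host-side volume store for the missing file.
docker volume ls
sudo find /var/lib/docker/volumes -name 'urls_unfurled.txt' 2>/dev/null
```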