Open lintianyuan666 opened 2 months ago
Hey @lintianyuan666! 👋 Thanks for flagging this bug! 🐛🔍
You're our superhero bug hunter! 🦸♂️🦸♀️ Before we suit up to squash this bug, could you please:
📚 Double-check our documentation: https://rengine.wiki
🕵️ Make sure it's not a known issue
📝 Provide all the juicy details about this sneaky bug
Once again - thanks for your vigilance! 🛠️🚀
@lintianyuan666 does the recon find any http URLs at least?
nothing found.
I restarted the scan an hour ago, but I have been scanning this domain for 3 days and it still shows the result in this picture.
Strange, I checked it just with subfinder, and the target looks fine. I am now checking in reNgine. Do you have a proxy setup or VPN? What does your yaml config look like? I would like to see it.
Thanks for the attention. I didn't use a proxy or VPN. Here is my yaml config:

root@adfcc:/soft/rengine# cat default_yaml_config.yaml
subdomain_discovery: {
  'uses_tools': ['subfinder', 'ctfr', 'sublist3r', 'tlsx', 'oneforall', 'netlas'], # amass-passive, amass-active, All
  'enable_http_crawl': true,
  'threads': 30,
  'timeout': 5,
}
http_crawl: {}
port_scan: {
  'enable_http_crawl': true,
  'timeout': 5,
  'ports': ['top-100'],
  'rate_limit': 150,
  'threads': 30,
  'passive': false,
}
osint: {
  'discover': ['emails', 'metainfo', 'employees'],
  'dorks': ['login_pages', 'admin_panels', 'dashboard_pages', 'stackoverflow', 'social_media', 'project_management', 'code_sharing', 'config_files', 'jenkins', 'wordpress_files', 'php_error', 'exposed_documents', 'db_files', 'git_exposed'],
  'intensity': 'normal',
  'documents_limit': 50
}
dir_file_fuzz: {
  'auto_calibration': true,
  'enable_http_crawl': true,
  'rate_limit': 150,
  'extensions': ['html', 'php', 'git', 'yaml', 'conf', 'cnf', 'config', 'gz', 'env', 'log', 'db', 'mysql', 'bak', 'asp', 'aspx', 'txt', 'conf', 'sql', 'json', 'yml', 'pdf'],
  'follow_redirect': false,
  'max_time': 0,
  'match_http_status': [200, 204],
  'recursive_level': 2,
  'stop_on_error': false,
  'timeout': 5,
  'threads': 30,
  'wordlist_name': 'dicc'
}
fetch_url: {
  'uses_tools': ['gospider', 'hakrawler', 'waybackurls', 'katana', 'gau'],
  'remove_duplicate_endpoints': true,
  'duplicate_fields': ['content_length', 'page_title'],
  'enable_http_crawl': true,
  'gf_patterns': ['debug_logic', 'idor', 'interestingEXT', 'interestingparams', 'interestingsubs', 'lfi', 'rce', 'redirect', 'sqli', 'ssrf', 'ssti', 'xss'],
  'ignore_file_extensions': ['png', 'jpg', 'jpeg', 'gif', 'mp4', 'mpeg', 'mp3'],
  'threads': 30,
}
vulnerability_scan: {
  'run_nuclei': true,
  'run_dalfox': false,
  'run_crlfuzz': false,
  'run_s3scanner': false,
  'enable_http_crawl': true,
  'concurrency': 50,
  'intensity': 'normal',
  'rate_limit': 150,
  'retries': 1,
  'timeout': 5,
  'fetch_gpt_report': true,
  'nuclei': {
    'use_nuclei_config': false,
    'severities': ['unknown', 'info', 'low', 'medium', 'high', 'critical'],
    # 'templates': [],        # Nuclei templates (https://github.com/projectdiscovery/nuclei-templates)
    # 'custom_templates': []  # Nuclei custom templates uploaded in reNgine
  }
}
waf_detection: {
  'enable_http_crawl': true
}
screenshot: {
  'enable_http_crawl': true,
  'intensity': 'normal',
  'timeout': 10,
  'threads': 40
}
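As a quick sanity check, the braces-style config above can be fed through a YAML parser. This is a minimal sketch, assuming PyYAML is installed locally; only a representative fragment of the posted config is reproduced here, not the full file.

```python
# Sanity-check a fragment of the reNgine engine config (values copied
# from the config posted above). An unparseable config would raise
# yaml.YAMLError here instead of loading cleanly.
import yaml

fragment = """
subdomain_discovery: {
  'uses_tools': ['subfinder', 'ctfr', 'sublist3r', 'tlsx', 'oneforall', 'netlas'],
  'enable_http_crawl': true,
  'threads': 30,
  'timeout': 5
}
vulnerability_scan: {
  'run_nuclei': true,
  'run_dalfox': false
}
"""

config = yaml.safe_load(fragment)

# The quoted keys parse to plain strings; true/false parse to booleans.
print(sorted(config))  # → ['subdomain_discovery', 'vulnerability_scan']
```

If this loads without error but the scan still reports nothing, the problem is more likely in the scan pipeline than in the YAML syntax.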
This is my yaml config when I select Full Scan while initializing the scan on the web page:
subdomain_discovery: {
  'uses_tools': ['subfinder', 'chaos', 'ctfr', 'sublist3r', 'tlsx', 'oneforall', 'netlas'],
  'enable_http_crawl': true,
  'threads': 30,
  'timeout': 5,
}
http_crawl: {}
port_scan: {
  'enable_http_crawl': true,
  'timeout': 5,
  'ports': ['top-100'],
  'rate_limit': 150,
  'threads': 30,
  'passive': false,
}
osint: {
  'discover': ['emails', 'metainfo', 'employees'],
  'dorks': ['login_pages', 'admin_panels', 'dashboard_pages', 'stackoverflow', 'social_media', 'project_management', 'code_sharing', 'config_files', 'jenkins', 'wordpress_files', 'php_error', 'exposed_documents', 'db_files', 'git_exposed'],
  'intensity': 'normal',
  'documents_limit': 50
}
dir_file_fuzz: {
  'auto_calibration': true,
  'enable_http_crawl': true,
  'rate_limit': 150,
  'extensions': ['html', 'php', 'git', 'yaml', 'conf', 'cnf', 'config', 'gz', 'env', 'log', 'db', 'mysql', 'bak', 'asp', 'aspx', 'txt', 'conf', 'sql', 'json', 'yml', 'pdf'],
  'follow_redirect': false,
  'max_time': 0,
  'match_http_status': [200, 204],
  'recursive_level': 2,
  'stop_on_error': false,
  'timeout': 5,
  'threads': 30,
  'wordlist_name': 'dicc'
}
fetch_url: {
  'uses_tools': ['gospider', 'hakrawler', 'waybackurls', 'katana', 'gau'],
  'remove_duplicate_endpoints': true,
  'duplicate_fields': ['content_length', 'page_title'],
  'enable_http_crawl': true,
  'gf_patterns': ['debug_logic', 'idor', 'interestingEXT', 'interestingparams', 'interestingsubs', 'lfi', 'rce', 'redirect', 'sqli', 'ssrf', 'ssti', 'xss'],
  'ignore_file_extensions': ['png', 'jpg', 'jpeg', 'gif', 'mp4', 'mpeg', 'mp3'],
  'threads': 30
}
vulnerability_scan: {
  'run_nuclei': true,
  'run_dalfox': true,
  'run_crlfuzz': true,
  'enable_http_crawl': true,
  'concurrency': 50,
  'intensity': 'normal',
  'rate_limit': 150,
  'retries': 1,
  'timeout': 5,
  'fetch_gpt_report': false,
  'nuclei': {
    'use_nuclei_config': false,
    'severities': ['unknown', 'info', 'low', 'medium', 'high', 'critical']
  }
}
waf_detection: {}
screenshot: {
  'enable_http_crawl': true,
  'intensity': 'normal',
  'timeout': 10,
  'threads': 40
}
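For reference, the two configs posted in this thread differ mainly in their vulnerability_scan toggles. A small stdlib-only sketch that diffs them (values copied from the configs above; key names match the posted YAML):

```python
# Diff the vulnerability_scan toggles between the default config and the
# Full Scan config posted in this thread.
default_vuln = {
    'run_nuclei': True, 'run_dalfox': False, 'run_crlfuzz': False,
    'run_s3scanner': False, 'fetch_gpt_report': True,
}
full_scan_vuln = {
    'run_nuclei': True, 'run_dalfox': True, 'run_crlfuzz': True,
    'fetch_gpt_report': False,
}

# Keys whose values differ, or that exist in only one of the two configs.
diff = {
    k: (default_vuln.get(k), full_scan_vuln.get(k))
    for k in sorted(set(default_vuln) | set(full_scan_vuln))
    if default_vuln.get(k) != full_scan_vuln.get(k)
}
print(diff)
```

Both configs keep run_nuclei enabled, so either one should produce nuclei findings if the pipeline upstream (subdomains and HTTP URLs) is feeding it endpoints.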
Can you post your celery-entrypoint.sh file? Only lines roughly 160 to 205.
Maybe it's because my full scan is misconfigured; vulnerabilities can be found when I scan with the recommended configuration.
I have the same issue. It happens on all my servers; I use an ARM VPS on Oracle and my Kali VM on my home PC.
I don't use any proxy or VPN. Any recommendations?
I found something here.
There is a file called urls_unfurled.txt, but I didn't find it when opening the Docker volume. I think this is the source of the problem.
Is there an existing issue for this?
Current Behavior
I scanned many targets but no vulnerabilities were found, while ZAP finds many. The scan also stays in progress for 2 days, and that is with only 1 target.
Expected Behavior
I would expect at least 1 vulnerability to be found.
Steps To Reproduce
1. Version is 2.2.0
2. Target is perceptyx.com
Environment
Anything else?
No response