Closed ScriptTiger closed 6 years ago
flushdns cache via CMD
The script already does that automatically. The problem some people were having with Steven Black's files is that the client cache handles large files slowly when parsing and re-parsing them, causing slow DNS resolution in general, such as a lot of lag while web surfing. So I was just curious whether any users here were having the same problem and whether the "compression" option should be used to make the files smaller. Only a small number of users over there seemed to have run into that problem, though.
I ran into this issue, and "compression" to 9 domains per line solved it.
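For anyone curious what "compression" actually means here, this is a minimal Python sketch of the idea (an illustration only, not the actual Steven Black updateHostsFile.py implementation; the domain names are placeholders): it collapses one-domain-per-line `0.0.0.0` entries into lines of up to 9 domains each, so the Windows DNS client has far fewer lines to parse.

```python
# Illustrative sketch only, not the actual updateHostsFile.py implementation:
# collapse one-domain-per-line hosts entries into lines of up to
# `domains_per_line` domains each, so the DNS client parses fewer lines.

def compress_hosts(lines, domains_per_line=9, ip="0.0.0.0"):
    # Collect every blocked domain, skipping comments and blank lines
    domains = []
    for line in lines:
        parts = line.split()
        if len(parts) >= 2 and parts[0] == ip:
            domains.extend(parts[1:])
    # Re-emit the domains in groups of up to `domains_per_line` per line
    return [
        ip + " " + " ".join(domains[i:i + domains_per_line])
        for i in range(0, len(domains), domains_per_line)
    ]

entries = [
    "0.0.0.0 ads.example.com",
    "0.0.0.0 tracker.example.net",
    "# a comment line",
    "0.0.0.0 malware.example.org",
]
# All three placeholder domains fit on a single compressed line
print(compress_hosts(entries))
```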
@CalvinHub, do you use the Python script option to get the compressed format? Have you tried the compressed files here?: https://scripttiger.github.io/alts/
Troubleshooting this issue has been slow because not many people experience this problem, but if you have time can you test out the compressed format at that link and let me know if it also works for you? If it does, I can update the AutoUpdate to include it as an option so it works for people that are having that issue.
This happens to me sometimes, or rather most of the time, only after booting up the computer and then directly opening a web browser; it takes 30 seconds or more to load/resolve the browser homepage. Can anyone suggest an additional layer of software (mainly for Windows) that works with the name-resolution mechanism and blocks/redirects hostname requests to 0.0.0.0 based on pattern matching? The hosts files on Windows and Unix don't support regular expressions/wildcards, and this seems like a sign that blacklisting ads and unwanted hosts at the hosts-file level is getting out of hand, or at least suffering because the number of hosts specified in the files is getting too big.
@dakd2, can you also test the "compressed" format and see if it solves your problem?: https://scripttiger.github.io/alts/
@dakd2, I am not sure about full REGEX, but wildcards are pretty common in many DNS forwarders/servers. DualServer supports wildcards. If you use a DNS forwarder/server, you also don't need to worry about the limitations Windows might have with parsing the hosts file: you can keep your hosts file empty and just keep your blacklist in your DNS forwarder/server. You can also point all of the devices on your network to use it, including your phones and tablets, etc.
I think proxy servers are more common to have full REGEX matching/filtering. Some proxy servers can be intense on your machine, so if you just need it for simple REGEX blocking, get a non-caching proxy server. I think Privoxy supports REGEX and I have used it before, but I haven't used REGEX filtering with it. Again, you can also configure all your devices on your network to go through a proxy server to share the same blacklist.
If you are using Chrome, there are many extensions you can get with REGEX matching/filtering. With extensions, there are literally endless ways to accomplish this. You could use a simple find-and-replace extension to do REGEX matching and rewrite references to blocked websites. You could also get a developer extension to redirect hosts based on header information, which is common for developers testing vhosts. There are also extensions specifically for blocking. If you use DuckDuckGo, it blocks many things automatically and is effective against most ads, trackers, malware, etc., but it doesn't give you the ability to use a custom blacklist. Also, if you use a blacklist with a browser extension, it will only block things in your browser and not protect your entire system.
I just changed the executable in the Unified Hosts AutoUpdate scheduled task on Windows to the updateHostsWindows.bat script from the Steven Black hosts repo, and added the compress option (along with the extensions I use) where the .bat script calls the Python script. Now I have to wait a day or more to see how it works.
Thanks, @dakd2, for testing this out. Let us know what happens. If this does solve your problem, I'll update my current script and hopefully you can test it out for us to make sure it's working to solve that issue, as well.
@ScriptTiger I think these commands can help.
```
ipconfig /flushdns
ipconfig /all
ipconfig /release
ipconfig /renew
netsh winsock reset
netsh int ip reset reset.log
```
I found it in one of my old backups.
@ScriptTiger A different way.
VBS FILE:
```vb
' Relaunch this script elevated if it was started without arguments
If WScript.Arguments.Length = 0 Then
    Set objShell = CreateObject("Shell.Application")
    objShell.ShellExecute "wscript.exe", Chr(34) & WScript.ScriptFullName & Chr(34) & " Run", , "runas", 1
    Set objShell = Nothing
Else
    ' Flush the DNS cache, then restart the DNS Client service
    Set oShell = WScript.CreateObject("WScript.Shell")
    oShell.Run "cmd /k ipconfig /flushdns & start /wait net stop dnscache & start /wait net start dnscache & echo. & exit", 0, True
    Set oShell = Nothing
End If
```
I've just added an "mcompressed" format to the "additional blacklist support" page: https://scripttiger.github.io/alts/ This pre-generated version does contain compressed comment lines, but the final script that will run locally on your machine will have an option to remove those, so don't worry about that.
@dnmTX has suggested that, in his case, 5 domains per line works better than 9. @dakd2, @CalvinHub, maybe you can also test that out and see if you notice anything. Originally I was just going to add support for a fully compressed option, but if there really is a difference across varied cases, I'll support a compression level option from 2 to 9, the level being the maximum number of domains per line.
@ScriptTiger "netsh winsock reset" will require a restart. As for "netsh int ip reset reset.log", I wouldn't mess with that one at all; the last time I did, I remember it messed something up and I eventually had to reinstall the OS. About the DNS Client: this is what I'm using at the moment, https://support.microsoft.com/en-us/help/318803/how-to-disable-client-side-dns-caching-in-windows-xp-and-windows-serve, which is the middle ground between having it on (aka slow) and off (still slow, in my case). It's a good workaround for using large hosts files. @dakd2 @CalvinHub, if you don't mind, test the file with 5 domains per line; I'm curious also. Thanks.
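For reference, the workaround in that Microsoft article boils down to DWORD values under the Dnscache service's registry parameters. A .reg sketch along those lines (the values shown, 1 second positive TTL and 0 negative TTL, are the commonly cited near-disable settings; verify against the linked article and back up your registry before applying):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Dnscache\Parameters]
"MaxCacheTtl"=dword:00000001
"MaxNegativeCacheTtl"=dword:00000000
```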
Since this project uses Steven Black's hosts files, I'm not implementing anything other than the standard "ipconfig /flushdns" that's been there from the start. The point of using these particular hosts files is that they don't need to be ridiculously big, with massive numbers of dead hosts taking up resources that don't need to be there. They are actively curated and currently among the most efficient and streamlined lists out there.
@dnmTX, have you ever considered running an internal DNS service, like DualServer, dnsmasq, etc.? If your system is really that bad, I think using third-party software might be the answer. With an internal DNS server, the server handles the blacklist and not your OS, so the resources are hopefully managed better than whatever it is Microsoft is doing on your machine. A bonus would also be you can point everything else on your network to your machine as a DNS server so your phones, tablets, laptops, etc, can all share the same blacklist and keep them up to date together. That link I posted above has lists for several DNS server formats, as well: https://scripttiger.github.io/alts/
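As a concrete illustration of the wildcard point, dnsmasq's `address` directive blocks a domain and every subdomain under it with a single line, which is something a plain hosts file can't express. A minimal config sketch (the domain names are placeholders):

```
# /etc/dnsmasq.conf
# Each line sinkholes the domain and all of its subdomains
address=/ads.example.com/0.0.0.0
address=/tracker.example.net/0.0.0.0
```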
And there's also a dedicated DualServer page since that is what I officially support: https://scripttiger.github.io/dualserver/
Thanks, @ScriptTiger, I'll look into it, but let me say this: my system is not "that bad," I'm just trying to achieve the maximum possible performance (considering I have an antivirus, ad blocker, and anti-exploit running in the background, which also slow browsing speed a little bit, but still...), and it's Windows, don't forget that. There aren't too many Windows users here overall (otherwise you would've read many more complaints, trust me), but as you can see from the ones that are here, there is always something going on. I'll look into the links you provided and start testing, and will let you know about the results later on.
If you try DualServer, let me know how it goes. If it doesn't work for you, you may need to try an older version. I haven't checked the newest "stable" recently, but they have released some poor "stables" in the past, and I am actually running an older version. I retry it every few months, though. My trust in SourceForge has also been a bit shaky in recent years after those illegal project takeovers they did, injecting adware into people's stuff without permission. I may end up hosting a known good version of it myself just to make sure people aren't getting a bum deal. I really do enjoy the version of DualServer I am running, but I don't feel 100% comfortable endorsing all of their recent releases.
Hostman already rearranges hosts to 9 domains per line. I think I will give up fixing this thing and go back to Win 7.
Did you also try 5 per line?
As stated above, you can also try a DNS service solution, like dnsmasq, DualServer, etc. The blacklists for those are available here: https://scripttiger.github.io/alts/
I would say that, for me, this problem is somewhat hard to spot, since it seems to happen only a single time after cold booting my computers. I think the delay in resolving hostnames even resolves itself afterwards as some kind of background process, whether I browse the internet or not, as I avoid restarting or turning my computers on/off through the day as much as I can.
Also, the computer where I replaced the Unified Hosts AutoUpdate scheduled task script with updateHostsWindows.bat using the --compress option doesn't seem to experience this problem anymore. I have only tested this on one computer, though.
For some reason, I think I am able to reproduce this issue by modifying Google Chrome policy settings. I accidentally ran into this by turning off and then back on the policy that forces Google Chrome to always run in incognito mode. I still don't know why it causes this, but when I did that for the first time, I noticed that other browsers and the ping program would have that same huge delay resolving hostnames.
Are you referencing the registry policy "IncognitoModeAvailability"? If this works to simulate the problem, it would definitely help, since I could then just test my own lists rather than hoping.
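For anyone who wants to try reproducing it, that policy lives under Chrome's policy key in the registry. Toggling it to forced incognito would look roughly like this .reg fragment (to my understanding the values are 0 = incognito available, 1 = disabled, 2 = forced; double-check against Chrome's policy documentation before using):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Google\Chrome]
"IncognitoModeAvailability"=dword:00000002
```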
I have officially implemented compression levels into the script, so I am calling this one case closed. Please open new issues if the new functionality has problems.
Are any users currently experiencing the DNS client cache issue described in https://github.com/StevenBlack/hosts/issues/411? If so, can you manually try the new fix/workaround, https://github.com/StevenBlack/hosts/pull/459, and give your feedback here? I'd like to get some more use cases on it before implementing a similar process in the AutoUpdate script here.