pkel closed this issue 6 years ago.
I have had the same issue with the Chrome extension site, which I likewise fixed through urlcheck. I don't remember which domains were specifically at fault, but these are all the Google domains I whitelisted:
`ssl.gstatic.com`, `plus.google.com`, `google.com`, `commondatastorage.googleapis.com`, `play.google.com`, `chrome.google.com`, `googleapis.l.google.com`, `accounts.google.com`, `docs.google.com`
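For anyone finding this later: my understanding is that these entries go one per line into hostsblock's whitelist file. The path below is an assumption based on the default configuration, so check the `whitelist=` setting in your `hostsblock.conf`:

```shell
# /etc/hostsblock/white.list (path assumed from the default config)
# Domains listed here are stripped from the generated hosts.block,
# one entry per line.
ssl.gstatic.com
plus.google.com
chrome.google.com
accounts.google.com
```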
Getting to the broader problem: do you have `recycle_old=1` in `/etc/hostsblock/hostsblock.conf`? That feature recycles your previously generated blocklist, retaining old entries. It can be turned off by setting `recycle_old=0`.
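For reference, the relevant line looks like this (an excerpt sketch; the rest of the file is omitted):

```shell
# /etc/hostsblock/hostsblock.conf (excerpt)
# With recycle_old=1, entries from the previous hosts.block are merged
# back in on every run, so old blocks persist even after a list is
# removed. Set it to 0 to rebuild purely from the configured blocklists.
recycle_old=0
```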
Provided that you have done the previous step, hostsblock should only extract entries from the lists in the `blocklists=()` array in `/etc/hostsblock/hostsblock.conf`. Commenting a list out prevents it from being processed.
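A sketch of that array, using the lists mentioned in this thread (the exact file URLs may differ from the project pages cited; treat them as illustrative):

```shell
# /etc/hostsblock/hostsblock.conf (excerpt)
blocklists=(
    'http://winhelp2002.mvps.org/hosts.txt'   # list from winhelp2002.mvps.org/hosts.htm
#   'http://hostsfile.mine.nu/Hosts.zip'      # commented out: will not be downloaded or processed
)
```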
From what I've seen thus far, I don't believe anyone has done package-style signing for external blocklists yet, though some publish checksums (e.g. http://winhelp2002.mvps.org/hosts.htm), which without signing only help detect download corruption. I doubt that adding that functionality would be particularly difficult, although I won't have free time to work on it until September at the earliest.
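A minimal sketch of what checksum checking buys, and its limit: it confirms the download matches what the publisher posted, but without a signature it cannot catch a compromised publisher. Filenames and the hash in the usage line are placeholders:

```shell
#!/bin/sh
# Sketch: reject a downloaded blocklist whose checksum does not match
# the value the publisher advertises. This catches corrupted or
# truncated downloads only, not a tampered source.
verify_checksum() {
    file=$1
    expected=$2
    actual=$(md5sum "$file" | cut -d' ' -f1)
    [ "$actual" = "$expected" ]
}

# usage (hash placeholder):
#   verify_checksum hosts.zip "<published md5>" || exit 1
```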
I just encountered a problem with http://hostsfile.mine.nu/Hosts.zip. It lists `clients2.google.com`, which is used by Google to serve extensions for Chromium. Blocking that URL amounts to a DoS on extension updates. To fix this I removed the blocklist from `hostsblock.conf`.

`hostsblock-urlcheck` helped me figure out that `hosts.block` still contained the entry even after the cache was cleaned. This is due to hostsblock including entries from `/etc/hosts.old.gz`.

I conclude that hostsblock never deletes any entry from the hosts file. This may be a problem if a blocklist ever blocks a legitimate domain, as happened here.
DoS via malicious blocklists is hard to deal with in general. Are there any approaches to signed blocklists, as is done with package managers?
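As a sketch of what package-manager-style verification could look like here: the list publisher would ship a detached GPG signature next to the list, and hostsblock would refuse any download that fails verification. This is hypothetical; as noted above, no blocklist appears to publish signatures today, and the URLs in the usage comment are placeholders:

```shell
#!/bin/sh
# Sketch: verify a detached GPG signature over a downloaded blocklist
# before using it, analogous to how package managers check packages.
# Assumes the publisher's key has already been imported into the keyring.
verify_signed_list() {
    list=$1
    sig=$2
    gpg --verify "$sig" "$list" 2>/dev/null
}

# usage (URLs hypothetical):
#   curl -sO https://example.org/hosts.block
#   curl -sO https://example.org/hosts.block.sig
#   verify_signed_list hosts.block hosts.block.sig || rm -f hosts.block
```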