smokingwheels closed this issue 6 years ago
@smokingwheels I agree with you. I'm waiting for this feature request to be implemented.
What makes me sad is that some users block legitimate domains and then blame the Pi-hole software for the problem. Currently, whitelist.txt
is the best way to avoid this, and I will keep updating it regularly whenever I find new domains.
The main limitation is that I can't analyse websites that are geographically restricted.
On whitelists: I have a QB64 program for each of my blocklists that removes domains when a user reports them, or ones I come across or find myself. I think this is better because I don't end up with over 250 entries on the whitelist in the Pi-hole itself.
You can open your file, read each line, and test the end of the line against checks like these, skipping the output if the domain is whitelisted:

```basic
IF RIGHT$(a$, 21) = "discourse.pi-hole.net" THEN GOTO whitelist
IF RIGHT$(a$, 11) = "pi-hole.net" THEN GOTO whitelist
```

QB64 is at https://www.qb64.org/ and there is a forum for QB64 at http://www.qb64.net/
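For readers without QB64, the same suffix check can be sketched in Python. This is only an illustrative rendering of the idea described above, not the author's actual program; the file contents and suffix list are assumptions. Note that, like the QB64 `RIGHT$` test, a plain suffix match also catches subdomains of the whitelisted domain.

```python
# Sketch: drop blocklist entries whose domain ends with a whitelisted
# suffix, mirroring the QB64 RIGHT$(a$, n) = "..." checks above.
# The suffix list here is an example, not a complete whitelist.
WHITELIST_SUFFIXES = ("discourse.pi-hole.net", "pi-hole.net")

def filter_blocklist(lines):
    """Yield only lines whose domain is not covered by the whitelist."""
    for line in lines:
        domain = line.strip()
        # endswith() reproduces the RIGHT$ comparison: it matches the
        # domain itself and anything ending in that string.
        if any(domain.endswith(suffix) for suffix in WHITELIST_SUFFIXES):
            continue  # whitelisted -> skip (the "GOTO whitelist" branch)
        yield domain

kept = list(filter_blocklist(
    ["ads.example.com", "pi-hole.net", "discourse.pi-hole.net"]))
print(kept)  # -> ['ads.example.com']
```

One caveat of the raw suffix match: a domain like `notpi-hole.net` would also slip through; checking for an exact match or a leading dot (`"." + suffix`) would be stricter.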
For sites that are geographically restricted, we rely on other users for help or information.
Your feature request looks promising. However, I haven't used the YaCy search engine before; I will look into it...
As individuals, we perform tests without knowing which blocklists or whitelists a Pi-hole user has, or whether they want to make them public.
I test my lists locally first with whatever I have been able to find, and I run a YaCy search engine to crawl a site to a depth of 6.
I once had a Google blackout; see https://discourse.pi-hole.net/t/do-we-need-a-developer-blocklist-for-individuals-who-maintain-lists/5381 The ripple effect spread like a bolt of lightning. I whitelisted the domain and published the fix back to GitHub within a record 11 minutes of learning about it.
I used most of @Wally3k's lists for my first list attempt. I also maintain a second list.