0xERR0R / blocky

Fast and lightweight DNS proxy as ad-blocker for local network with many features
https://0xERR0R.github.io/blocky/
Apache License 2.0

Wildcard list support #1090

Closed · t-e-s-tweb closed this issue 11 months ago

t-e-s-tweb commented 1 year ago

Hey guys! Long time. Can someone let me know if blocky supports these kinds of blocklists? They are very well made and have comments for the type of site and its name, but blocky throws a lot of errors. Can you guys please test?

https://github.com/ShadowWhisperer/BlockLists/blob/master/Whitelists/Whitelist

kwitsch commented 1 year ago

As these appear to be simple domain lists, they should work.

Could you provide some errors for reference?

t-e-s-tweb commented 1 year ago

> As these appear to be simple domain lists, they should work.
>
> Could you provide some errors for reference?

I'll get back to you with logs today.

t-e-s-tweb commented 1 year ago

```
, trying to continue count=1542 source=https://raw.githubusercontent.com/hagezi/dns-blocklists/main/whitelist.txt [2023-08-06 21:10:55] WARN list_cache: parse error: line 4209: 2 errors occurred:
, trying to continue count=1543 source=https://raw.githubusercontent.com/hagezi/dns-blocklists/main/whitelist.txt [2023-08-06 21:10:55] WARN list_cache: parse error: line 4221: 2 errors occurred:
, trying to continue count=1546 source=https://raw.githubusercontent.com/hagezi/dns-blocklists/main/whitelist.txt [2023-08-06 21:10:55] WARN list_cache: parse error: line 4239: 2 errors occurred:
, trying to continue count=1551 source=https://raw.githubusercontent.com/hagezi/dns-blocklists/main/whitelist.txt [2023-08-06 21:10:57] WARN list_cache: parse error: line 2: 2 errors occurred:
```

These are some errors from two of the whitelists.
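From those warnings it looks like the list loader skips any line it cannot parse and keeps counting the entries it did accept ("trying to continue count=..."). Below is a minimal, hypothetical Go sketch of that kind of tolerant parser, just to illustrate the pattern; it is not blocky's actual list_cache code, and the validation regex and function names are illustrative assumptions.

```go
package main

import (
	"bufio"
	"fmt"
	"log"
	"regexp"
	"strings"
)

// domainRe is a deliberately simple validity check for plain domain and
// wildcard entries. Illustrative only; the real parser accepts more formats.
var domainRe = regexp.MustCompile(`^(\*\.)?([a-zA-Z0-9_-]+\.)+[a-zA-Z]{2,}$`)

// parseList reads a list line by line, skipping blanks and comments.
// Lines that fail validation are logged as warnings and skipped, mirroring
// the "parse error ... trying to continue" behaviour seen in the logs above.
func parseList(scanner *bufio.Scanner) (entries []string) {
	lineNo := 0
	for scanner.Scan() {
		lineNo++
		line := strings.TrimSpace(scanner.Text())
		if line == "" || strings.HasPrefix(line, "#") {
			continue // comments and empty lines are not errors
		}
		if !domainRe.MatchString(line) {
			log.Printf("WARN parse error: line %d: %q, trying to continue count=%d", lineNo, line, len(entries))
			continue
		}
		entries = append(entries, strings.ToLower(line))
	}
	return entries
}

func main() {
	sample := "# Title: example whitelist\nexample.com\nnot a domain !!\n*.cdn.example.net\n"
	entries := parseList(bufio.NewScanner(strings.NewReader(sample)))
	fmt.Println("accepted entries:", entries)
}
```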

github-actions[bot] commented 11 months ago

This issue is stale because it has been open 90 days with no activity. Remove stale label or comment or this will be closed in 5 days.

ThinkChaos commented 11 months ago

I think the best thing for wildcards would be to use a trie.
I got something working locally using dghubble/trie that uses 33 MB with the OISD big list. That seems reasonable compared to the regex approach (303 MB) and a plain domain list (16 MB).
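Roughly, the idea is to key the trie on reversed domain labels, so one wildcard node covers a whole subtree of subdomains and a lookup costs one map hop per label instead of evaluating every regex. Here is a minimal, self-contained sketch of that structure; it is hand-rolled for illustration rather than using dghubble/trie, it is not the code from the upcoming PR, and the exact wildcard semantics (whether `*.example.com` also covers the apex) are an assumption.

```go
package main

import (
	"fmt"
	"strings"
)

// node is one label in a trie keyed on reversed domain labels,
// e.g. "ads.example.com" is stored under com -> example -> ads.
type node struct {
	children map[string]*node
	wildcard bool // a "*.domain" entry ends at this node
	exact    bool // a plain domain entry ends at this node
}

func newNode() *node { return &node{children: map[string]*node{}} }

// labels splits a domain into labels and reverses them so the TLD comes first.
func labels(domain string) []string {
	parts := strings.Split(strings.ToLower(strings.TrimSuffix(domain, ".")), ".")
	for i, j := 0, len(parts)-1; i < j; i, j = i+1, j-1 {
		parts[i], parts[j] = parts[j], parts[i]
	}
	return parts
}

// Insert adds an entry; a leading "*." marks the whole subtree as matched.
func (n *node) Insert(entry string) {
	wildcard := strings.HasPrefix(entry, "*.")
	entry = strings.TrimPrefix(entry, "*.")
	cur := n
	for _, l := range labels(entry) {
		next, ok := cur.children[l]
		if !ok {
			next = newNode()
			cur.children[l] = next
		}
		cur = next
	}
	if wildcard {
		cur.wildcard = true
	} else {
		cur.exact = true
	}
}

// Match walks the query's labels from the TLD down; any wildcard node on the
// path matches, an exact entry matches only if every label is consumed.
func (n *node) Match(domain string) bool {
	cur := n
	for _, l := range labels(domain) {
		next, ok := cur.children[l]
		if !ok {
			return false
		}
		cur = next
		if cur.wildcard {
			return true
		}
	}
	return cur.exact
}

func main() {
	root := newNode()
	root.Insert("*.ads.example.com")
	root.Insert("tracker.example.net")

	fmt.Println(root.Match("x.ads.example.com"))       // true: covered by the wildcard
	fmt.Println(root.Match("ads.example.com"))         // true: this sketch lets the wildcard cover the apex too
	fmt.Println(root.Match("tracker.example.net"))     // true: exact entry
	fmt.Println(root.Match("sub.tracker.example.net")) // false: no wildcard was listed for it
}
```

Memory stays modest because shared suffixes (e.g. com, example) are stored only once, which is where most of the gap to per-entry regexes comes from.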

I'll try to get a PR open soon.