Closed borisneubert closed 7 years ago
@borisneubert what exact categories did you select? I added adult, which results in the following entries containing "spiegel":
grep spiegel /usr/local/etc/squid/acl/UT1_blacklist
.spiegel-sex.de
.spiegelzimmer.de
.liebesspiegelchen.com
.eulenspiegel.biz
Looks good to me.
Next question: which version are you using?
OPNsense is 17.1.2; the blacklist is ftp://ftp.ut-capitole.fr/pub/reseau/cache/squidguard_contrib/blacklists.tar.gz as of 26 Feb 2017.
I confirm your findings.
I found the reason: PEBKAC. It does not suffice to change the blacklist settings, press the Save Changes button in the popup window, and then press the Apply button. One also has to press the Download ACL button.
I repeatedly did this wrong before opening this issue on GitHub. Sorry for bothering you, and thank you very much for the superfast reply.
May I make a suggestion? Maybe I am not the only one to miss the correct workflow. Wouldn't it be simpler to use if the ACLs were created automatically?
Kind regards Boris
Hi Boris,
Thanks for your feedback; maybe we could extend the help text a bit to make the workflow clearer. The problem with automatic downloading is that it might download too often, which in some cases could result in your IP being blocked by the content provider.
Best regards,
Ad
Hi Ad,
Does this mean that the remote blacklist is downloaded every time the "Download ACL" button is pressed? Yes, that's what the documentation says, although I interpreted it as downloading the ACLs to Squid.
Wouldn't it be easier to separate the update/retrieval of the list from the construction of the ACLs? I previously used IPFire, which offers these as two separate workflows.
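Roughly, I mean a split like this. All paths below are illustrative assumptions, not OPNsense's actual layout, and the fetch is faked with a locally built archive so the sketch runs offline:

```shell
#!/bin/sh
set -e
CACHE=/tmp/bl_cache        # hypothetical local cache directory
ACL=/tmp/UT1_blacklist     # hypothetical ACL output file

# Step 1: refresh the cache. A real implementation would download the
# UT1 archive here (and could enforce a minimum refresh interval to
# avoid hammering the server); we fabricate a tiny archive instead.
mkdir -p "$CACHE/blacklists/adult"
printf '.spiegel-sex.de\n.spiegelzimmer.de\n' > "$CACHE/blacklists/adult/domains"
tar -czf "$CACHE/blacklists.tar.gz" -C "$CACHE" blacklists

# Step 2: rebuild the ACL purely from the cached archive, no network.
tar -xzf "$CACHE/blacklists.tar.gz" -C "$CACHE"
sort -u "$CACHE/blacklists/adult/domains" > "$ACL"
```

With that split, pressing "rebuild" after changing the category selection would never touch the remote server.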
For starters, it is probably easiest to extend the help text and the documentation a bit.
Kind regards, Boris
Hi Boris,
Yes, download refreshes the data; there's no local cache available. Given the limited time I have at the moment, I will add a notice when leaving the dialog. We're using the standard Squid ACL options, which work fine but are a bit picky about how the lists are delivered (sorting, de-duplication, etc.). Even if the download were cached, rebuilding the new ACL from the selection would in most cases still take almost the same amount of time.
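To illustrate what I mean by picky, here is a minimal sketch of the normalisation step. The sample entries are taken from the grep output above; the /tmp paths are just for illustration:

```shell
# Raw category data may contain duplicates and arbitrary ordering.
printf '.spiegelzimmer.de\n.spiegel-sex.de\n.spiegel-sex.de\n.eulenspiegel.biz\n' > /tmp/raw_domains

# Deliver the list sorted and de-duplicated before handing it to Squid.
sort -u /tmp/raw_domains > /tmp/UT1_blacklist
```

This is the kind of work that has to happen on every rebuild regardless of whether the download itself was cached.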
Best regards,
Ad
Followed the instructions in the documentation to set up web filtering with the blacklist of the Université Toulouse.
I noticed that .spiegel.de appears in the xx category, although it is in the press category in the downloaded blacklist. .eulenspiegel.de, which belongs to a different category, is listed just before it.
I assume that the conversion does not create the ACL correctly.
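One way to double-check which category a domain really lives in after extraction. The layout blacklists/&lt;category&gt;/domains and the sample files below are assumptions for illustration, not the actual archive contents:

```shell
# Fabricate a tiny extracted archive with the assumed layout.
mkdir -p blacklists/press blacklists/xx
printf 'spiegel.de\n' > blacklists/press/domains
printf 'eulenspiegel.de\n' > blacklists/xx/domains

# List only the category files containing the exact domain entry.
# -F fixed string, -x whole line, -l print matching file names.
grep -Flx 'spiegel.de' blacklists/*/domains   # prints blacklists/press/domains
```

If the source archive has the domain in press but the generated ACL puts it in xx, the bug would be in the conversion step rather than in the upstream data.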