Cimbali / CleanLinks

Converts obfuscated/nested links to genuine clean links.
https://addons.mozilla.org/en-GB/firefox/addon/clean-links-webext/
Mozilla Public License 2.0

Custom/default rules situation #98

Closed · birdie-github closed this issue 4 years ago

birdie-github commented 4 years ago

It's currently very hard to ascertain which rules come by default from you, the extension author; which default rules have been altered (i.e. have had query parameters added or removed); and which rules are completely custom, created by the user.

birdie-github commented 4 years ago

What if, instead of the user removing predefined query parameters, each one had a checkbox next to it, so that the user could simply enable or disable it while leaving the default list intact?

That way it would be easy to sync the user changes with your updates.

birdie-github commented 4 years ago

And it would be great to have the same for rules/domains, i.e. a checkbox next to each one so they can be enabled or disabled easily.

This way the user can keep up with your changes while simply whitelisting (unticking the checkbox for) their favorite website.
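
A minimal sketch of how such per-rule checkboxes could survive extension updates, assuming a hypothetical rule format with stable ids (none of these names come from CleanLinks itself): the user stores only their ticks, and the defaults are merged back in on every update.

```typescript
// Hypothetical rule shape; CleanLinks' real storage format differs.
interface DefaultRule {
  id: string;             // stable identifier, survives extension updates
  domain: string;         // which site the rule targets
  removeParams: string[]; // query parameters the rule strips
}

// The user stores only their overrides, keyed by rule id.
type UserOverrides = Map<string, boolean>; // id -> ticked?

// New default rules start enabled; rules the user unticked stay disabled,
// even after the extension ships an updated default list.
function effectiveRules(defaults: DefaultRule[], overrides: UserOverrides) {
  return defaults.map(rule => ({
    ...rule,
    enabled: overrides.get(rule.id) ?? true,
  }));
}
```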

Cimbali commented 4 years ago

So you’re suggesting being able to tick, for each rule and for each rule item (or for each “Cleaning Action”), whether the rule is applied or not, is that right? That’s doable, I think, but I’m not sure it will actually solve a problem, because rules are more of a whitelist than a blacklist.

Most (all?) of the breakage that occurs due to CleanLinks is not because of wrong rules; instead, it comes from missing rules.

We automatically detect embedded URLs, which are used either

  1. when websites report your current URL, or
  2. when they bring you to an intermediate page to track you and then redirect you to their destination.

These requests are then respectively dropped (we could also consider removing the query parameter containing the current URL) and redirected to the embedded URL.
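
As a rough illustration of these two behaviours, here is a sketch with hypothetical helper names, not CleanLinks' actual code:

```typescript
// Find a query parameter whose value parses as an absolute http(s) URL.
function extractEmbeddedUrl(link: URL): URL | null {
  for (const value of link.searchParams.values()) {
    try {
      const candidate = new URL(decodeURIComponent(value));
      if (candidate.protocol === 'http:' || candidate.protocol === 'https:') {
        return candidate;
      }
    } catch {
      // Value is not a URL; keep scanning.
    }
  }
  return null;
}

// Decide what to do with a link that embeds another URL.
function handleLink(link: URL, currentPage: URL): 'drop' | 'allow' | URL {
  const embedded = extractEmbeddedUrl(link);
  if (embedded === null) return 'allow';
  if (embedded.href === currentPage.href) return 'drop'; // case 1: reporting the current URL
  return embedded; // case 2: skip the intermediate page, go to the destination
}
```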

The rules are there to remove further tracking parameters (e.g. utm_*) and to whitelist the legitimate uses of embedded URLs (e.g. Disqus needs to fetch the comments for the current page, so it makes sense to use the URL).

So when a rule for a legitimate usage is missing, the website may break until it is whitelisted.
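
To make the whitelist idea concrete, a hedged sketch with an illustrative rule format (again, not CleanLinks' real rules): generic tracking parameters are stripped everywhere, while a dedicated entry marks Disqus's embedded URL as legitimate. When such an entry is missing, the embedded URL gets redirected and the site breaks.

```typescript
// Illustrative whitelist-style rules; names and shapes are hypothetical.
interface Rule {
  domainPattern: RegExp;  // which hosts the rule applies to
  removeParams: string[]; // tracking parameters to strip
  allowEmbedded: boolean; // embedded URLs are a legitimate use here
}

const rules: Rule[] = [
  // Strip common utm_* tracking parameters on every site.
  { domainPattern: /./, removeParams: ['utm_source', 'utm_medium', 'utm_campaign'], allowEmbedded: false },
  // Disqus legitimately embeds the current page's URL to fetch its comments.
  { domainPattern: /(^|\.)disqus\.com$/, removeParams: [], allowEmbedded: true },
];

function applyRules(link: URL): { cleaned: URL; embeddedAllowed: boolean } {
  const cleaned = new URL(link.href);
  let embeddedAllowed = false;
  for (const rule of rules) {
    if (!rule.domainPattern.test(link.hostname)) continue;
    for (const param of rule.removeParams) cleaned.searchParams.delete(param);
    embeddedAllowed = embeddedAllowed || rule.allowEmbedded;
  }
  return { cleaned, embeddedAllowed };
}
```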

birdie-github commented 4 years ago

> So you’re suggesting being able to tick, for each rule and for each rule item (or for each “Cleaning Action”), whether the rule is applied or not, is that right?

Exactly.

> That’s doable, I think, but I’m not sure it will actually solve a problem, because rules are more of a whitelist than a blacklist.

If I'm not wrong, most websites currently break because of your rules/rule items, which could simply be unchecked; then everything would sail smoothly.

Maybe I'm completely wrong, however.

birdie-github commented 4 years ago

I guess you're right and this request makes no sense.