LukeSmithxyz / based.cooking

A simple culinary website.
The Unlicense

How about a Chromium/Firefox extension to remove bloatware from webpages? #285

Closed adeptflax closed 3 years ago

adeptflax commented 3 years ago

I know this is off topic, but I felt like I should post it here. I'd work on it if you all think it's a good idea, and I'll create a GitHub repo for it. What sites should be added first?

ghost commented 3 years ago

Your time is better spent:

1. Using Brave or another off-the-shelf ad blocker.
2. Simply not visiting these webpages, if possible.
3. Creating a better solution ex nihilo (many such cases).

This results in less friction and a better end result.

adeptflax commented 3 years ago

I think sites with articles shouldn't be too hard. There are methods of getting the article body programmatically that work on most sites. Most other types of sites would have to be completely rewritten, or would require a custom solution per site.
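One widely used method of this kind is Mozilla's Readability library, the engine behind Firefox's reader view; whether that's the specific method meant here is a guess. A minimal Node sketch, assuming `@mozilla/readability` and `jsdom` from npm and Node 18+ for the global `fetch`:

```ts
import { Readability } from "@mozilla/readability";
import { JSDOM } from "jsdom";

async function extractArticle(url: string) {
  // Fetch the raw page; the global fetch requires Node 18+.
  const html = await (await fetch(url)).text();
  // Readability operates on a DOM document; jsdom builds one server-side.
  const dom = new JSDOM(html, { url });
  // parse() returns { title, content, textContent, ... } or null on failure.
  return new Readability(dom.window.document).parse();
}

extractArticle("https://example.com/some-article")
  .then((article) => console.log(article?.title));
```

In an extension content script you could skip jsdom entirely and pass `document.cloneNode(true)` to `Readability` directly.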

taco-c commented 3 years ago

If you think it's a good idea, go for it. You don't need our (or anyone's) approval.

Also, a general tip: it's easier to win people over to your idea if you already have something made, so they can see it working.

MFG080xc0 commented 3 years ago

Most websites actually don't need JS to work. Install uMatrix and globally disable scripts, XHR requests, and cookies, but leave CSS, images, and such enabled. Most websites, as I've said, will render and work just fine, or they will render enough that you can make out the content; the ones that don't render at all are probably complete garbage anyway. Plus you can block all the Facebook and Google tracking with uMatrix, so there's also that.
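As a sketch, such a global setup would look roughly like this in uMatrix's "My rules" pane, where the rule format is `scope destination type action` (treat the exact lines as an illustration, not a verified config):

```
* * cookie block
* * script block
* * xhr block
* * css allow
* * image allow
```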

Also, configure (read: fix) your browser against fingerprinting, and disable useless modules and so-called features, for example WebAssembly or support for VR headsets [lmao].
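In Firefox this is done through about:config or a user.js file. A minimal sketch, assuming these pref names (they have existed in Firefox for years but can change between versions, so verify before relying on them):

```js
// user.js sketch: hardening prefs, names assumed from Firefox's about:config.
user_pref("privacy.resistFingerprinting", true); // standardize fingerprintable values
user_pref("javascript.options.wasm", false);     // disable WebAssembly
user_pref("dom.vr.enabled", false);              // disable WebVR support
```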

adeptflax commented 3 years ago

I thought about it some more. A tool that grabs an article and displays just the actual content shouldn't be too hard to make, I think. There could be a server that downloads the article at a given URL and returns just the useful information, with the images resized. The server would process the JavaScript on your computer's behalf, so even content hidden behind JavaScript would work. You could get something like 99% bandwidth savings.
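A minimal sketch of such a proxy, reusing the jsdom/Readability combination from above (image resizing and real JavaScript execution, e.g. via a headless browser, are left out, so this is an outline of the idea rather than the full design described):

```ts
// Sketch of an article-stripping proxy: GET /?url=<article-url>
// returns only the readable content of the page.
import http from "node:http";
import { Readability } from "@mozilla/readability";
import { JSDOM } from "jsdom";

http.createServer(async (req, res) => {
  const target = new URL(req.url ?? "/", "http://localhost")
    .searchParams.get("url");
  if (!target) {
    res.writeHead(400).end("missing ?url= parameter");
    return;
  }
  // Fetch and parse the remote page on the server, not the client.
  const html = await (await fetch(target)).text();
  const dom = new JSDOM(html, { url: target });
  const article = new Readability(dom.window.document).parse();
  res.writeHead(200, { "Content-Type": "text/html; charset=utf-8" });
  res.end(article?.content ?? "could not extract article");
}).listen(8080);
```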

hallzy commented 3 years ago

Isn't this basically the same thing as reader view? I think Firefox has one built in, and if Chromium-based browsers don't, I'm sure there are reader view extensions already.

adeptflax commented 3 years ago

I don't know whether or not the browser still downloads all the content first in reader view. I'm going to guess that it does.

sylGauthier commented 3 years ago

Use NoScript; it's the best solution because you can re-enable scripts for specific sources if they're really required.

Also, use a proper hosts generator to completely ban all bad domains. Allow me to shill my own solution that lets you carpet-ban domains of your choice with a simple Makefile and a based modular structure. For malware/adware it even pulls a bunch of fresh lists from different sources.
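The underlying mechanism is just the OS hosts file: resolving unwanted domains to a null address. A hedged sketch of the idea (the `lists/` layout and file names are illustrative assumptions, not the commenter's actual project):

```make
# Sketch: build a hosts file from modular blocklists.
# Each list file contains lines like:
#   0.0.0.0 ads.example.com
hosts: base.hosts $(wildcard lists/*.hosts)
	cat $^ > $@
	@echo "review ./hosts, then install it as /etc/hosts (as root)"
```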

MFG080xc0 commented 3 years ago

I'd advise against using NoScript due to its lack of flexibility when it comes to handling different domains, and because the very brief information it gives about page elements dooms someone into accidentally running JS; overall it is very brain-damaging and complex for the user long-term. uMatrix does an overall better job at handling domains thanks to its scope system, which lets you override and set parameters globally or per-site, plus it provides better visual feedback of what is going on with scripts and such.
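To illustrate the scope system (again a sketch, with hypothetical domains): the first column of a uMatrix rule is the scope it applies in, so a global block can be overridden for a single site:

```
* * script block
example.com cdn.example.com script allow
```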

ndren commented 3 years ago

Since uMatrix is archived, consider uBlock Origin's medium mode as a great alternative. (There's an official guide, plus an unofficial guide that worked fine as an introduction.)
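For context, medium mode boils down to two dynamic filtering rules in uBlock Origin's "My rules" pane, blocking third-party scripts and frames by default; this matches the documented setup, but double-check the official wiki:

```
* * 3p-frame block
* * 3p-script block
```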