mikhoul opened this issue 5 years ago
I see 3 options:
I'll explain in detail later.
I would guess Edge will closely follow Chromium, so that's probably not going to work. What about Firefox? Well, I'll explain later.
Option 1: Use the new API
If the new API is really weak, then I don't think Firefox can survive for long. If websites can more easily get around the weak APIs of Chrome, then website owners will have a motivation to push everyone to use Chrome. And soon Chromium-based browsers will probably become the only usable browsers.
If the new API is OK, then converting to the new API is probably going to be the easiest solution.
The currently proposed APIs are on the weak side. If the 30k limit is lifted and rules can be dynamically modified, then I would say it's OK.
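For concreteness, here is a rough sketch of what dynamically modifiable rules could look like under the proposal (the rule shape and the `updateDynamicRules` call mirror the draft `declarativeNetRequest` API; the hosts, URLs, and IDs are made up):

```javascript
// Sketch of declarativeNetRequest-style rules. The rule objects are
// plain data; only the final chrome.* call needs the extension runtime.
const rules = [
  {
    id: 1,
    priority: 1,
    action: { type: "block" },
    condition: {
      urlFilter: "||ads.example.com^", // hypothetical ad host
      resourceTypes: ["script", "image"],
    },
  },
  {
    id: 2,
    priority: 2,
    action: {
      type: "redirect",
      // Hypothetical neutered replacement script.
      redirect: { url: "https://example.com/blank.js" },
    },
    condition: { urlFilter: "||tracker.example.com/beacon.js" },
  },
];

// Inside an extension this would be applied with something like:
// chrome.declarativeNetRequest.updateDynamicRules({
//   addRules: rules,
//   removeRuleIds: [],
// });
```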
In the other thread, gorhill listed his concerns:

- the rules count limit
- dynamic filtering
- the `important` option
- the `csp` and `inline-script` options

The rules count limit is a legitimate concern (one of my two primary concerns mentioned above), as EasyList alone has more than 80k rules.
As for dynamic filtering, I never use it. To be honest I don't even know how it works. So as far as I'm concerned, that's not an issue.
When it comes to the `important` option, I think it's possible to implement it with more than one extension: each filter list becomes an extension, one extension enforces all `important` rules, and one master extension helps with whitelisting and management.
Note that filter maintainers will probably have to publish their own filter lists as extensions, then attach them to the public APIs of the coordinator. It's not going to be pretty, but it's theoretically possible.
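As a rough illustration of that coordinator idea (all extension IDs, message types, and names below are hypothetical; only the `chrome.runtime` cross-extension messaging APIs mentioned in the comments are real):

```javascript
// Hypothetical coordinator state: which filter-list extensions have
// registered themselves, and which hostnames the user whitelisted.
const registeredFilters = new Map(); // extensionId -> { name }
const whitelist = new Set();

// Plain routing logic; in a real master extension this would run inside
// chrome.runtime.onMessageExternal.addListener(...), and filter-list
// extensions would call chrome.runtime.sendMessage(coordinatorId, msg).
function handleExternalMessage(senderId, message) {
  switch (message.type) {
    case "register":
      registeredFilters.set(senderId, { name: message.name });
      return { ok: true };
    case "shouldFilter":
      // Filter extensions ask the coordinator before enforcing rules.
      return { filter: !whitelist.has(message.hostname) };
    default:
      return { ok: false, error: "unknown message type" };
  }
}

whitelist.add("trusted.example");
```

The point of the sketch is only that whitelisting can live in one place while enforcement is spread across many single-list extensions.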
For the `csp` option, I think there'll be some way left to modify headers. If not, it can still be implemented using content scripts (script snippets). The `inline-script` option can be (AFAIK, is) implemented using `csp`.
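To sketch how a `csp` filter could be enforced through header modification, here is a minimal helper using the `{ name, value }` header shape of `webRequest` response headers; the policy string is just an example of what an `$csp=` filter might carry:

```javascript
// Merge an extra Content-Security-Policy into a response's headers.
// When several policies are present, browsers enforce all of them,
// so appending with a comma only ever makes the page stricter.
function injectCsp(headers, policy) {
  const out = headers.map((h) => ({ ...h }));
  const existing = out.find(
    (h) => h.name.toLowerCase() === "content-security-policy"
  );
  if (existing) {
    existing.value += ", " + policy;
  } else {
    out.push({ name: "Content-Security-Policy", value: policy });
  }
  return out;
}

// Example: the effect of a `$csp=script-src 'none'` style filter,
// which is also how `inline-script` blocking can be expressed.
const result = injectCsp(
  [{ name: "Content-Type", value: "text/html" }],
  "script-src 'none'"
);
```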
For extended syntax, these are a few notable ones:

- the `badfilter` option
- the `important`, `csp`, and `inline-script` options (covered above)

For cosmetic filtering (normal and extended), pretty much everything is implemented in content scripts, which are not part of the manifest v3 proposal, so I would assume they are not going to change.
The new APIs will actually make script snippet injection better, as the proposal contains new APIs that allow extensions to dynamically modify content scripts.
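As a minimal illustration of how cosmetic filtering works in a content script (the selectors are hypothetical), most of it boils down to injecting a generated stylesheet:

```javascript
// Turn a list of cosmetic-filter selectors into one CSS rule that a
// content script could inject via a <style> element.
function buildHidingCss(selectors) {
  if (selectors.length === 0) return "";
  return selectors.join(",\n") + " { display: none !important; }";
}

const css = buildHidingCss(["#ad-banner", ".sponsored"]);

// In a content script, the injection step would be:
//   const style = document.createElement("style");
//   style.textContent = css;
//   document.documentElement.appendChild(style);
```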
Same thing for redirection: (the current draft of) `declarativeNetRequest` supports redirect rules natively.
The `badfilter` option is resolved at compile time, so it doesn't matter.
The only thing missing is RegExp rules (and HTML filtering, if you want to count that). I don't think it'll be a problem, because neither uAssets nor I really use RegExp rules. On the other hand, overly broad RegExp rules from EasyList are constantly causing problems.
Option 2: Fork Chromium
I would assume the code for manifest v2 will eventually be removed, so forking will eventually become more than a simple flag flip. Maintaining a browser fork is really difficult in general, and vulnerabilities can be opened if you're not careful.
Another way is to use Electron to build a new browser. Unlike WebExtensions, Electron provides really powerful APIs, which are unlikely to go away. However, the amount of work to maintain such a browser is likely more than that of a fork; on the other hand, since Electron apps use JavaScript and not native code, vulnerabilities are less likely to occur.
Also, implementing Widevine (DRM) will be really difficult in either case.
Option 3: Make a native app
Where there's a shell, there's a way. I'm sad to see that there's no open-source native app that does more than DNS blocking.
A native app is the best way to get around the constantly changing browser APIs, and as a bonus, it'll probably work for all browsers and even non-browser apps.
I'm not sure how hard this will be, though; last time I tried, it didn't go so well.
I will probably go with option 1 or 3. As for the timeline, I don't think it'll happen within 12 months; right now the APIs are not released yet, and the phase-out period will likely take more than a year.
I think it's a good time to get started on option 3; if things go wrong, option 1 can be used as a fallback.
First, thanks for your very detailed answer. :+1:
The problem I see with option 3, if my understanding is correct, is that a native app would not be able to remove page elements the way cosmetic filters do?
To me a native app sounds like when I use AdAway on Android; I'm pretty sure I'm missing something here about how a native app would work 🤔 :question:
I would miss "dynamic filtering", since I use it occasionally to un-break filters that are good in general but cause problems on specific websites.
I don't think Firefox will disappear. It can lose users, but Google will always keep Firefox alive as a precaution to prevent a breakup of their business by the US government (like Microsoft did with Apple).
IMO if Google weakens their ad-blocking API, it could reinvigorate Firefox among power users and gain it 2-3% of users. When I migrated from Firefox to Chromium I dragged ~150 users with me, and those users today recommend Chrome/Chromium to their friends; I'm pretty sure I'm not alone.
But Mozilla management is crazy these days. They removed so many things/features that I migrated to Chromium, to stop suffering from the many memory leaks in Firefox. So it would not surprise me if they do some crazy thing in the future, like following the very same Chrome API, unless the management at Mozilla changes.
For the fork, I was thinking it was easier than you said it is to keep the old API, like some module you glue onto the main code when you build Chromium that doesn't require a lot of maintenance. I've seen Slimjet, a Chromium fork with an integrated ad blocker, so I was thinking it was "easy" to integrate into a fork.
Time will tell. :octocat:
a native app would not be able to remove page elements the way cosmetic filters do
Again, where there's a shell, there's a way. Native apps are (theoretically) able to do cosmetic filtering on their own, as they can patch everything the browser receives; however, I do agree that cosmetic filtering is easier to implement using extensions.
There's nothing stopping native apps from loading extensions into browsers, so if needed, the native app can have a companion extension.
sounds like when I use AdAway on Android
AdAway is a DNS manager; it is really limited in what it can do. I'm talking about native apps that can repack HTTPS requests.
but Google will always keep Firefox alive as a precaution to prevent a breakup of their business by the US government
Possibly.
But Google can also point to those 20+ Chromium forks and say their code is under a permissive license, and everyone is free to fork and compete.
For the fork I was thinking it was easier than you said
I never tried so I don't know for sure. Maybe you're right. If you have more experience with the Chromium code base maybe you can start a Chromium fork?
Actually, instead of trying to reinvent the wheel and failing miserably, I'll probably just take a wheel off the shelf.
I'll probably still need a JavaScript runtime though, as the RegExps in filters are JS RegExps. For a prototype it'll be fine; if it works out, I'll worry about optimization later.
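A tiny sketch of why a JS runtime makes this part easy: ABP-style `/.../` rules can be compiled straight to JS `RegExp` objects (the rules below are made up, and real filter parsing is much richer than this):

```javascript
// ABP-style regex rules are written as /pattern/; anything else here is
// treated as a plain substring match for simplicity.
function compileRule(rule) {
  if (rule.length > 1 && rule.startsWith("/") && rule.endsWith("/")) {
    const re = new RegExp(rule.slice(1, -1));
    return (url) => re.test(url);
  }
  return (url) => url.includes(rule);
}

function shouldBlock(url, rules) {
  return rules.map(compileRule).some((match) => match(url));
}

const blocked = shouldBlock("https://cdn.example/ads/banner.js", [
  "/\\/ads\\//",     // regex rule
  "doubleclick.net", // plain substring rule
]);
```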
Thanks for your complete and detailed answer, it's enlightening and it resolves the questions I had. 😄
So it would be a little bit like how AdGuard works. I never used AdGuard, but my understanding is that there is a component that works at the OS level and another part as an extension, for better efficiency.
If you have more experience with the Chromium code base maybe you can start a Chromium fork?
I don't have experience, but it's something I've been thinking about for 2-3 months, to be able to modify the Chromium GUI and keep it tight. Maybe I will give it a try this year; this issue adds some motivation to make this project a higher priority.
Regards :octocat:
Another way is to use Electron to build a new browser. Unlike WebExtensions, Electron provides really powerful APIs, which are unlikely to go away. However, the amount of work to maintain such a browser is likely more than that of a fork; on the other hand, since Electron apps use JavaScript and not native code, vulnerabilities are less likely to occur.
I don't really know much about Electron, but I remember this talk about building a web browser in Electron. The security engineer actually recommends not trying to do it. Their company tried it, and it didn't go so well.
Option 1: Use the new API If the new API is really weak, then I don't think Firefox can survive for long. If websites can more easily get around the weak APIs of Chrome, then website owners will have a motivation to push everyone to use Chrome. And soon Chromium-based browsers will probably become the only usable browsers.
Are you saying that websites will simply block Firefox users entirely if Firefox insists on keeping the existing webRequest API?
Option 3: Make a native app Where there's a shell, there's a way. I'm sad to see that there's no open-source native app that does more than DNS blocking. A native app is the best way to get around the constantly changing browser APIs, and as a bonus, it'll probably work for all browsers and even non-browser apps. I'm not sure how hard this will be, though; last time I tried, it didn't go so well.
For this to work, it would need to support HTTPS interception/rewriting, which, while it can be done, opens up a huge can of worms security-wise, especially if it is not done properly. It would also require a companion GUI app for creating and installing the custom SSL certificate used for said interception, as most people aren't going to be able to do it on their own using the standard command-line tools. Even worse, on Android, by default, apps will not trust user-installed root certificates (Chrome does, but many other browsers and apps do not) and will need to be patched (unless the device is rooted, in which case the certificate can be installed into the system certificate store, either directly or with a Magisk module).
@jspenguin2017 I have my answer about forking Chromium to bypass a weak API: if needed, it's possible, and Brave seems to already use this strategy:
A developer on the Brave team was explaining it on Reddit:
_Bat-chriscat (BAT Team):
It's worth noting that our Brave Shields (ad blocker) is not an extension; it is natively implemented. So extension API changes leave our shields unaffected.
Edit: We can always remove any code or update we don't like from the Chromium base we use. So even if this didn't just affect extensions but something deeper, we could just exclude it._
So if they already do that, maybe you could fork Brave to integrate Nano natively, bypassing the API to intercept the calls you need? 🤔
This way it would be future-proof, and the control would be granular enough to block everything. Users would just install the browser and use it. ☺️
Regards :octocat:
@jspenguin2017 You might want to checkout Proxydomo hosted here on github. https://github.com/amate/Proxydomo
It's a local proxy with HTTPS filtering capability. I've been using it to strip out, among other things, blacklisted page content before it even reaches the browser. It's a reverse-engineered version of the popular Proxomitron from the early 2000s.
There's also a recent separate effort called Proxomitron Reborn, but I think the author of that hasn't released the source she promised yet. Ongoing discussion of it is on the ancient prxbx forum here: https://prxbx.com/forums/showthread.php?tid=2331
Just to let users know: even the security guys at Mozilla seem to think it's a good idea to weaken the API for extensions: https://old.reddit.com/r/firefox/comments/aithmh/raymond_hill_creator_of_ublock_origin_ubo_and/eerce78/
Firefox seems to be just an inferior "Chrome clone" right now, with no real differentiation and no advantage, sadly.
Keeping a stronger, better API than Chromium could have been their Hail Mary, but hey, it's Mozilla as usual, AKA stupid.
Regards :octocat:
@onmyouji The security engineer actually recommends not trying to do it.
Yes, I'm aware that a browser based on Electron is hard to secure, but so is extending a browser using native code.
As I stated before, I have no plans to pursue this option. Thanks for the information though.
@nl255 Are you saying that websites will simply block Firefox users entirely if Firefox insists on keeping the existing webRequest API?
That's a possibility.
For this to work it would need to support HTTPS interception/rewriting which while it can be done opens up a huge can of worms security wise, especially if it is not done properly.
I'm aware of that; that's why I want to use a well-established project to handle the HTTPS repacking part.
If done right, HTTPS repacking can actually be stricter security-wise, as you can choose to drop support for weak encryption before major browser vendors do.
Security is hard though, so we definitely need to be careful about this.
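For example, in a Node-based proxy, the "stricter than the browser" policy could be as simple as pinning the TLS options used for upstream connections. This is only a sketch; `minVersion` and `rejectUnauthorized` are real Node.js `tls` options, but the surrounding proxy is assumed:

```javascript
// Outbound TLS options for the repacking proxy: refuse protocol
// versions below TLS 1.2 and never silently accept bad upstream certs.
const strictTlsOptions = {
  minVersion: "TLSv1.2",
  rejectUnauthorized: true,
};

// Usage inside the proxy (assumed), when dialing the real server:
//   const tls = require("tls");
//   tls.connect({ host: "example.com", port: 443, ...strictTlsOptions });
```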
It would also require having a companion GUI app
We'll worry about that part later. That's going to be easy compared to the rest.
Even worse, on Android by default apps will not trust user installed root certificates
I thought that applied only to apps that use certificate pinning. Either way, let's get it working for desktop Linux and Windows first; we'll worry about Android later.
Something that passed under the radar this autumn: Mozilla wants to follow the V3 manifest too: https://blog.mozilla.org/addons/2018/10/26/firefox-chrome-and-the-future-of-trustworthy-extensions/
So the comment of Mozilla on Reddit is logical: https://old.reddit.com/r/firefox/comments/aithmh/raymond_hill_creator_of_ublock_origin_ubo_and/eerce78
@mikhoul it's possible and Brave seems to already use this strategy
I knew it was possible, and I knew Brave was doing it. That doesn't change the fact that coding in native code comes with risks, especially when the native code interfaces with untrusted input (web traffic).
@joey04 (from other thread) But a browser fork is a massive effort.
It doesn't have to be. As you mentioned, the hard part is already taken care of by major tech companies and their paid developers. All we need is to insert our code at the right place and keep the fork up to date with upstream.
There are 2 problems though:

- The code that we insert probably has to be native code, which can introduce new vulnerabilities
- It's hard to get Widevine (DRM) working; also, Chromium depends on Google APIs a lot. You can get API keys for your personal use fairly easily, but if you distribute your build, the quota will run out pretty quickly

I won't say it's impossible, but it's hard enough that we want to explore other possibilities first. It's not necessarily a massive effort, but it'll probably be massive paperwork.
@nhantrn
Noted. Thanks for the information.
@jspenguin2017 commented on Jan 23, 2019 at 22:22 UTC-5:
it's possible and Brave seems to already use this strategy
I knew it was possible, and I knew Brave was doing it. That doesn't change the fact that coding in native code comes with risks, especially when the native code interfaces with untrusted input (web traffic).
@joey04 (from other thread)
But a browser fork is a massive effort.
It doesn't have to be. As you mentioned, the hard part is already taken care of by major tech companies and their paid developers. All we need is to insert our code at the right place and keep the fork up to date with upstream.
There are 2 problems though:
- The code that we insert probably has to be native code, which can introduce new vulnerabilities
- It's hard to get Widevine (DRM) working; also, Chromium depends on Google APIs a lot. You can get API keys for your personal use fairly easily, but if you distribute your build, the quota will run out pretty quickly
I won't say it's impossible, but it's hard enough that we want to explore other possibilities first. It's not necessarily a massive effort, but it'll probably be a massive paperwork.
I'm not sure how hard Widevine really is, since https://chromium.woolyss.com/ has offered builds in almost all flavors, with Widevine, for many years; if it were complicated they would not distribute it so widely.
But I understand that other APIs have quota limits or restrictions; I've never suffered from them, but like I said, I don't use sync.
I browsed quickly through https://chromium.woolyss.com/#google-api-keys and found it's relatively easy to install API keys if you really need them; it seems it's a thing you do once and forget about. 😄
I would really like a Chromium fork for power users on desktop, with a more compact GUI with less padding and, above all, with Nano integrated.
I know it's off-topic, but take a look at CentBrowser; they added a lot of features for power users. It is closed source, that's the downside, but I use it along with other forks.
Tomorrow I will ask the CentBrowser team what their take is on the V3 manifest and its restrictions; they are a small team (~3 people) but they are responsive and open, and I will report back.
Regards :octocat:
@jspenguin2017
Even worse, on Android by default apps will not trust user installed root certificates
I thought that's only for apps that use certificate pinning. Either way, let's get it working for desktop Linux and Windows first, we'll worry about Android later.
Nope, it applies to Nougat and above. Even worse, there is no reason to think that Chromium and Firefox won't do something similar, as they have already started heading that way with DNS over HTTPS (which also conveniently gives Google access to all your DNS requests, even if you run your own DNS server). So you might end up having to make either a fork or possibly a program loader (since DLL injection is blocked, that leaves a loader program that patches the browser in memory). But of course, with a loader, they could fight back by using various tricks to make pattern-based smart patching fail, possibly even including a Denuvo-style custom virtual machine.
Even with all the harmful decisions, Firefox is still the better choice. I think it is more likely that Firefox at least keeps the API somewhat better than Chrome. And even if not, there are also Firefox forks, for example https://github.com/intika/Librefox/issues, which aims to create an always up-to-date version with an automated build system.
Even with the potential issues mentioned earlier, I would think the native, standalone app would be the best choice. It will be far easier to maintain a fork that merely makes SSL interception easier than it would be to reimplement the existing webRequest API, especially if the browser code ends up being refactored to make that harder. Short of going proprietary or using heavy code obfuscation/encryption, it will always be easier to comment out a few functions regarding SSL certificate checks (or make them always return "true" even if the check failed) than it will be to keep the webRequest API working.
standalone app would be the best choice as it will be far easier to maintain a fork
I'm not sure about that; in the end you will need an extension to be able to select elements on the fly and create filters. Even AdGuard, which uses a standalone app, needs to have an extension, and the dev has already stated that such API changes will hurt AdGuard.
@joey04 (from other thread) But a browser fork is a massive effort.
It doesn't have to be. As you mentioned, the hard part is already taken care of by major tech companies and their paid developers. All we need is to insert our code at the right place and keep the fork up to date with upstream.
"All we need" eh? :-)
Kidding aside, I understand what you mean. Perhaps time-consuming is a more accurate term for a Chromium fork with the level of custom code we'd want for robust content filtering.
Just building a Chromium pull alone can take an hour or more depending on your CPU. And in the long run, there's the possibility of Google changing something internally that would impact an integrated custom filter engine. That could be a nightmare to deal with.
So I'm still of the opinion that modifying the Brave engine, which they generously share as MPL, is the best approach. It's already integrated into Chromium, and if Google breaks that interface in some way, the Brave devs will have to deal with it.
As for the other proposed idea, intercepting security certs sounds too much like Superfish. I don't think I'd be comfortable even with a benevolent FOSS one.
I'm not sure about that; in the end you will need an extension to be able to select elements on the fly and create filters. Even AdGuard, which uses a standalone app, needs to have an extension, and the dev has already stated that such API changes will hurt AdGuard.
It could be done without an extension; it would just be uglier, and depending on how it is done, would likely require customization for each browser and major browser revision. Most browsers have an "inspect element" right-click function which brings up the source for the element in the developer tools; that could then be OCRed via screen grab and compared to the page source captured by the standalone app. Or possibly by injecting custom JavaScript (AJAX?) code that communicates with a locally running server, to get around the lack of sufficient APIs, whenever the option to choose an element to block is selected. Or, worst case, the user would have to find the element using inspect element and create the filter on their own (meaning only advanced users who know HTML would be able to block elements on their own). Though that could probably be worked around by using a fork that has DLL injection detection/prevention disabled, combined with an injected DLL (or equivalent for other operating systems), meaning you would have to install Nanofox or Nanonium to be able to create your own element filters through a point-and-click interface.
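A sketch of the "injected JavaScript talking to a local server" variant (the endpoint, port, and selector heuristic below are all hypothetical; a real element picker would build far more precise selectors):

```javascript
// Hypothetical snippet the proxy could inject into pages: on alt+click,
// compute a rough CSS selector for the clicked element and POST it to a
// locally running filter server.
function roughSelector(el) {
  if (el.id) return "#" + el.id;
  const cls = (el.className || "").trim().split(/\s+/).filter(Boolean);
  return el.tagName.toLowerCase() + (cls.length ? "." + cls.join(".") : "");
}

// In the injected page script (assumed local endpoint):
// document.addEventListener("click", (e) => {
//   if (!e.altKey) return;
//   fetch("http://127.0.0.1:8123/pick", {
//     method: "POST",
//     body: roughSelector(e.target),
//   });
// });
```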
It could be done without an extension; it would just be uglier, and depending on how it is done, would likely require customization for each browser and major browser revision. Most browsers have an "inspect element" right-click function which brings up the source for the element in the developer tools; that could then be OCRed via screen grab and compared to the page source captured by the standalone app.
At this point it would be a lot more effort and time than forking Chromium... 😵
Also, most users are not interested in opening dev tools to hide an element; it's not convenient at all. I do that myself, but only for really complex ads; 99% of the time I use the selector mode of Nano.
The goal of @jspenguin2017 seems to be to lower the load as much as humanly possible, so I'm really not sure.
Also, @joey04 has a really good argument: not everybody would be willing to install a root certificate for an ad blocker. I don't even trust AVs to install a certificate.
Regards :octocat:
Sorry guys, I'm quite short on time these days. I'll respond to your comments eventually, but it can take some time.
Don't get too worked up over this; it'll be fine either way. Worst case, I'll roll a native app and bypass all the arbitrary WebExtension restrictions.
Note that I have no plans to fork Chromium; securing a browser is really difficult, and I don't want to risk it unless there's no other option. If anyone wants to start a Chromium fork, feel free to do so. Test out your ideas and find potential issues; that's going to benefit everyone here.
@mikhoul if it were complicated they would not distribute it so widely
I would guess the paperwork is a one-time thing. So even if it's really complicated, once you get it through you can distribute your browser widely without issues.
it's relatively easy to install API keys
It's not bad, but I'll still consider it a problem if everyone who downloads the browser has to do it.
@nl255 DNS over HTTPS
Since we are going to repack HTTPS, we can probably inspect and alter DoH responses just fine.
Desktop browsers do accept custom certificates, and I don't think that's going to change anytime soon. There's no need to patch browsers in any way... for now.
If you're talking about Android, then that's another story; honestly, I'm not too sure what to do. Again, let's focus on desktop first.
@mikhoul even AdGuard, which uses a standalone app, needs to have an extension
Yeah, it's probably going to be kind of tricky to implement cosmetic filters with just a native app. Script snippet injection is probably better handled by extensions too.
Parsing and altering server responses is going to be hard and heavy, not to mention CSP, which can sometimes get in the way.
@joey04 Perhaps time-consuming is a more accurate term for a Chromium fork
By "all we need" I meant that it's kind of a one-time task. Once everything is in the right place, the maintenance task should be minimal.
there's the possibility of Google changing something internally that would impact an integrated custom filter engine
If done right, I don't think it'll be a problem. We can hook into `declarativeNetRequest` and process filters on the spot, synchronously, with native code; as long as there's some sort of filtering system left, we should be good.
So I'm still of the opinion that modifying the Brave engine, which they generously share as MPL
I mean, nothing is stopping you from getting started on that.
Personally, I'll think twice before touching Brave's filtering engine. It is native code and really over-optimized. If done wrong, you'll break something at best, or open a vulnerability at worst.
I don't think I'd be comfortable with a benevolent FOSS one.
But you're fine with a custom browser? I think a custom browser is riskier than a MITM proxy. I plan on writing the proxy in JavaScript or Python, which are memory-safe, so as long as the code is not malicious, you'll likely be safe. To patch a browser, we'd need native code, which can open vulnerabilities that can be exploited by websites you visit.
@nl255 It could be done without an extension
Considering the native app can control everything the browser receives, it can definitely be done without an extension. However, it'll be horrifyingly hard, as we'd have to patch websites on a case-by-case basis.
@mikhoul I don't even trust AV to install a certificate
Considering AVs usually come with companion kernel drivers, you're probably better off just not using an AV.
If your fear is that installing a root certificate can be dangerous, then using a custom browser is likely more dangerous.
My goal is to find the best solution to the problem, not to please the most users. I care about security, and I'll make decisions based on my understanding of security instead of common belief.
@jspenguin2017 Thanks, and I understand you can be short on time, so no worries.
Right now the Brave team seems really interested in keeping the webRequest API as it is, without neutering it. They stated it clearly on Reddit, and it seems to be the same for CentBrowser, so I'm pretty confident that at least one fork will keep the webRequest API backward-compatible with Manifest V2 as it is today.
The main problem right now is that none of those devs (or anyone else) really knows what will be in Manifest V3 in the end, so it's hard for them to start planning how to keep/protect the API as it is, and how to do it technically speaking.
Also, I could be wrong, but I don't think Mozilla/Firefox will keep the webRequest API as it is today in the long run; they will follow Google's path "in the name of security"... They already did it with XUL, and they promised to port old APIs and then said no, it would breach security. In short, I sadly don't trust Mozilla at all.
I will follow other Chromium forks closely and report back here about what they will do with the webRequest API once Google has set V3 in stone.
Wishing you a nice day! 😄
Test out your ideas and find potential issues, that's going to benefit everyone here.
We'll have to agree to disagree on whether a browser or MITM approach is best, but I certainly agree that at this preliminary stage it's good that different people are exploring various options. I've decided to keep an open mind because there are too many TBDs at this point to put all my eggs in one basket.
I've started looking into Chromium options a bit more and made a lengthy post on an ungoogled-chromium thread. I won't be posting more about it here.
@mikhoul I'm pretty confident that at least a fork will keep the webrequest API
That's great news!
@joey04 We'll have to agree to disagree on whether a browser or MITM approach is best
As you said, don't put all your eggs in one basket. It's actually good if we disagree; more diversity means more resilience in times of crisis. If we end up starting 2 different projects, we'll have more things to work with the next time something unexpected comes up.
ungoogled-chromium
AFAIK, that project aims at implementing the Chromium features that depend on Google APIs without using the real Google APIs. Chromium runs just fine without Google APIs, it will just be missing some features, and that project tries to restore those features. Overall, I think that project is rather unrelated to what we are trying to do.
@joey04 Just building a Chromium pull alone can take an hour or more depending on your CPU
You'll probably want 16 GB or more of RAM, but for CPU, an i5 is enough. You also need 100 GB of hard disk space to store the build cache. So yeah, you need a decent computer to build Chromium, but worst case you can produce the initial build overnight.
Note that no matter how bad your computer is, as long as it is able to produce the first build, subsequent builds should be fairly fast thanks to the build cache.
@mikhoul just to add more info: it seems Firefox confirmed that they will not follow Google's lead on ad blockers https://www.reddit.com/r/firefox/comments/ajlhip/firefox_product_manager_on_twitter_firefox_isnt/
@mikhoul just to add more info: it seems Firefox confirmed that they will not follow Google's lead on ad blockers
Sadly, it's just a Mozilla employee giving his personal opinion and nothing more. The other guy who made the first comment about the API is in charge of security at Mozilla/Firefox, so his comments carry a lot more weight, since he is part of the team that decides what is secure or not.
Don't get me wrong, I would be really happy if Mozilla did not neuter any API in V3; it would make one more alternative for everybody to use.
On another point, as @gorhill has said, there is another serious issue with the V3 manifest: the replacement of background pages: https://github.com/uBlockOrigin/uBlock-issues/issues/338#issuecomment-457609180 . It's also a serious issue for more than 50% of existing extensions, IMO.
Regards :octocat:
Sadly, it's just a Mozilla employee giving his personal opinion
He is a Firefox product manager and "responsible for the browser roadmap."
the other guy who made the first comment about the API is in charge of security at Mozilla/Firefox, so his comments carry a lot more weight, since he is part of the team that decides what is secure or not
Um, no. Even setting that aside, the PM said "not happening," while the sec guy said "not bad." The latter isn't a statement of intent, while the former is.
Interesting wrinkle. The tweet by Asa Dotzler, product manager at Mozilla, reads: "What makes you say Firefox is doing anything. This announcement was about Chrome, not Firefox."
I wouldn't read too much into it, and actually it's not that encouraging. I agree with the top-rated comment on the Reddit thread:
that's not the outright denial I would like to hear from Mozilla. Their marketing group should be working overtime getting the message out that Firefox will not be crippling ad-blockers. Mozilla is being handed a great opportunity, they should use it to their advantage.
There's still hope Mozilla doesn't screw the pooch here, but I'm skeptical about the long run. I don't trust them.
For now, ESR remains my primary browser. For much more info on my setup and thoughts on browser options, see this thread.
@joey04
Librefox looks interesting. I hope it can get beyond the first steps. It aims to create a security- and freedom-of-choice-focused fork that is always up to date with the current version by automating the patching & building process.
I never heard of Librefox before, but it appears to be a rebuild with very customized settings. Unfortunately, as a rebuild its capabilities are completely tethered to Mozilla. Not a good thing for the long run probably.
It all depends on how much development work can be put in, in the end. Unless you write a completely new browser from the ground up, you always have to deal with what upstream is doing. The key question is how well the patching and building can be automated. If you can fix a stupid decision from Mozilla with a simple patch that keeps working for years and then only needs minor tweaks to keep working again, then even a single developer could offer an up-to-date fork. If you have to rebase every change every major version, then it is hardly doable. It all depends on intelligent merging and development tools. Coding the actual differences between Librefox and vanilla Firefox in the form of patches is fairly minor compared to the code base, and most of it will just be to revert the damage that Mozilla has done and will be doing.
I'm starting to prototype a native app that emulates WebExtension APIs, let's see how it goes.
I'm starting to prototype a native app that emulates WebExtension APIs, let's see how it goes.
That's a good thing; let us know how the progress goes!
Regards :octocat:
@jspenguin2017
By "native app", do you mean a standalone proxy program, or do you mean native code intended to be added to the browser itself, similar to what Brave does with its ad blocking? Also, you might want to take a look at the mitmproxy project if you haven't already.
@nl255 I mean a proxy server. The current plan is Node + Electron / Puppeteer. We'll see.
@elypter the key question is how well the patching and building can be automated ... it all depends on intelligent merging and development tools.
What does Librefox use that other projects don't?
I've not used Mercurial (or even Git in a non-trivial way), but I was under the impression it could be readily integrated with your diff tool of choice.
Hi,
I really like Nano more than uBO, but I was asking myself what your direction will be if the proposal is adopted and the current API is neutered.
My understanding is that Chrome would then (for me) become inferior to Firefox for ad blockers (unless Firefox implements the same restrictions, which I doubt).
If the changes happen, they can happen within the next 12 months, which is relatively fast, so the solutions/alternatives I've seen are:
Maybe there are other solutions that I don't foresee. 🤔
What is your take on this @jspenguin2017 ?