Modern browsers have grown so large and complex that, for the foreseeable future, it is almost impossible for a new vendor to enter the browser space. To some extent, this stifles innovation and vitality.
There have long been conspiracy theories claiming that browsers have deliberately been turned into a moat, a form of monopolistic behavior. I don't believe these theories, but the barrier to entry they point at is objectively real.
Therefore, this issue attempts to explore a new approach: starting from the Web's standards layer, can we prevent browser development from ossifying over the next 10 to 20 years?
The preliminary idea is to break browser functionality down into finer-grained feature sets:
New features are typically added only to the latest feature set.
Feature sets may partially overlap.
A new feature set may even introduce breaking changes relative to features in older sets.
Source code must explicitly declare which feature sets it uses (see the sketch after this list).
A browser implementation does not need to cover every feature set; it usually only needs to implement the latest one natively, while older sets can be served through emulation.
The maintenance cycle of a feature set could be 5 to 15 years, or even longer.
Each feature set can typically operate independently, so different browser vendors can mix, match, and innovate on top of them.
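To make the declaration idea concrete, here is a minimal TypeScript sketch of how a browser might resolve a page-level feature-set declaration. Everything in it is hypothetical: the declaration format, the `web-20xx` set names, and the resolver itself.

```typescript
// Hypothetical feature-set declaration a page might carry, e.g. in a
// <meta name="feature-set" content="web-2035"> tag or an HTTP header.
type FeatureSetId = string; // e.g. "web-2015", "web-2025", "web-2035"

interface BrowserCapabilities {
  native: Set<FeatureSetId>;     // sets this binary implements directly
  emulatable: Set<FeatureSetId>; // sets servable via an emulation layer
}

type Resolution =
  | { kind: "native" }
  | { kind: "emulated"; layer: FeatureSetId }
  | { kind: "unsupported" };

// Decide how to serve a page that declares a given feature set.
function resolveFeatureSet(
  declared: FeatureSetId,
  caps: BrowserCapabilities,
): Resolution {
  if (caps.native.has(declared)) return { kind: "native" };
  if (caps.emulatable.has(declared)) return { kind: "emulated", layer: declared };
  return { kind: "unsupported" };
}

// A minimal future browser: one native set, older sets via emulation.
const browser: BrowserCapabilities = {
  native: new Set(["web-2035"]),
  emulatable: new Set(["web-2015", "web-2025"]),
};

console.log(resolveFeatureSet("web-2015", browser)); // { kind: "emulated", layer: "web-2015" }
```

The useful property is the last branch: a feature set the binary neither implements nor can emulate becomes an explicit, declared incompatibility rather than a silent rendering difference.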
For example, suppose a website has not been updated in 20 years and the old feature set it relies on has since been retired. We can still render it through emulation, because hardware performance is usually more than sufficient for the needs of a 20-year-old site. Where emulation cannot meet a site's performance requirements, the browser would keep native support for that feature set.
We can be almost certain that devices ten years from now will be several times faster than today's. Even as Moore's Law gradually fades, chip engineers can, for the foreseeable future, keep delivering more powerful chips through architectural innovation.
A concrete example already exists today: x86 emulators compiled to WASM can boot Windows 2000, whose mainstream support ended in 2005. So if a 20-year-old website is still in demand, a browser vendor could render it compatibly through a built-in emulator. This example is admittedly extreme; emulation does not have to start at the operating-system level, since the goal is merely compatibility. The point is that many of the technical burdens we carry today can be gradually shed through emulation.
With powerful technologies like WASM and WebGPU, we could, in theory, smoothly emulate and run 20-year-old websites even on hardware from 10 years ago.
Emulating old standards on top of new ones means a browser vendor can drop legacy technologies from its minimal binary release outright, while different vendors share the same compatibility layer.
From a software engineering perspective, this could meaningfully reduce the difficulty of building a browser, making it more likely that new technical architectures get adopted.
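To illustrate the shared-compatibility-layer point, here is a hedged TypeScript sketch of the narrow boundary such a layer might expose. The interface, the registry URL, and the export shape are all assumptions; only `WebAssembly.instantiateStreaming` is a standard API.

```typescript
// Hypothetical boundary a shared, vendor-neutral compatibility layer
// might expose. Several vendors could link the exact same layer while
// their minimal binaries implement only the newest feature set.
interface LegacyCompatLayer {
  // Which retired feature set this layer emulates.
  readonly featureSet: string;
  // Parse and render a legacy document onto a surface provided by the
  // host browser (e.g. a WebGPU texture).
  render(html: string, surface: unknown): Promise<void>;
}

async function loadCompatLayer(featureSet: string): Promise<LegacyCompatLayer> {
  // Hypothetical shared registry; instantiateStreaming is standard.
  const url = `https://compat.example.org/${featureSet}.wasm`;
  const { instance } = await WebAssembly.instantiateStreaming(fetch(url), {});
  // Adapt the raw WASM exports to the typed interface (details omitted).
  return instance.exports as unknown as LegacyCompatLayer;
}
```

Because the boundary is small, the layer can evolve independently of any single vendor's binary.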
Expanding on this idea, imagine a technology called CSS-X that let you customize the parsing pipeline via JS/WASM, combined with intervention hooks for style, layout, painting, and compositing (similar to CSS Houdini's worklets). In theory this would make CSS3 itself emulatable, meaning browser vendors might no longer need to implement the CSS standards directly, and the community could boldly explore new syntax sets. Frameworks like Flutter and Compose would no longer need to fall back on Canvas rendering on the web platform, while still preserving accessibility support.
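CSS Houdini's paint worklet already demonstrates, for painting alone, the hook-based model that CSS-X would generalize: author code registers a painter, and the engine invokes it during rendering. A minimal sketch of the existing API (the painter name and drawing logic are mine; the context is typed loosely because the worklet's `PaintRenderingContext2D` is a subset of the 2D canvas API):

```typescript
// paint-worklet.ts -- runs inside the worklet scope, where the engine
// provides registerPaint (CSS Houdini Paint API).
declare function registerPaint(name: string, painter: { new (): unknown }): void;

registerPaint("diagonal-stripe", class {
  // Called by the engine whenever the element needs repainting.
  paint(ctx: any, size: { width: number; height: number }) {
    ctx.strokeStyle = "black";
    ctx.beginPath();
    ctx.moveTo(0, size.height);
    ctx.lineTo(size.width, 0);
    ctx.stroke();
  }
});

// main.ts -- register the worklet module with the engine.
// (Cast because default TypeScript DOM typings may omit paintWorklet.)
(CSS as any).paintWorklet.addModule("paint-worklet.js");
```

A stylesheet then opts in with `background-image: paint(diagonal-stripe);`. CSS-X would extend this kind of hook from painting alone to parsing, style, layout, and compositing.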
I see a similar idea in JS0/JSSugar: let browser vendors focus on the lower layers, integrating tightly with the operating system (or even being the operating system), while the community takes responsibility for evolving the standards on top. This would let the technology evolve far more vibrantly, ultimately benefiting a broad range of users.
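The JS0/JSSugar split is essentially the division of labor transpilers already practice: tooling compiles convenient "sugar" down to a small core the engine must implement. As a hedged illustration, using the (still unshipped) TC39 Hack-style pipeline-operator proposal as the sugar:

```typescript
// Hypothetical "JSSugar" source, using the pipeline-operator proposal:
//   const result = "  42  " |> %.trim() |> parseInt(%, 10);
//
// Core "JS0" a toolchain might emit, which is all the engine must run:
const _step1 = "  42  ".trim();
const result = parseInt(_step1, 10);
console.log(result); // 42
```

The engine only ever sees the lower block, so its surface area stays small while the syntax above it can keep evolving.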
Developers need not worry about emulators themselves: the browser would automatically download and install the necessary emulators from trusted sources, ensuring that websites render correctly (a sketch follows).
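As a sketch of what "trusted sources" might mean mechanically: the browser fetches the emulator from a registry and refuses anything that fails an integrity check. The manifest shape and registry are hypothetical; `fetch`'s `integrity` option and `WebAssembly.compileStreaming` are standard APIs.

```typescript
// Hypothetical manifest the browser might consult for a retired set.
interface EmulatorManifest {
  featureSet: string; // e.g. "web-2015"
  url: string;        // where the WASM emulator lives
  sha384: string;     // expected subresource-integrity hash
}

// Download and compile an emulator; the fetch rejects if the body
// does not match the declared integrity hash.
async function fetchEmulator(m: EmulatorManifest): Promise<WebAssembly.Module> {
  const response = await fetch(m.url, { integrity: `sha384-${m.sha384}` });
  return WebAssembly.compileStreaming(response);
}
```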