Open FlameSoulis opened 11 months ago
It's something we could add. There is CEF, which uses chromium and is relatively easy to integrate.
The bigger problem here is synchronization. Synchronizing the state of arbitrary web applications is pretty much impossible, which means we'd likely need a particular user driving and rendering the page and then streaming it to the others as a video feed, which is a relatively big piece of functionality.
This means the ability to encode video would need to come as a prerequisite before we can even consider this.
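The host-driven model described above (one user renders, everyone else receives a stream) can be sketched roughly like this. This is purely illustrative: all class and method names are made up, and the encoder is a stub, since the actual video-encode pipeline is exactly the missing prerequisite.

```python
from dataclasses import dataclass, field

@dataclass
class Frame:
    """A single rendered browser frame (stubbed; CEF would supply real pixels)."""
    seq: int
    pixels: bytes

class StubEncoder:
    """Placeholder for the missing video-encode step (e.g. H.264/VP9)."""
    def encode(self, frame: Frame) -> bytes:
        # A real encoder would compress here; we just tag the payload.
        return b"enc:" + frame.pixels

@dataclass
class HostBrowserSession:
    """One user ('host') drives and renders; everyone else gets a video feed."""
    encoder: StubEncoder = field(default_factory=StubEncoder)
    viewers: list = field(default_factory=list)
    seq: int = 0

    def subscribe(self, viewer: list) -> None:
        self.viewers.append(viewer)

    def on_paint(self, pixels: bytes) -> None:
        """Called whenever the host's browser repaints (CEF OnPaint-style)."""
        self.seq += 1
        packet = self.encoder.encode(Frame(self.seq, pixels))
        for viewer in self.viewers:
            viewer.append(packet)  # in reality: send over the network

# Usage: one host, two viewers receiving identical encoded frames.
session = HostBrowserSession()
a, b = [], []
session.subscribe(a)
session.subscribe(b)
session.on_paint(b"frame-1")
print(a == b == [b"enc:frame-1"])  # → True
```

The point of the sketch is that every viewer sees byte-identical output, so "sync" falls out for free; the cost is moved entirely into the encode/stream step.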
I'll add a comment since it might be a related issue. I feel that WebRTC or some other low-latency protocol should be used to deliver the video stream. Although it's not a browser itself, it can achieve low-latency delivery (this would also eliminate the need for kokolive or other low-latency RTSP servers). Is this included in the steps to implement the browser? (Or should I create it as a separate issue?)
> It's something we could add. There is CEF, which uses chromium and is relatively easy to integrate.
Sounds like a new daemon/service would be running on the side sort of like YT-DLP. Neat!
As for syncing, I think you'd be one of the first to figure it out; other systems that offer embedded browsers never really achieved it. Though, at the same time, it might be advisable to include a 'non-synced' mode for situations where the service host wouldn't need to update everyone, such as a log-in screen or a web-based menu system.
I second a non-synced/non-streamed mode, for games like Jackbox. I'd also like to propose a mode that syncs scroll, zoom, address, and so on, but still renders locally, for use with static webpages that wouldn't vary between users.
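For the render-locally-but-sync-the-viewport mode proposed here, the replicated state could be as small as a single message. A sketch, with made-up field names, assuming JSON as the wire format:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class ViewportSync:
    """Hypothetical per-page state replicated to every user; each client
    still renders the page locally with its own browser engine."""
    url: str
    scroll_x: float
    scroll_y: float
    zoom: float

    def to_message(self) -> str:
        return json.dumps(asdict(self))

    @staticmethod
    def from_message(raw: str) -> "ViewportSync":
        return ViewportSync(**json.loads(raw))

# Round-trip: the driving user broadcasts, a viewer applies the same viewport.
state = ViewportSync("https://example.com", 0.0, 120.0, 1.25)
assert ViewportSync.from_message(state.to_message()) == state
```

This keeps bandwidth tiny compared to streaming video, at the cost of only working when every client's local render of the page is actually identical.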
I don't really see much benefit in doing this in a non-synced way - we'd only do that locally in your dash and you can already use the Desktop tab there to just access your normal web browser.
Syncing scroll, zoom, address and such is not sufficient to achieve sync - the webpages are not guaranteed to be in the same state. Consider, for example, pages that expand or change size dynamically based on user input, which is not synced. This would leave them actually "desynced" at times, which would be confusing or even detrimental: because the page is forced to show a different part to each user, one user scrolling to see something will move the thing another user wants to see out of their view.
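The desync described here is easy to model: with an identical synced scroll offset, a collapsible section that one user expanded locally shifts everything below it out of that user's view. A toy illustration (element names and heights are arbitrary):

```python
def visible_elements(layout: list, scroll: int, viewport: int = 100) -> list:
    """Return which elements fall in view, given (name, height) pairs in
    document order and a scroll offset shared by all users."""
    out, y = [], 0
    for name, height in layout:
        if y < scroll + viewport and y + height > scroll:
            out.append(name)
        y += height
    return out

# Same synced scroll offset, but user B expanded a collapsible section:
page_a = [("header", 50), ("section", 40), ("footer", 60)]
page_b = [("header", 50), ("section", 400), ("footer", 60)]  # expanded locally
print(visible_elements(page_a, 80))  # ['section', 'footer']
print(visible_elements(page_b, 80))  # ['section']
```

User A sees the footer; user B, at the exact same "synced" scroll position, does not - which is the confusing partial desync the comment warns about.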
Is there an issue open for video encode? It'll open up possibilities for more than just this, so I'd like to upvote that if it exists.
Is your feature request related to a problem? Please describe.
Viewing web content in-world is usually only possible via the Desktop viewer or some intense ProtoFlux sorcery. A Canvas component that can link to a webpage could enable exciting functions and ideas, including easier access to web-related services such as Resonite's support page.
Describe the solution you'd like
A potential solution would be to add a new component for UI systems that renders a webpage onto the canvas' surface.
Describe alternatives you've considered
The only other solution I can think of is to rely on some kind of proxy service that renders a webpage into a PDF or image, which is then loaded into the world as a resource.
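That proxy-service alternative could be as thin as a wrapper around headless Chromium's screenshot mode. A sketch only - it assumes a `chromium` binary is on the PATH (the name varies per platform), and the function returns the argv so it can be inspected without a browser installed:

```python
import subprocess
from pathlib import Path

def render_page_to_image(url: str, out: Path,
                         width: int = 1280, height: int = 720,
                         run: bool = False) -> list:
    """Build (and optionally run) a headless-Chromium command that renders
    `url` to a PNG at `out`."""
    argv = [
        "chromium",  # assumption: binary name/path varies per platform
        "--headless",
        f"--window-size={width},{height}",
        f"--screenshot={out}",
        url,
    ]
    if run:
        # Would fail on machines without Chromium; kept opt-in for that reason.
        subprocess.run(argv, check=True, timeout=30)
    return argv

argv = render_page_to_image("https://example.com", Path("page.png"))
print(argv[0], argv[-1])  # chromium https://example.com
```

Note this only yields a static image per request - no interaction, no video - so it sidesteps the sync problem entirely but also most of the use cases.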
Additional Context
This does bring up the unfortunate question of which backend renderer to use, such as Gecko, Chromium, or some other engine. It also raises the potential for web-engine-based exploits, which have become increasingly common, to haunt Resonite's systems.