JetBrains / compose-multiplatform

Compose Multiplatform, a modern UI framework for Kotlin that makes building performant and beautiful user interfaces easy and enjoyable.
https://jetbrains.com/lp/compose-multiplatform
Apache License 2.0

LWJGL Integration #652

Open smallshen opened 3 years ago

smallshen commented 3 years ago

Since Compose uses Skia, it should be possible to use Jetpack Compose with LWJGL. That would help build UIs in LWJGL applications, such as UI in Minecraft mods.

https://github.com/semoro/MCCompose https://github.com/smallshen/JetpackComposeMinecraft

Those two projects are usable but not perfect: they don't work with the newest version, can't load on Windows, etc.

It would be good to have official support with LWJGL integration.

olonho commented 3 years ago

Contributions are welcome in this direction.

igordmn commented 2 years ago

Check the example. 80-90% of features work.

Officially we still don't support LWJGL.

smallshen commented 2 years ago

Thanks for the work. I already have Compose working in Minecraft; now I'm waiting to get rid of the AWT events.

smallshen commented 2 years ago

I ran into an issue with the cursor position while trying to implement more things with OpenGL. https://github.com/JetBrains/compose-jb/issues/652

smallshen commented 2 years ago

The cursor (caret) in the text field was not working.

It is fixed by providing a LocalWindowInfo (androidx.compose.ui.platform.WindowInfo).
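For reference, a minimal sketch of that workaround, assuming the desktop WindowInfo interface where only isWindowFocused must be overridden (exact members vary across Compose versions). Without an AWT window, nothing ever reports focus, so the caret never blinks; a WindowInfo that always claims focus works around that:

```kotlin
import androidx.compose.runtime.Composable
import androidx.compose.runtime.CompositionLocalProvider
import androidx.compose.ui.platform.LocalWindowInfo
import androidx.compose.ui.platform.WindowInfo

// A WindowInfo that always reports the (non-existent AWT) window as focused,
// so text-field carets behave as if the window had focus.
private val alwaysFocusedWindowInfo = object : WindowInfo {
    override val isWindowFocused: Boolean get() = true
}

@Composable
fun WithFakeWindowFocus(content: @Composable () -> Unit) {
    CompositionLocalProvider(LocalWindowInfo provides alwaysFocusedWindowInfo) {
        content()
    }
}
```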

akurasov commented 2 years ago

Maybe you could share a bit more info on your solution, so others could use it too?

jsixface commented 2 years ago

Question: I see the LWJGL integration example uses OpenGL. Is it possible to render with OpenGL ES instead? I saw that Skia supports rendering to OpenGL ES through ANGLE. If it is possible, what steps would that take?
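In principle, Skia binds to whatever GL-compatible context is current on the calling thread, and ANGLE exposes a GL-like context through EGL, so the route would be to make the ANGLE/EGL context current before creating Skia's DirectContext. A hedged sketch (the EGL call is a placeholder for your own EGL bindings, not part of Skiko):

```kotlin
import org.jetbrains.skia.DirectContext

// Skia picks up the GL-compatible context that is current on this thread.
// With ANGLE, make the EGL context current first (via your own EGL bindings),
// then let Skia treat it like desktop GL:
fun createSkiaContextOverAngle(): DirectContext {
    // eglMakeCurrent(display, drawSurface, readSurface, angleContext)  // <- your EGL code
    return DirectContext.makeGL()
}
```

Whether this works end-to-end with the Compose desktop rendering stack (rather than raw Skiko) is a separate question; I haven't verified it.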

hakanai commented 1 year ago

Here's a more cursed example based somewhat on the LWJGL example which shows what you have to go through to render to an offscreen buffer. I got to the point where it renders, but handling input looks like it will be way too hard without putting a lot of time into research.

The other use case I have for this sort of integration is pretty much the exact opposite - there's an application I want to write where the main UI is all Compose, but then I want a preview which is a separate OpenGL context. Maybe I can fake it by copying the texture to an image and displaying the image, but the less copies I can get away with, the better.
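For the "copy the texture into the UI" fallback above, one way to keep the copy count low is to let Skia draw directly into a GL framebuffer you created yourself, rather than reading pixels back. A rough Skiko-level sketch (types from org.jetbrains.skia; the FBO id is assumed to come from your own glGenFramebuffers setup, and exact signatures may differ between Skiko versions):

```kotlin
import org.jetbrains.skia.BackendRenderTarget
import org.jetbrains.skia.ColorSpace
import org.jetbrains.skia.DirectContext
import org.jetbrains.skia.FramebufferFormat
import org.jetbrains.skia.Surface
import org.jetbrains.skia.SurfaceColorFormat
import org.jetbrains.skia.SurfaceOrigin

// Wrap an existing GL framebuffer (created via LWJGL) as a Skia Surface,
// so Skia/Compose content lands in the FBO's texture with no CPU readback.
fun wrapFramebuffer(context: DirectContext, fboId: Int, width: Int, height: Int): Surface {
    val target = BackendRenderTarget.makeGL(
        width, height,
        /* sampleCnt = */ 0,
        /* stencilBits = */ 8,
        /* fbId = */ fboId,
        /* fbFormat = */ FramebufferFormat.GR_GL_RGBA8
    )
    return Surface.makeFromBackendRenderTarget(
        context, target,
        SurfaceOrigin.BOTTOM_LEFT,
        SurfaceColorFormat.RGBA_8888,
        ColorSpace.sRGB
    ) ?: error("Skia could not wrap FBO $fboId")
}
```

The attached texture can then be sampled by the other GL context (with context sharing) or handed to the overlay API without an intermediate image copy.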

daocaoren123 commented 1 year ago

The problem is probably the same as with the desktop application I'm currently experimenting with, where I'm unable to combine the Compose UI and the OpenGL render data.

smallshen commented 1 year ago

@jsixface @hakanai Rendering Compose inside Minecraft (which uses LWJGL) is possible: https://www.youtube.com/watch?v=DOSsVuRlzOI

But compose performance is not good enough for games when you have a complex UI. It's probably enough for small screens or a to-do list.

  1. The current desktop ComposeScene re-runs layout after every key/mouse input event (older versions didn't), which blocks the thread.
  2. State management (updating a state from another thread or concurrently) is limited to one thread.
  3. No async layout: the layout phase blocks rendering. In my Minecraft example the UI content changes all the time, so it needs a re-render every frame. Combined with problems 1 and 2, this blocks the main (render) thread a lot.
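The cost in point 3 can at least be skipped on frames where nothing changed; desktop ComposeScene exposes hasInvalidations() for exactly this. A language-only sketch of the frame-skip pattern (SceneStub here is a stand-in for ComposeScene, not the real API):

```kotlin
// Stand-in for desktop ComposeScene: tracks whether the UI needs redrawing.
class SceneStub {
    private var dirty = true
    fun invalidate() { dirty = true }                 // state change arrived
    fun hasInvalidations(): Boolean = dirty
    fun render(): Boolean {                           // true if work was done
        if (!dirty) return false
        dirty = false                                 // layout + draw happen here
        return true
    }
}

// Game loop: only pay the Compose layout/draw cost on frames that changed.
fun runFrames(scene: SceneStub, frames: Int): Int {
    var rendered = 0
    repeat(frames) {
        if (scene.hasInvalidations() && scene.render()) rendered++
    }
    return rendered
}
```

This doesn't help a HUD that genuinely changes every frame, but it avoids paying layout for static screens.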

Some rough, not-so-accurate benchmarks:

Minecraft 1.19.2 (Fabric, Sodium, Iris, Faithful resource pack, playing on hypixel skywars)

UI is something like in this video https://www.youtube.com/watch?v=UMNsio6ah0I (with chat hud rewrite in compose)

the blur effect is also written in compose and skia

No Skia, no compose: 100-150 fps

Skia only, no compose: 100-130 fps

No Skia, compose only(states, effects): 45-90 fps

Full Compose: 35-80 fps

Multiple factors make Compose not really ideal for games. If Noria were open source or public, I would consider using it in the game instead, based on Fleet's performance.

hakanai commented 1 year ago

In my case, the comparison would probably have to be between a full app running in a normal window and then being injected in as an overlay using XSOverlay or similar, vs. an app running compose and throwing the texture directly at the overlay API.

Both run at about the same speed as far as I can tell from inside VR.

Throwing the texture directly at the overlay API should use less CPU overall, though: with a window, the framework still has to do a framebuffer swap to display the new content, and then some other application has to grab that window and inject it as a texture into the overlay.

Only issue is, getting input to work for the overlay is way too hard. :(

okushnikov commented 3 weeks ago

Please check the following ticket on YouTrack for follow-ups to this issue. GitHub issues will be closed in the coming weeks.