Inochi2D / inochi2d-rs

Rust bindings for Inochi2D
BSD 2-Clause "Simplified" License

Implement Send on structs and classes that are Send-safe #5

Open togetherwithasteria opened 1 year ago

togetherwithasteria commented 1 year ago

For my little game project built with Bevy, it would be nice to have the Send trait implemented on the structs and classes that can safely support it, since Send is a requirement for Bundles and Components in Bevy. The safest candidates, I think, are those that don't depend on OpenGL APIs.

Otherwise, the only way to store them is through NonSend resources, which is quite a maintenance burden, especially long-term.

lain-dono commented 1 year ago

I could try porting a reference implementation to rust/wgpu/bevy. It doesn't look like much code there. Unless you count the editor.

There is another implementation at https://linkmauve.fr/dev/inochi2d/. But I can do much better than that.

togetherwithasteria commented 1 year ago

> I could try porting a reference implementation to rust/wgpu/bevy. It doesn't look like much code there. Unless you count the editor.
>
> There is another implementation at https://linkmauve.fr/dev/inochi2d/. But I can do much better than that.

Yes, a whole new implementation would simplify my tech stack much more than having to FFI into D. It's also upstream's general vision for the future of Inochi2D.

No dealing with GCs, and fewer build dependencies to download.

However, keeping things on par with upstream would be difficult in the long term! At least while most contributors are unpaid volunteers.

This might be possible if we had a corporate sponsor, like major open source projects do. At the moment, however, no game studios or vtubing agencies have expressed interest in Inochi2D and its surrounding ecosystem.

togetherwithasteria commented 1 year ago

But please, do work on making a better abstraction for Inochi!

I have a prototype repository here: https://github.com/project-flara/inochi-bevy, but it doesn't work yet: it runs all systems outside of the render stages, and it doesn't expose an ECS API.

I am currently on vacation, so you may look into the code and work from that. ^^

lain-dono commented 1 year ago

My goal is a runtime for general-purpose animations, at least for 2D at the moment. I'm pretty sure I can make it better than the current implementation in Inochi2D.

The reference implementation has some problems. For example, there is no separation between serialization, the runtime itself, and the graphical backend. I also suspect the performance is not very good (not critical for Inochi2D itself, but potentially a problem for a more general library). I actually have some thoughts on improving the serialization format as well (but fixing that is not part of my current tasks).

In any case, I'll see what I can do with it all eventually. Maybe I can make an implementation that will eventually replace the current one entirely.

> But please, do work on making a better abstraction for Inochi!

It would be easier for me to write my own implementation from scratch. The original implementation is not well suited for embedding, especially from languages like Rust or C (or languages that use them, like Python). It wasn't designed with that in mind from the beginning.

> However, keeping things on par with upstream would be difficult in the long term! At least while most contributors are unpaid volunteers.

For me, funding is not a significant issue at the moment, although I do have some plans for next year. They concern bevy and some other things, including 2D animation (as a small task).

togetherwithasteria commented 1 year ago

> The reference implementation has some problems. For example, there is no separation between serialization, the runtime itself, and the graphical backend. I also suspect the performance is not very good (not critical for Inochi2D itself, but potentially a problem for a more general library). I actually have some thoughts on improving the serialization format as well (but fixing that is not part of my current tasks).

Ah, I see. Well, we can work on a new implementation, but perhaps we could embed the reference implementation in the meantime, while the new one stabilizes?

togetherwithasteria commented 1 year ago

By the way, @lain-dono: @Speykious has also been developing another implementation in Rust: https://github.com/Speykious/inox2d

But ideally we would like to have it in wgpu?

lain-dono commented 1 year ago

I don't think it's necessary. I guess I'll have a working implementation next week.

lain-dono commented 1 year ago

wgpu is needed for seamless integration with bevy.

Speykious commented 1 year ago

> By the way, @lain-dono: @Speykious has also been developing another implementation in Rust: https://github.com/Speykious/inox2d
>
> But ideally we would like to have it in wgpu?

The reason I'm making this implementation is to eventually write a WGPU renderer for it, since D's garbage collector is incompatible with WASM. Currently there's only an OpenGL ES 2.0 renderer, which is almost identical to Link Mauve's implementation over at https://crates.io/crates/inochi2d, but my goal for now is to first implement everything that isn't rendering (deform, physics, parameters).

Speykious commented 1 year ago

> I don't think it's necessary. I guess I'll have a working implementation next week.

If you get a working WGPU implementation, do let me know! I'm currently redesigning my node system to not use Serde, because I realized it wasn't worth using while also trying to keep the node system extensible. Simple manual JSON serialization via the json crate is going to simplify a lot of things and remove like 4 dependencies from my list.

lain-dono commented 1 year ago

What I'm doing might be a little more complicated. I already have a clear separation between reading asset data, the runtime, and the rendering backend (plus intermediate layers).

The internal structure of the runtime may be very different from how it works in INP/INX (though it seems to amount to the same thing). For example, I plan to store nodes in a Vec. This way I get rid of indirect access and recursion, which should lead to slightly better performance. The cost is losing the ability to freely destroy nodes (or at least that becomes very expensive). Perhaps a node will be called a bone here; in practice it's pretty much the same thing.

I plan to keep some compatibility with the way puppets are rendered now, but there are some changes planned here as well. The most important is the move to a linear colour space: mostly Rgba8UnormSrgb instead of Rgba8Unorm, or something like that, though you also have to make sure you don't get the blending modes wrong. Plus, a lot of data can be uploaded in batches. For example, there is a trick using instanced draw calls that lets you transfer data via vertex buffers instead of uniforms. And some other finer details.

In addition, I would like to do full integration with bevy. That means it would practically have its own renderer and backend (though a lot could probably be reused). There are some interesting problems there: for example, using the stencil buffer requires a couple of fun tricks, and the rendering part of bevy is practically undocumented (unless you count the API).

Why so complicated? My goals include a user-friendly FFI, a pure wgpu backend, seamless integration with bevy, the ability to replace the reference implementation, and support for other formats like Spine2D/DragonBones/Live2D (and possibly my own format). Perhaps something else in the future. That said, I would like to retain some flexibility, so it will be possible to use only the parts that are needed; in particular, the library will be usable without wgpu/bevy.

I hope to get wgpu rendering working in the next couple of days and to finish deform/physics/parameters next week. Ideally the same goes for bevy integration (also next week).

Speykious commented 1 year ago

> For example, I plan to store nodes in a Vec.

FYI, I already store my nodes in an indextree. You can probably use that instead of a plain Vec, so you won't need to implement everything about it yourself.

togetherwithasteria commented 1 year ago

Wait, so each of us is going with our own implementation? Won't that create fragmentation?

Anyways, I think we should close this issue and create a new one, since it's derailed so much.

LunaTheFoxgirl commented 1 year ago

@lain-dono Inochi2D already is in linear color space: textures are converted away from sRGB on load, and the framebuffer is converted to sRGB on the fly. Using linear color and premultiplied alpha is a MUST for the spec; otherwise you shouldn't call your implementation Inochi2D-compatible.

Also, please don't use git issues as an advertisement platform. Do note that the performance of Inochi2D as it is right now is lower because it's a reference implementation made to be easy to read; the more generic and optimised you go, the more you have to sacrifice readability (on top of Rust having terrible readability to begin with).

We will be making a high-performance version of Inochi2D down the line that utilises things like SIMD instructions to speed up calculations, as well as reducing unnecessary state changes.

Inochi2D is OOP for maintainability reasons. Rust, in my opinion, does not allow maintainable game code to be written in the long term, as it doesn't support the patterns that work best for games. As such, I feel that making an extensible 2D animation subsystem like Inochi2D, one that can be customised at runtime while also remaining maintainable, will be a tall order.

Either way, with that said, please get back on the topic of the issue.