pythonarcade / arcade

Easy to use Python library for creating 2D arcade games.
http://arcade.academy

Isometric support #1729

Open DragonMoffon opened 1 year ago

DragonMoffon commented 1 year ago

Isometric support has repeatedly come up on the Discord, so I'm putting this here as a reminder and checklist.

This requires:

Some cool extra things would be:

pushfoo commented 1 year ago

tl;dr I think we should focus on adding a good sprite ordering abstraction instead of polishing isometric support

We currently don't have an easy way to efficiently update sprite draw order, and it's important for more than just isometric games. For example, a more top-down game like Undertale also needs to draw sprites in order based on their bottom edge's Y value.
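The bottom-edge ordering rule described above is a painter's-algorithm sort. A minimal pure-Python sketch of the idea (plain dicts stand in for real Sprite objects; `y_sorted` is a hypothetical helper name, not Arcade API):

```python
# Painter's-algorithm sketch of the draw-order rule above: in Arcade's
# Y-up coordinates, a sprite whose bottom edge has a HIGHER Y value is
# further "up" the screen, so it should be drawn earlier (behind), and
# sprites lower on the screen should be drawn last (in front).

def y_sorted(sprites):
    """Return sprites in back-to-front draw order by bottom-edge Y."""
    # Higher bottom Y = further back = drawn earlier, hence reverse sort.
    return sorted(sprites, key=lambda s: s["bottom"], reverse=True)

if __name__ == "__main__":
    sprites = [
        {"name": "player", "bottom": 120.0},
        {"name": "tree", "bottom": 300.0},
        {"name": "rock", "bottom": 40.0},
    ]
    order = [s["name"] for s in y_sorted(sprites)]
    print(order)  # ['tree', 'player', 'rock']
```

The pain point is that this sort has to be redone every frame any sprite moves vertically, which is exactly the cost being discussed below.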

The most accessible way currently is calling SpriteList.sort, but it's very slow. Would the best way to do this be using a different shader with a SpriteList? My rudimentary understanding of GLSL suggests the following:

  1. Shaders have access to the depth value somehow
  2. This may not work when alpha is involved
Cleptomania commented 1 year ago

We have depth testing in Arcade 3.0 now, I thought? There is a depth parameter on Sprite that, to my knowledge, should be working.

I could be wrong about that and the shaders/SpriteList might not be updated yet, @einarf would know better as I believe he added the parameter. If this isn't working yet, I think that was the intended path forward.

It may be the case that this was left undone, with the plan to make it optionally enabled. Ordering based on the z-index can have unnecessary performance costs. While it is obviously less problematic than sorting the SpriteList, using a depth buffer to sort the sprites is probably still an unnecessary performance impact; shader-wise, I think this would be about the heaviest operation the sprite shader performs if it were doing so.
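For context on what the depth buffer buys you: it resolves occlusion per pixel, with no sorting of the sprite list at all. A minimal software z-buffer sketch (names and structure are illustrative only, not Arcade internals):

```python
# Minimal software z-buffer: fragments arrive in ARBITRARY submission
# order, and the per-pixel depth test keeps only the nearest one.
# This is the mechanism a hardware depth buffer provides for free.

def rasterize(fragments, width):
    """fragments: (x, depth, color) triples; smaller depth = closer."""
    depth_buffer = [float("inf")] * width
    color_buffer = [None] * width
    for x, depth, color in fragments:
        # The depth test: keep this fragment only if it is nearer than
        # whatever was already written at this pixel.
        if depth < depth_buffer[x]:
            depth_buffer[x] = depth
            color_buffer[x] = color
    return color_buffer

if __name__ == "__main__":
    # Submitted far-then-near at pixel 0; the near fragment still wins.
    out = rasterize([(0, 5.0, "far"), (0, 1.0, "near"), (1, 2.0, "mid")], width=2)
    print(out)  # ['near', 'mid']
```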

einarf commented 1 year ago

I ensured the window is created with a depth buffer. If you call ctx.enable(ctx.DEPTH_TEST), it should work. There is also an example in experimental.


pushfoo commented 1 year ago

There is also an example in experimental

On development, run it with:

python -m arcade.experimental.sprite_depth

I've also made a follow-up PR linked above, and fixed some bugs in a related example (#1732).

While it is obviously less problematic than sorting the SpriteList, it's probably an unnecessary performance impact to use a depth buffer to sort the sprites

I agree that it's worth looking into optimized alternatives for specific common behaviors. However, we already have the feature partially implemented. It's worth considering ways to abstract it in a user-friendly way which limits performance impacts.

To my understanding, one way could be variants of SpriteList which wrap the original draw with gl.Context.enabled internally. The following names are ungainly examples, but they get the point across:

They also seem like acceptable short-term solutions compared to the SpriteList.sort solution as long as we do the following in their docstrings:

If there are better backing implementations, we can plan on replacing the insides later after the prototype is working with a clean API.
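The wrapping idea above can be sketched with stand-in classes. `DepthSortedSpriteList` is one of the hypothetical ungainly names from this comment, and `FakeContext` / `SpriteListStub` are stubs for arcade's GL context and SpriteList so the sketch is self-contained; the real variant would wrap the parent draw in something like `ctx.enabled(ctx.DEPTH_TEST)`:

```python
# Sketch of a SpriteList variant that enables depth testing only for
# the duration of its own draw call. All names here are illustrative.
from contextlib import contextmanager

class FakeContext:
    DEPTH_TEST = "DEPTH_TEST"
    def __init__(self):
        self.log = []
    @contextmanager
    def enabled(self, flag):
        # Stand-in for arcade's context manager that enables a GL
        # capability on entry and restores the old state on exit.
        self.log.append(("enable", flag))
        try:
            yield
        finally:
            self.log.append(("disable", flag))

class SpriteListStub:
    def __init__(self, ctx):
        self.ctx = ctx
    def draw(self):
        self.ctx.log.append("draw")

class DepthSortedSpriteList(SpriteListStub):
    """Hypothetical variant: depth testing active only while drawing."""
    def draw(self):
        with self.ctx.enabled(self.ctx.DEPTH_TEST):
            super().draw()
```

Because the state change is scoped to the draw call, mixing these variants with plain SpriteLists in one frame would not leak depth testing into other draws.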

Cleptomania commented 1 year ago

I think the only real concern I have about performance related to depth testing is: are there GPUs that don't have a hardware depth buffer? I don't think OpenGL actually makes any guarantee about whether a GPU has to implement something at the hardware level versus emulating it in software in the driver.

If a GPU has a hardware depth buffer (and most do; I'm worried about the edge cases here), then enabling depth testing by default has a negligible to literally nonexistent drawback. I'm not even sure how to answer this question unless @einarf has a better idea of what hardware support for this is like.

My GUESS would be that we are fine to enable it by default; even the Raspberry Pi 4, to my knowledge, has a hardware depth buffer (though there are some cases where you can't have a depth buffer based on the number or size of render targets; I can't remember the details, but I don't think it affects us really).

DragonMoffon commented 1 year ago

The other issue with the depth buffer is that it currently does not support semi-transparent sprites.
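A small numeric illustration of why this is a fundamental limit rather than a missing feature: the standard "over" alpha blend depends on the order fragments arrive, while a depth test only keeps the single nearest winner per pixel, so correct semi-transparency still needs back-to-front ordering. (`over` is a hypothetical helper for one color channel.)

```python
# Standard "over" compositing for one color channel:
# result = src * alpha + dst * (1 - alpha). Unlike the depth test,
# the outcome changes with draw order, so depth-only ordering cannot
# reproduce correct blending of overlapping translucent sprites.

def over(src, src_alpha, dst):
    """Blend src over dst for a single color channel."""
    return src * src_alpha + dst * (1.0 - src_alpha)

background = 0.0
red, green = 1.0, 0.5

# Red (50% alpha) first, then green (50% alpha)...
a = over(green, 0.5, over(red, 0.5, background))
# ...versus the opposite order.
b = over(red, 0.5, over(green, 0.5, background))

print(a, b)  # 0.5 vs 0.625: blending is not commutative
```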