selfup opened this issue 3 years ago
I could not reproduce, either using your code or trying a smaller example 😞
Hmm, could you both share your graphics cards (and drivers, if relevant)?
@mockersf
I have updated my repo to the latest 0.5 changes so you can try it out without checking out a different branch:
https://github.com/selfup/ionized_earth_rpg
@cart
GPU: 1660ti
Currently on 461.72 - Going to upgrade to 466.11
Edit: Same issue with new driver.
Rust version and target info on Windows 10:
stable-x86_64-pc-windows-msvc - rustc 1.51.0 (2fd73fabe 2021-03-23)
Tried a maximized 1080p window, same issue.
Fine when the window is at its original size.
Let me know if you need any additional information!
I'm able to repro on my computer (nvidia gtx 1070, arch linux, proprietary drivers), but only if I vertically resize the window. Some vertical scales produce the artifacts (identical to the screenshot above), and others don't. This bug is identical to one that this commit resolved, which was later removed. Re-adding that offset resolves the issue in ionized_earth_rpg. Ideally shaders shouldn't need to compensate for this (as custom user shaders won't be aware of the need). We should try to fix the problem in a "global" way instead of re-adding the shader hack.
Not sure if I'm understanding the source of the problem correctly, but if it's the surrounding sprites bleeding into the current one, then I think 1px of transparent padding and TextureAtlas::from_grid_with_padding() might be the way to go here.
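For reference, a minimal sketch of that suggestion, assuming the pre-0.9 from_grid_with_padding signature (it was later folded into from_grid's padding argument); the asset path, tile size, and grid dimensions below are placeholders:

```rust
use bevy::prelude::*;

// Hypothetical startup system; the sheet would need to be re-exported with a
// 1px transparent gutter between tiles for these numbers to line up.
fn setup_atlas(
    asset_server: Res<AssetServer>,
    mut texture_atlases: ResMut<Assets<TextureAtlas>>,
) {
    let texture_handle = asset_server.load("textures/sheet_padded.png");
    let atlas = TextureAtlas::from_grid_with_padding(
        texture_handle,
        Vec2::new(16.0, 16.0), // size of each tile, excluding the padding
        8,                     // columns
        8,                     // rows
        Vec2::new(1.0, 1.0),   // 1px transparent padding around each tile
    );
    let _atlas_handle = texture_atlases.add(atlas);
}
```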
I'm running into the same issue. Win 10, last updates, Nvidia RTX 2060, drivers 511.79.
The issue happens regardless of window resizing and of sprite size.
The leaking edge is not always the same, depending on the position of the sprite. I made a video with adjacent tiles colored pink:
https://user-images.githubusercontent.com/2582791/154336332-805fcd31-8ac0-4433-a8a1-129518886c6f.mp4
Any idea how we could help to track down this issue?
Yeah, padding sprite sheets is a "generally accepted" resolution to this. I'd like to sort out a fix for unpadded sheets, but resolving these rounding errors has been pretty nasty (and sometimes platform specific?). As mentioned in https://github.com/StarArawn/bevy_ecs_tilemap/issues/123, one option is to move to a texture array representation internally, but off the top of my head I don't know what the full implications of that move would be. For example, I'm pretty sure the array would need each layer to be the same size, so each sprite would need to take up the space of the "largest" sprite in the array. TextureAtlases can be more "compressed".
Pulling in @StarArawn as this is relevant to what they're doing.
Array textures can best be described as a texture that has layers. The texture's size remains constant across the layers, which is one of the biggest disadvantages of using them. The advantages are that you can filter and mipmap them just like you would a single texture, meaning you don't need padding, you don't need weird filtering rules/setups or manual filtering in the shader (ew), and depending on your art style you may want linear filtering!
Yes! The cost of using the array texture with differing sizes is that every texture has to be the max size of your largest sprite. This leads to some funky sizing of sprites as well since you'll want the size of the sprite to match the size of the array texture not the actual sprite texture size. Since the extra pixels are transparent this usually isn't an issue. The increased memory is a bummer but fairly minimal when working with 2D!
You probably don't, but you might? If you implement any sort of zooming you'll soon see artifacts where the GPU has a hard time picking a good texture pixel for a given screen pixel. This results in all sorts of weird looking artifacts like this:
In bevy_ecs_tilemap I use a hybrid approach where I support both atlases and texture arrays. I think it should be possible to do this in your own code or even in bevy code. There's no reason why we can't have both! :)
I meant to say something about pixel-perfect cameras, which can also help with these things, especially reducing floating-point inaccuracies for 2D games.
Gentle ping about this: the 0.6 renderer rework has landed since this issue was opened. Is this still desirable/reproducible?
I have upgraded bevy to 0.6 then 0.6.1 and I still have the issue on Windows when maximizing the window. I'll have to check on my other machines. This seems to affect different GPUs/OSs differently so I was just confirming that my initial report is still accurate. Thanks for asking!
Just as an FYI, the issue was reported in a discussion a few days ago: https://github.com/bevyengine/bevy/discussions/4424 and I've also run into it separately. There was also a brief exchange on Discord.
Thank you for the cross reference!
An update:
I’m trying to make some progress on this because in my opinion it has a relatively high impact for 2D games using sprite sheets. At the same time I'm definitely not über knowledgeable, which makes things tricky at best.
After some discussion on Discord with Rob Swain and Nical, it seems that the generally agreed-upon solution is (quote):
clamping the UVs in the shader to not go past the 0.5 texel border in order to not incur sampling from texels outside the sprite bounds
which is what Webrender does.
As Rob said (quote):
the clamping should happen between
(sprite.offset + vec2<f32>(0.5)) * texel_size
and (sprite.offset + sprite.size - vec2<f32>(0.5)) * texel_size
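To make those bounds concrete, here is an illustrative CPU-side sketch in Rust; the actual fix would live in the sprite shader, and names like sprite_offset, sprite_size, and texel_size are assumptions rather than existing Bevy bindings:

```rust
use bevy::prelude::Vec2;

// Clamp a UV coordinate so sampling stays at least half a texel inside the
// sprite's rectangle within the atlas, per the formula quoted above.
fn clamp_uv(uv: Vec2, sprite_offset: Vec2, sprite_size: Vec2, texel_size: Vec2) -> Vec2 {
    let min = (sprite_offset + Vec2::splat(0.5)) * texel_size;
    let max = (sprite_offset + sprite_size - Vec2::splat(0.5)) * texel_size;
    uv.clamp(min, max)
}
```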
My understanding is that sprite offset and sprite size (for sprites from texture atlases) are not data available in the sprite shader (the shader is shared by all sprites, regardless of whether they come from an atlas or not). Is this correct?
If that's correct (it might not be), it leads to a few possibilities, and I'm not sure what's best. One of them involves textureDimensions(sprite_texture) (and for sprites that don't come from an atlas the sprite offset is zero anyway). What are your thoughts on this? Is the above accurate? Thanks a lot.
As the information is only needed for texture atlas sprites, it feels like there should be a way to make a uniform for that case and only use it in that case. Coming up with a good overall solution will need a bit more thought than I have time for at the moment. I'll try to remember to revisit this, but if I forget, poke me on Discord.
Found the same off-by-one behavior, where a tile leaks if I resize my window. It usually happens with an odd-sized window, and periodically fixes itself at multiples of 2 or 8 or so.
I descaled my textures while I was debugging this (and pulled some hair out thinking I was doing something very stupid :grimacing: ) and the behavior is the same.
Additional Info:
Bug Happening (leaking the LAST ROW of the tile ABOVE the grass tile)
Bug Happening (leaking the RIGHTMOST COLUMN of the tile to the LEFT of the grass tile) (a little difficult to see because the tile at the left is a dark one.)
Both happening at the same time:
The bug is fully resolved by having an even-sized resolution in both dimensions:
Another bit of info about my report above:
My desktop has a little bit of scaling applied to it, for making things a little bit bigger. So I don't know if it's my desktop scaling that triggers this, but in my Bevy game, if I don't set a scale_factor_override to Some(1.0), then the temporary fix I explained above (setting the window resolution to an even number) becomes basically impossible. The resolution becomes very... floating point-y :grimacing: like this screenshot is showing in the title bar: 798.5455 x 547.2
:cry:
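A rough sketch of those two workarounds together (an even window resolution plus overriding the scale factor), assuming the Bevy 0.7/0.8-era WindowDescriptor resource; newer versions configure the window through WindowPlugin instead, and the numbers here are placeholders:

```rust
use bevy::prelude::*;
use bevy::window::WindowDescriptor;

fn main() {
    App::new()
        .insert_resource(WindowDescriptor {
            // Even-sized resolution in both dimensions.
            width: 1280.0,
            height: 720.0,
            // Ignore desktop scaling so the logical resolution stays a whole number.
            scale_factor_override: Some(1.0),
            ..Default::default()
        })
        .add_plugins(DefaultPlugins)
        .run();
}
```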
Hey, just maybe worth noting that I did not have this issue on 0.5.0, but did with 0.7.0 and 0.8.0, as referenced here: https://github.com/bevyengine/bevy/issues/5510
I think it's very device specific and small tweaks to shaders can fix it in one place and break it in other places so that might be what's happening between Bevy 0.5 and the later versions on your graphics card.
I've submitted fixes for this on a couple of Bevy tilemap renderers that seem to work ( https://github.com/StarArawn/bevy_ecs_tilemap/pull/197 & https://github.com/forbjok/bevy_simple_tilemap/pull/7 ). I also might be able to do a better version without branching in the shader, but I'll have to test it out.
If I get the chance I'll try to see what fixing this in Bevy's sprite renderer might look like.
I can confirm this issue is still present in the latest release (0.11). When using from_grid(), pixels from the tiles in the rows underneath are showing up on the upper tiles. This happens when maximizing the window. I suspect it's something to do with the scaling with f32. I think the images need to actually be extracted from the tilesheet first, before applying any scaling or other modification. I'm also using nearest interpolation. Adding padding may work, but I'm concerned the interpolation will actually show transparency, and therefore the clear color, instead of ignoring it. I don't know if nearest considers transparency like that.
System: Intel W-1290T / Radeon Pro WX4100, Debian Testing / GNOME / Wayland (although it's building for X11, so XWayland), Bevy *
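For context, one common way to opt into nearest-neighbor sampling globally in Bevy 0.11 (a sketch, not necessarily how this particular project is configured):

```rust
use bevy::prelude::*;

fn main() {
    App::new()
        // Sample all images with nearest filtering instead of linear filtering.
        .add_plugins(DefaultPlugins.set(ImagePlugin::default_nearest()))
        .run();
}
```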
I experience this as well.
I mitigated the issue very slightly by adjusting padding with from_grid, but not to the point of eliminating it entirely. So I think I am going to revert to loading individual sprites rather than a sheet for now.
It's affected by zooming the projection's scale. Sometimes the lines disappear from specific areas when I zoom in/out.
WASM/Chrome testing generated from an x86_64-unknown-linux-gnu build.
I was able to sufficiently mitigate this issue for my use case without modifying the underlying assets. Here's my current understanding just in case it helps anyone going forward.
This issue is caused by a combination of projection.scale + bleeding. An individual sprite may look correct at a given projection.scale and incorrect at another due to how these interact. For me, since I render a tile grid which is fit to the window width, and offer the ability to pan/zoom on this grid, I was seeing bleed flickering as I zoomed in/out and when I resized my window.
Since the issue is caused by bleeding between sprites, until a fix for the underlying issue is implemented, the goal is to bleed gracefully. This means the bled-into pixel color should match the desired pixel color, so you don't want to surround sprites with transparency; you want to surround them with their neighboring colors.
For complex sprites, the solution would be to "extrude" the colors of each sprite, increasing the size of the sprites, and then allow the bleed to occur. The bleed effect reads the extruded color value, which matches the neighboring color value, and results in the desired visual effect.
For simple sprites, you don't necessarily need to modify your sprite assets. Instead, you can achieve a similar effect by adjusting the from_grid call.
Consider the sprite sheet below. It is one column of 16 sprites. Each sprite is 128x128. 15 of the sprites have a thick, black border along one or more of their edges. There is no padding between them.
The "correct" way to generate a texture atlas from this asset is:
```rust
texture_atlases.add(TextureAtlas::from_grid(
    texture_handle,
    Vec2::splat(128.0), // tile size
    1,                  // columns
    16,                 // rows
    None,               // padding
    None,               // offset
))
```
However, this exhibits the undesired bleed effect. For my scenario, this occurs with these settings:
- custom_size = Some(Vec2::splat(1.0))
- a projection.scale chosen to fit the grid to the window width, which results in a projection.scale of 0.0525 (the specific scale isn't important - just providing context)

Here is how this renders. Note the black border bleeding into various sprites. Also note that if I resize the window then the affected sprites change.
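For reference, a rough sketch of what those settings look like in code (Bevy 0.9-0.12 era API; the helper function and its atlas_handle parameter are illustrative, with the handle assumed to come from the from_grid call above):

```rust
use bevy::prelude::*;

fn spawn_tile_and_camera(commands: &mut Commands, atlas_handle: Handle<TextureAtlas>) {
    commands.spawn(SpriteSheetBundle {
        texture_atlas: atlas_handle,
        sprite: TextureAtlasSprite {
            index: 0,
            // Each sprite is drawn at 1x1 world units, regardless of texel size.
            custom_size: Some(Vec2::splat(1.0)),
            ..default()
        },
        ..default()
    });

    let mut camera = Camera2dBundle::default();
    // Scale chosen so the grid fits the window width; the exact value isn't important.
    camera.projection.scale = 0.0525;
    commands.spawn(camera);
}
```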
Now, consider an adjusted call to from_grid of the form:
```rust
texture_atlases.add(TextureAtlas::from_grid(
    texture_handle,
    Vec2::new(128.0, 120.0),   // tile size, reduced by 8 on the y-axis
    1,                         // columns
    16,                        // rows
    Some(Vec2::new(0.0, 8.0)), // padding: the full amount removed from the tile size
    Some(Vec2::new(0.0, 4.0)), // offset: half that amount
))
```
Here I keep my x-axis as-is because I am working with a single column sprite sheet. I reduce the size of my sprite along the y-axis, then add an equal amount to y-padding (not half!), and add half the amount to y-offset.
Here is the result:
This eliminates the bleed artifacts.
Note that using a sufficiently large reduction in tile_size is necessary to achieve the desired result. Originally, I thought this approach was not a valid solution because I reduced tile_size by just 2 and still saw bleeding on a full-size window. Then, I reduced by 4, setting my values to 124.0 // 4.0 // 2.0, which worked for a full-size window, but did not hold for a resized window. This is because my resized window had different subpixel rounding due to a varied projection.scale.
The final solution was to work my way down through reduced tile_size values until I found an aggressive enough setting to eliminate artifacts at all resolutions I intend to support.
UPDATE:
As my awareness of this issue increases, so does the length of this post :)
I ended up moving away from a homegrown tilemap solution for performance reasons. My 144x144 tilemap crashed on my test iOS device, where it did not crash before the introduction of sprite sheets. The lowest-hanging-fruit performance fix was to adopt "meshing", in which multiple sprites are combined into larger mesh chunks. This functionality is already implemented in bevy_ecs_tilemap (https://github.com/StarArawn/bevy_ecs_tilemap/tree/main/src/render) and bevy_simple_tilemap. I tried using the simple version first, but tiles aren't entities in its implementation, which made updates too challenging. So, I adopted bevy_ecs_tilemap.
bevy_ecs_tilemap, when running in WASM targeting WebGL2, makes use of atlases internally, but its external interface expects texture: Handle<Image>, i.e. the raw sprite sheet, not a TextureAtlas. bevy_ecs_tilemap exposes TilemapSpacing, but making use of it did not prevent bleeding.
After talking with the maintainer of bevy_ecs_tilemap, I was instructed to disable Msaa:

```rust
app.insert_resource(Msaa::Off);
```
This fixed my bleed issue without any padding hacks.
So, I would encourage others to try this approach first and only explore more complex, padded solutions afterward.
Bevy version
bevy = "0.5"
Operating system & version
Windows 10
Can test on Ubuntu 20.04 as well as MacOS Catalina
What you did
Upgrade to Bevy 0.5 from 0.4
What you expected to happen
Expect sprite sheet animation to work the same as before
What actually happened
Once upgraded, when I fullscreen the window, the sprite animation has artifacts from another position in the sprite sheet.
Additional information
Here in my player_setup.rs (https://github.com/selfup/ionized_earth_rpg/blob/main/src/systems/player_setup.rs#L13), this now has to be changed to prevent the other sprite from leaking in.
Here is a small screenshot of the feet leaking in above the player. Looks like two black bars:
Short video showing the artifacts leaking through when moving around.
https://user-images.githubusercontent.com/9837366/115106199-a1f6ad80-9f20-11eb-9e02-ddbbcd6860b7.mp4