alexheretic / glyph-brush

Fast GPU cached text rendering
Apache License 2.0

Feat/emoji_support #160

Open YaelLee opened 1 year ago

YaelLee commented 1 year ago

Support the rendering of emoji.

  1. Maintain two mapping textures at the same time: one for normal text and one for emoji. Add an emoji_cache in GlyphBrush and upload the textures separately. The data type in outline_cache is u8, and in emoji_cache it is u32.
  2. In order to recognize emojis in a string, the font_id in Text is changed to a vector type, thus supporting both normal fonts and emoji fonts (sketched below).
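For example, with this branch's changed API a mixed section could look roughly like this (illustrative sketch; here FontId(0) is assumed to be a normal text font and FontId(1) an emoji font):

let section = Section::default()
    .add_text(
        Text::new("Hello 😃 world")
            // font chain: try the normal font first, fall back to the emoji font
            .with_font_id(vec![FontId(0), FontId(1)])
            .with_scale(40.0),
    );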

Todo:

  1. I don't have a good idea of how emoji should align with the baseline, so for now the lower edge of the emoji's bounds is aligned with the baseline (see the sketch after this list).
  2. Recognition of emoji codepoint sequences, such as ZWJ sequences, is not yet supported.
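The current baseline handling is roughly this (illustrative variable names, not the exact code):

// caret.y is the baseline position of the current line
let quad_max_y = caret.y;                   // emoji bottom edge sits on the baseline
let quad_min_y = caret.y - emoji_px_height; // top edge is one emoji height above it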
[Screenshot 2023-02-16 21:07:45]
alexheretic commented 1 year ago

I haven't had a super deep look at this yet. But my initial impressions:

I'm not sure the draw-cache should be generalised to handle rgba in this way. Drawing images using Font::glyph_raster_image is quite different to drawing outlines where subpixel position is important. A glyph_raster_image cache could be a much simpler thing with exactly one pre-drawn image per emoji and also has a greater case to be static.

I would probably approach it as a separate concept.
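As a rough illustration of what I mean, something like this (untested sketch with made-up names; only ab_glyph's Font::glyph_raster_image is real API here):

use std::collections::HashMap;
use glyph_brush::ab_glyph::{Font, GlyphId};

/// Hypothetical atlas region for one pre-drawn emoji image.
struct EmojiEntry {
    tex_coords: [f32; 4], // min u, min v, max u, max v in the rgba texture
}

#[derive(Default)]
struct EmojiImageCache {
    // one entry per (glyph, baked pixel size); no sub-pixel variants needed
    entries: HashMap<(GlyphId, u16), EmojiEntry>,
}

impl EmojiImageCache {
    fn ensure_cached(&mut self, font: &impl Font, id: GlyphId, px: u16) {
        if self.entries.contains_key(&(id, px)) {
            return;
        }
        if let Some(img) = font.glyph_raster_image(id, px) {
            // ...upload img.data into the rgba atlas and record where it landed...
            let _ = img;
            self.entries.insert((id, px), EmojiEntry { tex_coords: [0.0; 4] });
        }
    }
}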

I'm also not sure moving to Vec<FontId> is the way to go for supporting font-chains, or even if they should be supported at all here.

Perhaps there is a more lightweight, modular & less breaking approach? It would be great if users were not affected by emoji functionality if they don't plan on actually using emojis. Perhaps the initial/minimum support would be allowing the layout code to leave appropriate space for emojis somehow so an external render logic could handle them (they could also still have vertices appended to the regular glyph vertices for a single draw in theory).

That could lead to an EmojiGlyphBrush that handles both textures, but maybe that should be in another crate.

To be clear I haven't thought deeply about such tradeoffs myself, so I don't have good answers. But I am interested in having a clearer picture of the different ways this can be supported and the tradeoffs they involve.

YaelLee commented 1 year ago

> A glyph_raster_image cache could be a much simpler thing with exactly one pre-drawn image per emoji and also has a greater case to be static.

I think if each texture caches only one emoji, it is not possible to draw all the emojis in one draw call.

> That could lead to an EmojiGlyphBrush that handles both textures, but maybe that should be in another crate.

Maybe the "emoji brush" shouldn't be in another crate, because we need to lay out and shape the text and emoji at the same time, but draw them separately. So maintaining two caches at the same time may be a better way. If users don't plan on actually using emoji, they simply don't configure any emoji fonts, and no emojis will be rendered.

alexheretic commented 1 year ago

> > A glyph_raster_image cache could be a much simpler thing with exactly one pre-drawn image per emoji and also has a greater case to be static.

> I think if each texture caches only one emoji, it is not possible to draw all the emojis in one draw call.

I was talking about one image per emoji inside the cache (which holds multiple emojis), whereas with outlined glyphs every different sub-pixel position may require a separate draw to ensure high quality. So the draw cache may have multiple versions of the same glyph at the same scale. The current draw-cache is all about handling that.

Emojis using Font::glyph_raster_image are pre-baked images that you'll want to bundle together into a texture. You wouldn't want multiple versions for different sub-pixel positions, nor for different glyph scales that result in the same GlyphImage.scale, since you would want to avoid re-sizing them in the texture and just use the original pre-baked image sized by the shader.
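Put another way, the two caches want quite different keys (illustrative types only, not crate API):

use glyph_brush::ab_glyph::GlyphId;

// outline draw-cache: the same glyph may be cached several times
struct OutlineKey {
    glyph: GlyphId,
    px_scale: (u32, u32),      // quantised PxScale x/y
    subpixel_offset: (u8, u8), // quantised fractional position
}

// pre-baked emoji atlas: one image per glyph + baked pixel size is enough
struct EmojiKey {
    glyph: GlyphId,
    pixel_size: u16, // the GlyphImage's own baked size
}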

Pre-baked images also don't have the same concerns about drawing being an expensive operation, whereas rasterizing outlines is expensive. There's probably no need for dedicated multithreading code for a pre-baked image atlas.

So having an optimal pre-baked rgba image cache seems like a different enough problem that I'd try having it as a separate structure from the outline draw cache.

> Maybe the "emoji brush" shouldn't be in another crate, because we need to lay out and shape the text and emoji at the same time, but draw them separately.

Yep, we do need to support it in the layout code. However, the only change to layout you have made is the Vec<FontId> stuff. I don't think this is necessary; multiple fonts inside a single layout are already supported. Couldn't emoji usage be the same as multi-font usage?

I also think it would be possible to render both emojis and outlines in a single draw, by having both textures in the shader pipeline and including the necessary info in the instanced vertex. 2-draw would work too, but if glyph_brush bundled it up to support single draw, it would be up to the render code to decide rather than dictated here. That would be an advantage of integrating the rgba texture cache in the crate, maybe.
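For example the instanced vertex could carry a flag saying which atlas to sample (illustrative layout, not the crate's actual vertex format):

#[repr(C)]
#[derive(Clone, Copy)]
struct QuadInstance {
    pixel_coords: [f32; 4], // screen-space min x/y, max x/y
    tex_coords: [f32; 4],   // atlas min u/v, max u/v
    color: [f32; 4],
    atlas: u32, // 0 = alpha outline texture, 1 = rgba emoji texture
}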

Maybe before integrating it would make sense to figure out what's blocking implementing emojis alongside outlines as a user of glyph_brush today. Or, if it is already possible, what the pain points are.

Current state

I think layout already works with something like

Section::default()
    .add_text(Text::new("Hello "))
    .add_text(Text::new("😃").with_font_id(FontId(1))) // where font-1 is an emoji font
    .add_text(Text::new(" world"))

IIUC the emoji won't result in a to_vertex call as it has no outline. After a draw, the caller could call GlyphBrush::glyphs on the section and, for each emoji, ensure Font::glyph_raster_image exists in an external emoji atlas at that scale and generate vertices for the emoji glyphs. (Actually it should still be possible to generate unified vertex data in a 1-draw pipeline this way.)
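In render code that could look roughly like this (untested sketch; the atlas handling and the "no outline means emoji" heuristic are hypothetical user-side code):

use glyph_brush::ab_glyph::{Font, FontArc};
use glyph_brush::{GlyphCruncher, Section};

// Walk the laid-out glyphs and handle the ones that have no outline but do
// have a raster image (i.e. emojis) in a user-managed rgba atlas.
fn collect_emoji_quads<C: GlyphCruncher<FontArc>>(brush: &mut C, section: Section) {
    let fonts: Vec<FontArc> = brush.fonts().to_vec();
    for sg in brush.glyphs(section) {
        let font = &fonts[sg.font_id.0];
        if font.outline(sg.glyph.id).is_none() {
            if let Some(img) = font.glyph_raster_image(sg.glyph.id, sg.glyph.scale.y as u16) {
                // ...ensure `img` is uploaded to the emoji atlas, then push a quad
                // at `sg.glyph.position` (into the regular vertices for 1-draw,
                // or into a separate list for 2-draw)...
                let _ = img;
            }
        }
    }
}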

I haven't tried it myself, but in this way it seems possible to render emojis correctly without any changes to glyph_brush.

Assuming that works, it's just not ideal having to iterate over the glyphs again to figure out the emoji stuff. So perhaps it would be better if the to_vertex fn was called for all glyphs to allow generating such vertices.

In the back of my mind I'm also thinking this has parallels with supporting bitmap fonts, since they're also generally pre-baked rgba (#17).

YaelLee commented 1 year ago

If we leave appropriate space for emojis so that external render logic can handle them, we still need to lay out and shape the text. But different emoji fonts produce different h_advance values; I'm not sure whether that's expected, or maybe I'm using it the wrong way.

Section::default()
    .add_text(Text::new("😃🌅")
                  .with_font_id(FontId(1))
                  .with_scale(80.0 * window.scale_factor() as f32))

In this case, the emojis should be 80x80 px in size. But when calculating the layout,

let advance_width = scale_font.h_advance(glyph.id);

the h_advance for Apple Color Emoji is around 60px, and for NotoColorEmoji it is 85px; obviously 85px is more correct. Maybe we should do some special layout for Apple Color Emoji? I'm confused.
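For reference, the measurement above is roughly this (simplified sketch):

use glyph_brush::ab_glyph::{Font, FontArc, ScaleFont};

// advance width of one emoji character at a given pixel height
fn emoji_advance(font: &FontArc, ch: char, px_height: f32) -> f32 {
    let id = font.glyph_id(ch);
    font.as_scaled(px_height).h_advance(id)
}
// gives ~60px for Apple Color Emoji vs ~85px for NotoColorEmoji at the 80px scale above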

Also, I find that glyph_brush seems to render different sizes for different fonts, especially with regard to font size. With the font set to 80px, glyph_brush renders the same size as Chrome for the "Times" font, like this:

[Screenshot 2023-03-02 15:42:37]

but noticeably smaller than Chrome for the "PingFang" font, like this:

[Screenshot 2023-03-02 15:38:47]

Perhaps this is the same reason the emoji's h_advance is smaller? I'm confused and looking forward to your reply! Thanks!

alexheretic commented 1 year ago

This crate uses a different scale value, PxScale, which is the height in pixels. A more standard scale is point size (pt), which Chrome and most other things use.

Currently, if you want to size in pt, you need to convert that into the equivalent PxScale; you can use Font::pt_to_px_scale.
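e.g. something like this (untested sketch; the hidpi multiplication mirrors the scale_factor usage in your example):

use glyph_brush::ab_glyph::{Font, FontArc, PxScale};

// convert a point size into the PxScale this crate expects, including hidpi scaling
fn px_scale_for_pt(font: &FontArc, pt: f32, window_scale_factor: f32) -> PxScale {
    font.pt_to_px_scale(pt * window_scale_factor)
        .expect("font has invalid units_per_em")
}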

I'm tempted to move to pt scale by default but this is a breaking change for the crate.

YaelLee commented 1 year ago

It seems that I set the font to 80px in Chrome, not 80pt, and also set a PxScale of 80 in glyph_brush, multiplied by the window's scale_factor.

[Screenshot 2023-03-02 18:47:09]
let scale = (80.0 * window.scale_factor() as f32).round();
glyph_brush.queue(
    Section::default()
        .add_text(
            Text::new("glyph_brush")
                .with_font_id(vec![FontId(0)])
                .with_scale(scale)
        )
        .with_screen_position(
            (30.0, 30.0)
        )
);
alexheretic commented 1 year ago

pt scales are also measured in pixels; I believe that is what Chrome uses.

YaelLee commented 1 year ago

glyph_brush(80pt) vs chrome(80pt)

[Screenshot 2023-03-02 20:13:08]

glyph_brush(80px) vs chrome(80px)

[Screenshot 2023-03-02 18:47:09]

It seems that when I use 80 pixels, glyph_brush renders smaller than 80 pixels in size.

alexheretic commented 1 year ago

It just doesn't mean the same thing. PxScale is a measure of font height used to scale glyphs in this crate. It isn't compatible with any external definition of px sizing.

YaelLee commented 1 year ago

Thanks for your reply!

> I haven't tried it myself, but in this way it seems possible to render emojis correctly without any changes to glyph_brush.

Emoji involve many codepoints, such as variation selector-16, skin tone modifiers, the zero-width joiner... and the grapheme cluster, which is a sequence of codepoints treated as a single human-perceived glyph. We could indeed only return the vertices of the emojis and render them through an external crate, but perhaps the codepoints should be detected in glyph_brush first, otherwise the vertex locations will be wrong.
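For example, a rough sketch of that detection using the unicode-segmentation crate (the emoji check here is a naive stand-in; a real one would follow UAX #29 and UAX #51):

use unicode_segmentation::UnicodeSegmentation;

// split into grapheme clusters and flag the ones that look emoji-like
fn split_emoji_clusters(text: &str) -> Vec<(&str, bool)> {
    text.graphemes(true)
        .map(|cluster| {
            let emoji_like = cluster.chars().any(|c| {
                c == '\u{200D}'                                 // zero-width joiner
                    || c == '\u{FE0F}'                          // variation selector-16
                    || ('\u{1F300}'..='\u{1FAFF}').contains(&c) // common emoji blocks
            });
            (cluster, emoji_like)
        })
        .collect()
}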