Open randomdude999 opened 10 months ago
Hi @randomdude999. Thanks for the report. This issue would need GF Eng team revision to provide a more informed answer.
For the time being, I've tried "🐦‍⬛" to produce the "black bird" emoji on a Mac; these are the findings.
Chrome - WORKS OK
Version 120.0.6099.109 (Official Build) (x86_64)
Safari and Firefox - NOT WORKING
Version 15.6.1 (17613.3.9.1.16) and 121.0.1 (64-bit), respectively
Currently, due to implementation limitations, Firefox matches fonts codepoint by codepoint, not by grapheme cluster. (Safari appears to have the same behavior, though I don't have a Mac to test it.) This causes problems with Google Fonts' subsetter, which sometimes puts different grapheme clusters that start with the same codepoint into different font files. I noticed this specifically with Noto Color Emoji, in https://fonts.googleapis.com/css2?family=Noto+Color+Emoji (the subsetter doesn't seem to be public, so I can't tell whether this can happen with other fonts too).

For example, the Bird emoji (U+1F426) is in font file [10], while Black Bird (U+1F426 U+200D U+2B1B) is in file [7]. When matching the Black Bird sequence, Firefox looks through the font files in reverse order (as specified by the CSS Fonts spec) and finds the bird in file [10]. It therefore never considers the grapheme cluster in file [7] and fails to render the black bird properly, falling back to a regular bird followed by a black square.
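To make the failure mode concrete, here is a small Python sketch (the emoji and codepoints are from the report above) showing that both emoji begin with the same codepoint, which is exactly why a per-codepoint matcher stops at the wrong subset:

```python
# Decompose the two emoji to show why per-codepoint font matching
# can pick the wrong subset for a ZWJ sequence.
bird = "\U0001F426"                     # 🐦 BIRD
black_bird = "\U0001F426\u200D\u2B1B"   # 🐦‍⬛ BIRD + ZWJ + BLACK LARGE SQUARE

print([hex(ord(c)) for c in black_bird])
# → ['0x1f426', '0x200d', '0x2b1b']

# Both strings start with U+1F426. A matcher that resolves fonts
# codepoint by codepoint finds U+1F426 in whichever subset covers it
# first and never consults the subset holding the full ZWJ sequence.
assert black_bird[0] == bird
```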
My proposed fix is to make the subsetter always put all grapheme clusters that start with the same codepoint into the same subset. Then font matching by the first codepoint, which Firefox and apparently also Safari do, will always find the font file containing the correct glyph.
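The grouping rule proposed above can be sketched as a simple bucketing pass. This is a hypothetical illustration, not Google Fonts' actual subsetter (which isn't public); `group_by_first_codepoint` and the cluster list are made up for the example:

```python
from collections import defaultdict

def group_by_first_codepoint(clusters):
    """Hypothetical subsetter pass: bucket every grapheme cluster
    by its first codepoint, so clusters sharing a first codepoint
    always land in the same subset."""
    buckets = defaultdict(list)
    for cluster in clusters:
        buckets[cluster[0]].append(cluster)
    return buckets

# 🐦 and 🐦‍⬛ share the first codepoint U+1F426, so under this rule
# they must be served from the same font file.
clusters = ["\U0001F426", "\U0001F426\u200D\u2B1B"]
buckets = group_by_first_codepoint(clusters)
assert len(buckets["\U0001F426"]) == 2  # both in one subset
```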
See also the Firefox bug I filed about this; since that bug is blocked on a 14-year-old open issue, and the problem also appears in Safari, it seems easier to fix on the Google Fonts side.