MartinSStewart opened 1 month ago
What would the implementation look like? Char -> number of bytes it takes to represent it in UTF-8/UTF-16?
Actually, I'm not sure. I started trying to implement this inline in the compiler and couldn't get it to work. Node uses UTF-16 internally, but when I implemented UTF-16 widths it didn't work: the compiler expects getCharWidth 'A' to return 1, for example, whereas 'A' takes 2 bytes in UTF-16. But it doesn't seem to be the UTF-8 byte widths either? I'm less sure of that, since I could have screwed up the UTF-8 byte-width implementation.
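For comparison, here's a minimal sketch of a UTF-8 byte width in Elm, assuming Char.toCode returns the full Unicode code point and that the compiler tracks source offsets in UTF-8 bytes. To be clear, this is just my guess at what getCharWidth is supposed to compute, not something confirmed anywhere; the thresholds are the standard UTF-8 encoding boundaries.

```elm
-- Hypothetical sketch: number of bytes a Char occupies in UTF-8.
-- Uses Char.toCode from elm/core (a default import).
getCharWidth : Char -> Int
getCharWidth char =
    let
        code =
            Char.toCode char
    in
    if code < 0x80 then
        -- 1 byte: ASCII, e.g. 'A'
        1

    else if code < 0x800 then
        -- 2 bytes, e.g. 'é'
        2

    else if code < 0x10000 then
        -- 3 bytes, e.g. '€'
        3

    else
        -- 4 bytes: astral plane, e.g. emoji
        4
```

Under this scheme getCharWidth 'A' == 1, which matches what the compiler expects, so if a UTF-8 version still disagrees, the mismatch is presumably in how the multi-byte cases are counted.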
Might be useful for https://github.com/guida-lang/compiler/issues/14