Simple, fast, safe, compiled language for developing maintainable software. Compiles itself in <1s with zero library dependencies. Supports automatic C => V translation. https://vlang.io
Compiler Error for Unicode escape exceeding max value inside a Raw String #21714
While playing around and testing with escape sequences, I encountered a compiler error which seems odd: an invalid escape sequence inside a raw string raises an error, even though it's not supposed to be treated as an escape.
Reproduction Steps
The following line of code:
println(r'Unicode escape: \u0000004B (= K)')
raises the following compiler error:
error: unicode character exceeds max allowed value of 0x10ffff, consider using a unicode literal (\u####)
... even though it's a raw string and escape sequences inside it should be ignored.
Expected Behavior
If I change the above line to use a 32-bit Unicode escape (i.e. \U instead of \u):
println(r'Unicode escape: \U0000004B (= K)')
the compiler error goes away, and the compiled code prints:
Unicode escape: \U0000004B (= K)
(as expected: the escape sequence is ignored, and not converted into the character it represents)
Current Behavior
It seems like the compiler is trying to validate escape sequences inside raw strings, even though they won't actually be expanded in the final code.
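For comparison (not part of the original report): in other languages with raw string literals, such as Python, the parser performs no escape validation at all inside a raw string, which is the behavior this report expects from V's `r''` strings. A minimal Python illustration:

```python
# In a Python raw string, the backslash sequence is stored as literal
# characters; no \u escape is recognized or validated by the parser.
raw = r'Unicode escape: \u0000004B (= K)'

# The backslash and hex digits are kept verbatim, not expanded to 'K':
print(raw)                      # Unicode escape: \u0000004B (= K)
print('\\u0000004B' in raw)     # True - the sequence survived untouched
```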
Possible Solution
No response
Additional Information/Context
No response
V version
0.4.6 4302f86
Environment details (OS name and version, etc.)
Windows 10 Home x64.
[!NOTE]
You can use the 👍 reaction to increase the issue's priority for developers.
Please note that only the 👍 reaction to the issue itself counts as a vote.
Other reactions and those to comments will not be taken into account.