ncoghlan opened 5 months ago
Edit: the #77102 implementation has been updated to use `utf-8-sig`, so handling UTF-8 with a BOM is no longer a problem. It's just UTF-16-LE being both easy to generate accidentally and nightmarish to debug when it happens that's still a potential concern.
Experimenting a bit further, I suspect `utf-8-sig` is going to be a problem too, since it decodes cleanly with `utf-8` but gives garbage data:
>>> "text".encode("utf-8-sig").decode("utf-8")
'\ufefftext'
>>> "text".encode("utf-8-sig").decode("utf-8-sig")
'text'
>>> "text".encode("utf-8").decode("utf-8-sig")
'text'
>>>
The problem child is Windows PowerShell 5.1, where `Out-File` and friends default to UTF-16-LE and allow UTF-8 with a BOM to be selected, but getting them to emit UTF-8 without a BOM is a major pain (see the voluminous essays at https://stackoverflow.com/questions/5596982/using-powershell-to-write-a-file-in-utf-8-without-the-bom, as well as the Windows PowerShell 5.1 `Out-File` docs).
As the examples above show, I think decoding with `utf-8-sig` instead of `utf-8` will solve that part of the problem.
Triage: can this be closed or is there more to do?
UTF-16-LE still fails silently (the NULLs mean decoding fails, so the file gets ignored).
The already merged PRs just made utf-8-bom work in the #77102 implementation (since that was just a change of input codec to accept UTF-8 both with and without a BOM, rather than adding an entirely new encoding to try)
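For reference, here's a quick REPL illustration of both failure modes. It assumes the file starts with the UTF-16-LE BOM that Windows PowerShell 5.1's `Out-File` writes by default; the bytes are constructed by hand here rather than read from a real `.pth` file:
>>> data = b"\xff\xfe" + "import foo\n".encode("utf-16-le")  # BOM + UTF-16-LE payload
>>> data.decode("utf-8")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xff in position 0: invalid start byte
>>> "import foo\n".encode("utf-16-le").decode("utf-8")  # no BOM: decodes, but every other byte is NUL
'i\x00m\x00p\x00o\x00r\x00t\x00 \x00f\x00o\x00o\x00\n\x00'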
Feature or enhancement
I just finished an extended Windows bug hunt that I eventually tracked down to a `.pth` file being encoded in UTF-16-LE rather than an ASCII-compatible encoding.

I only figured it out by turning off frozen modules and hacking site.py to dump the output of `.pth` files as it tried to process them (at which point I checked the `.pth` file encoding in VSCode and, sure enough, UTF-16-LE was down in the corner of the file window).

I hit the bug by porting a Linux shell script to Windows PowerShell, not thinking about the fact that `| Out-File` on Windows PowerShell 5.1 defaults to UTF-16-LE (newer versions of PowerShell, which aren't the ones baked into the OS, default to UTF-8 without a BOM).

Given the inevitable presence of NUL bytes in a UTF-16-LE file, and the fact that there shouldn't be any in a UTF-8 or locale-encoded file, it seems to me we should be able to handle such situations more gracefully: at the very least logging an error if NUL bytes are present in the file, but potentially even making UTF-16-LE encoded `.pth` files straight up work, either by checking for NUL bytes and using UTF-16-LE instead of UTF-8 and the locale encoding when we find them, or by trying UTF-16-LE before trying the locale encoding on Windows.
(This is somewhat related to #77102)
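For illustration only, here's a rough sketch of the kind of NUL-sniffing fallback described above. This isn't how site.py is actually structured; the helper name `decode_pth_bytes` and the exact fallback order are made up for this example:
# Rough sketch of the NUL-sniffing fallback proposed above (illustrative only;
# decode_pth_bytes is a made-up name, not CPython's actual site.py code).
import locale

def decode_pth_bytes(raw: bytes) -> str:
    if b"\x00" in raw:
        # NUL bytes shouldn't appear in UTF-8 or locale-encoded .pth content,
        # so treat their presence as a strong hint that the file is UTF-16-LE.
        # Strip the BOM that PowerShell's Out-File writes, if it's there.
        if raw.startswith(b"\xff\xfe"):
            raw = raw[2:]
        return raw.decode("utf-16-le")
    try:
        # utf-8-sig accepts UTF-8 both with and without a BOM
        return raw.decode("utf-8-sig")
    except UnicodeDecodeError:
        # fall back to the locale encoding, roughly matching the
        # UTF-8-then-locale behaviour discussed in #77102
        return raw.decode(locale.getpreferredencoding(False))
With that, decode_pth_bytes(b"\xff\xfe" + "import foo\n".encode("utf-16-le")) gives back 'import foo\n', while a plain UTF-8 file continues to decode exactly as it does today.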
Has this already been discussed elsewhere?
No response
Links to previous discussion of this feature:
No response
Linked PRs