-
## Steps to Reproduce
1. Notice `mono_unicode_to_external` and `mono_utf8_from_external` in the code.
2. The two functions look like a mess and may not be consistent with each other.
We use it for filenames, at least sometimes, for comm…
-
It'd be cool if we could customise how encodings are presented in source issues.
E.g. for `testground`, I think the issue (https://github.com/testground/testground/issues/1491) would look better i…
-
### Checks
- [X] I have checked that this issue has not already been reported.
- [X] I have confirmed this bug exists on the [latest version](https://pypi.org/project/polars/) of Polars.
### Reprodu…
-
I am using LVGL version 9.1.1. After using the official online and offline font conversion tools to convert Chinese characters, the demo example shows garbled characters on the LCD. I don't know where the…
-
Currently some (or all) of the endpoints rely on hard-coded encoding values for rasdaman coverages. I think that a more robust approach would be to ensure that encodings are present in the coverage me…
-
Hi,
I tried QAT on a model and exported the encodings. Then I used qnn-onnx-converter with --quantization_overrides and --input_list, trying to feed the post-QAT min/max/scale values into the converte…
-
The `instrsxarch.h` encodings are currently ordered as follows:
* R/M[reg]
* R/M,icon
* reg,R/M
* eax,i32
* register
This ordering appears to have been chosen to save space, as not all …
-
It looks like you're using the `p50k_base` encoding, which is only for GPT-2 and GPT-3. GPT-4 uses the `cl100k_base` encoding.
The newer vocab is available [here](https://openaipublic.blob.core.wind…
-
```
Handling character encodings is a very common part of data cleansing for me.
At the basic level, this means giving the option of specifying a character
encoding on both import and export. Goin…
-
The initial implementation only handles single-byte character encodings. For version 2.0, it would be nice to have support for multibyte character encodings.
Implementing support for fixed-width mult…
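As a rough illustration of the distinction (using Python's built-in codecs, not this project's code): a single-byte encoding maps every character to exactly one byte, a fixed-width multibyte encoding maps every character to a constant number of bytes, and a variable-width one maps characters to one or more bytes.

```python
text = "héllo"

latin1 = text.encode("latin-1")    # single-byte: one byte per character
utf16 = text.encode("utf-16-le")   # fixed-width for BMP text: two bytes per character
utf8 = text.encode("utf-8")        # variable-width: 'é' takes two bytes, the rest one

print(len(latin1), len(utf16), len(utf8))  # 5 10 6
```

The fixed-width case is the easier first step, since byte offsets can still be computed directly from character indices; variable-width encodings require scanning.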