-
Do you have plans to use the new tokenizers library? Transformers is too slow; it takes 10 seconds to start on the AX630C. If possible, could you refer to the implementation at: https://github.com/huggingface/…
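To back up a startup-time complaint like this, it helps to measure cold-start import cost in a fresh interpreter rather than in an already-warm process. A minimal, library-agnostic sketch (the module names you pass in are up to you; nothing here is specific to Transformers):

```python
import subprocess
import sys
import time

def import_time(module_name):
    """Measure the cold-start cost of importing a module.

    A fresh subprocess is used so the in-process import cache
    (sys.modules) cannot hide the real first-import time.
    """
    start = time.perf_counter()
    subprocess.run(
        [sys.executable, "-c", f"import {module_name}"],
        check=True,
    )
    return time.perf_counter() - start
```

Comparing, say, `import_time("transformers")` against `import_time("tokenizers")` on the target board would make the 10-second figure reproducible.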
-
I can't use this on my X Elite PC; the platform is ARM, but the OS is Windows.
-
### Checklist
- [x] The issue exists after disabling all extensions
- [x] The issue exists on a clean installation of webui
- [ ] The issue is caused by an extension, but I believe it is caused by a …
-
Thanks for this convenient library for using HF tokenizers. The downloaded JSON file does seem to get cached, but I could not determine its location. Is there a way to specify it via TokenizerConfig?
…
-
I want to load GPT-2 offline, but it can't be loaded directly. Did I do something wrong?
Code:
```python
from autotiktokenizer import AutoTikTokenizer
tokenizer = AutoTikTokenizer.from_pretrained("home/xx…
```
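One thing worth ruling out: the path starts with `home/…` rather than `/home/…`, so it would be resolved relative to the current working directory. A small stdlib helper can normalize and validate a local path before handing it to the loader (the helper name is mine; whether `from_pretrained` accepts a local directory is an assumption based on the HF-style API it mirrors):

```python
import os

def resolve_local_model_path(path):
    """Normalize a local model path before passing it to a loader.

    A common pitfall is writing "home/user/gpt2" (relative) instead
    of "/home/user/gpt2" (absolute)."""
    path = os.path.expanduser(path)  # handle "~/gpt2"
    path = os.path.abspath(path)     # make relative paths explicit
    if not os.path.isdir(path):
        raise FileNotFoundError(f"model directory not found: {path}")
    return path
```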
-
**Describe the bug**
Non-Node.js environments may have build issues because of how some packages are imported.
**To Reproduce**
Make a worker, include eliza/core and attempt to b…
-
Is Transformers 4.44.0 compatible with Python 3.9.19? I receive the error below; any thoughts are appreciated.
```python
import transformers
from transformers import AutoTokenizer
```
Produces the…
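When the full traceback isn't available, an interpreter-version mismatch is one common cause of import errors. A fail-fast guard like the one below makes the mismatch explicit; note the version bounds here are placeholders, not the actual Transformers 4.44.0 support matrix, which should be checked against the release's `setup.py`:

```python
import sys

def check_python_version(minimum=(3, 9), maximum_exclusive=(3, 13)):
    """Raise a clear error if the interpreter falls outside the range a
    given library release supports. Bounds are illustrative defaults."""
    version = tuple(sys.version_info[:2])
    if not (minimum <= version < maximum_exclusive):
        raise RuntimeError(
            f"Python {version[0]}.{version[1]} is outside the supported "
            f"range [{minimum[0]}.{minimum[1]}, "
            f"{maximum_exclusive[0]}.{maximum_exclusive[1]})"
        )
    return version
```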
-
Porting BERTTokenizers would enable several text-embedding generation models. Requires https://github.com/dotnet/machinelearning/issues/6988.
https://github.com/huggingface/text-embeddings-inference?ta…
-
Right now I'm relying on the fact that tokenizers *mostly* doesn't break Markdown syntax at a small scale, but sometimes it does:
``` r
tokenizers::tokenize_sentences("blabla [ok](1.ES.2023.28.27.220080…
-
My system:
Ubuntu 22.04
4 x 4090
+---------------------------------------------------------------------------------------+
| NVIDIA-SMI 535.183.01 Driver Version: 535.183.01 CUDA Ver…