-
**Before submitting a bug, please make sure the issue hasn't already been addressed by searching through the [FAQs](https://ai.meta.com/llama/faq/) and [existing/past issues](https://github.com/facebo…
-
```python
>>> tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3-8B")
>>> tokenizer("hello !")
{'input_ids': [128000, 15339, 758], 'attention_mask': [1, 1, 1]}
>>> tokenizer.decode…
```
-
### **Error trace:**
```
RuntimeError Traceback (most recent call last)
[](https://localhost:8080/#) in ()
----> 1 model.save_pretrained_gguf("model", tokenizer,)
1 fra…
```
-
There's a bug in attack_manager.py:
```python
if self.conv_template.name == 'llama-2':
    self.conv_template.messages = []
    self.conv_template.append_message(self.conv_template.roles…
```
-
When the content is submitted by pressing the button (running web-demo.py), an error occurs.
Error content:
```
File "c:\Users\lzj_r\chatglm.cpp\chatglm_cpp\__init__.py", line 68, in chat
    input_ids = self.…
```
-
# Prerequisites
Please answer the following questions for yourself before submitting an issue.
- [X] I am running the latest code. Development is very rapid so there are no tagged versions as of…
-
I need to use **tokenizer.json** in my project. How should I create it?
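One possible approach is to build and save a fast tokenizer with the Hugging Face `tokenizers` library, whose `save` method writes the `tokenizer.json` format. The corpus, vocabulary size, and special tokens below are illustrative assumptions, not values from any particular model:

```python
# Minimal sketch: train a tiny BPE tokenizer and write a tokenizer.json file.
# The training corpus and vocab_size are placeholder assumptions.
from tokenizers import Tokenizer
from tokenizers.models import BPE
from tokenizers.pre_tokenizers import Whitespace
from tokenizers.trainers import BpeTrainer

tokenizer = Tokenizer(BPE(unk_token="[UNK]"))
tokenizer.pre_tokenizer = Whitespace()
trainer = BpeTrainer(special_tokens=["[UNK]"], vocab_size=100)
tokenizer.train_from_iterator(["hello world", "hello there"], trainer=trainer)
tokenizer.save("tokenizer.json")  # serializes the full tokenizer to tokenizer.json
```

Alternatively, if a fast tokenizer already exists for the model, loading it with `AutoTokenizer.from_pretrained(..., use_fast=True)` and calling `save_pretrained` on a directory also writes a `tokenizer.json` there.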
-
### System Info
- `transformers` version: 4.42.3
- Platform: Linux-6.1.85+-x86_64-with-glibc2.35
- Python version: 3.10.12
- Huggingface_hub version: 0.23.4
- Safetensors version: 0.4.3
- Accele…
-
I used guwenbert from Hugging Face, but the tokenization result just splits a sentence into individual Chinese characters. I'd like to know whether this is normal. Thanks!
```python
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained('ethanyt/guwenbert-base')
te…
```
-
Hi guys, thanks for open-sourcing this great work!
It seems Llama 3 uses "right" padding and the "eos_token" as the "padding_token". Could you help verify that if I want to train this model, wh…
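For context on why the eos-as-pad choice matters during training, here is a plain-Python sketch (no transformers involved; the token ids are invented for illustration) showing that masking labels by pad id, when the pad token equals the eos token, also hides the real end-of-sequence token, while masking by attention mask does not:

```python
# Illustrative only: ids are made up, not real Llama 3 vocabulary ids.
EOS = 2
PAD = 2  # same id as eos, as in the setup described above

input_ids      = [5, 6, 7, EOS, PAD, PAD]   # right-padded sequence
attention_mask = [1, 1, 1, 1,   0,   0]

# Naive: ignore every token whose id equals the pad id.
# This also hides the real eos at position 3, so the model
# would never be trained to emit eos and could fail to stop.
labels_by_id = [t if t != PAD else -100 for t in input_ids]

# Safer: ignore positions by attention_mask; the real eos survives.
labels_by_mask = [t if m == 1 else -100
                  for t, m in zip(input_ids, attention_mask)]
```

With these inputs, `labels_by_id` drops the genuine eos along with the padding, while `labels_by_mask` keeps it, which is why label masking is usually driven by the attention mask rather than by token id.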