-
## Environment info
- `transformers` version: 4.4.2
- Python version: 3.6
- PyTorch version (GPU?): 1.8.0 (Tesla V100)
## Information
The problem arises when using:
```
from transformer…
```
-
## Environment info
AttributeError Traceback (most recent call last)
in ()
6 input = tokenizer.encode(sequence, return_tensors="pt")
7 mask_token_index …
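The truncated traceback above appears to come from code that locates the `[MASK]` position in an encoded sequence. As a rough pure-Python illustration of that step (the token id `103` is an assumption here, borrowed from `bert-base-uncased`; in real code it comes from `tokenizer.mask_token_id`):

```python
# Hypothetical sketch: find the positions of the mask token in a list of
# encoded token ids. 103 is assumed to be the [MASK] id; the real value
# comes from tokenizer.mask_token_id.
def mask_token_index(input_ids, mask_token_id=103):
    """Return every position where the mask token id occurs."""
    return [i for i, tid in enumerate(input_ids) if tid == mask_token_id]

# A made-up encoded sequence with one mask token at position 6.
ids = [101, 1996, 3007, 1997, 2605, 2003, 103, 1012, 102]
print(mask_token_index(ids))  # → [6]
```

With `transformers` itself, the usual equivalent on the tensor returned by `tokenizer.encode(..., return_tensors="pt")` is `torch.where(input == tokenizer.mask_token_id)[1]`.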
-
Such properties are usually expressed as separate transformers/estimators in Apache Spark, so reconsider the current implementation of these algorithm properties.
Also, think about shorter names …
-
# 🐛 Bug
## Information
Model I am using: bert-base-cased
Language I am using the model on (English, Chinese ...): English
The problem arises when using:
* [x] the official example scripts…
-
Hello!
I am not much of a Python guy, but since my recent experiments with AI I have gotten more used to how it works.
After following your LLaMa tutorial, I kept getting the error that the `…
-
Hi,
I am trying to run text-generation-inference with the following code, but I am getting an error:
model='meta-llama/Llama-2-7b-chat-hf'
num_shard=2
volume=$PWD/data # share a volume with the Docker container…
-
I cloned from scratch and am encountering this error:
```
Traceback (most recent call last):
  File "multivers/predict.py", line 109, in
    main()
  File "multivers/predict.py", line 1…
```
-
Hi, I read your tokenizer code, which is a subclass of PretrainedTokenizer. But paddlenlp's PretrainedTokenizer is more similar to transformers' PretrainedTokenizerFast, which means the tokenizer can retu…
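For readers unfamiliar with the distinction: a fast tokenizer can return character-level offsets for each token. A toy whitespace tokenizer (a hypothetical sketch, not paddlenlp's or transformers' actual code) mimicking what `return_offsets_mapping=True` produces:

```python
import re

def encode_with_offsets(text):
    """Toy whitespace tokenizer: pair each token with its (start, end)
    character span, mimicking the offset mapping a fast tokenizer returns."""
    tokens, offsets = [], []
    for m in re.finditer(r"\S+", text):
        tokens.append(m.group())
        offsets.append((m.start(), m.end()))
    return tokens, offsets

print(encode_with_offsets("Hello world"))
# → (['Hello', 'world'], [(0, 5), (6, 11)])
```

The offsets are what make tasks like span highlighting or entity alignment possible, since each token can be mapped back to its exact slice of the original string.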
-
### System Info
Hi Community!
I am using `run_clm.py` with `deepspeed` to fine-tune Llama 7B on a `g5.12xlarge` EC2 instance (4 GPUs, 96 GB total GPU memory, 48 vCPUs, 192 GB RAM).
* Transforme…
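For context, fine-tuning a 7B model across 4 GPUs with `run_clm.py` typically goes through a DeepSpeed ZeRO config passed via `--deepspeed`. A minimal sketch follows; the ZeRO stage and CPU-offload settings are assumptions for this memory budget, and the `"auto"` values are filled in from the Trainer's own arguments by the HF integration:

```json
{
  "fp16": { "enabled": "auto" },
  "zero_optimization": {
    "stage": 3,
    "offload_optimizer": { "device": "cpu" },
    "offload_param": { "device": "cpu" }
  },
  "gradient_accumulation_steps": "auto",
  "train_micro_batch_size_per_gpu": "auto",
  "train_batch_size": "auto"
}
```

Whether stage 3 with offload is needed (versus plain stage 2) depends on sequence length and batch size; offloading trades GPU memory for CPU/PCIe overhead.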
-
### Feature request
Microsoft's [unilm repository](https://github.com/microsoft/unilm/tree/db1095a693aa0d6d15bb9312cccb7f8af42b0aeb/layoutlmft/layoutlmft), which originally implements all `LayoutLM`…