-
### System Info
```shell
python 3.10.14
torch 2.4.0+cu121
optimum 1.21.4
onnx 1.16.2
onnxruntime 1.19.0
transformers 4.43.4
optim…
```
-
```python
for epoch in range(NUM_EPOCHS):
    model.train()
    for batch_idx, batch in enumerate(train_loader):
        # Prepare data
        input_ids = batch['input_id…
```
-
I ran the first command provided (as a sanity check of my setup, since I usually see very high output errors for larger models like LLMs), and I got an output validation error.
I've made sur…
-
Hi Daniel,
Thank you so much for releasing such an awesome VLM fine-tuning notebook to the public!
I was really excited, tried the notebook out, and ran into the following error:
![image](https:/…
-
### Question
Does transformers.js have a function to get the label once you have the logits? How can I get the labels from the inference output?
let tokenizer = await AutoTokenizer.from_pretrained('d…
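For context, the model config typically carries an `id2label` mapping (exposed as `model.config.id2label` in transformers.js, same as in Python transformers), so the label is just the argmax over the logits looked up in that mapping; the `pipeline` API does this for you. A minimal sketch of that logic (in Python for brevity; the logits and `id2label` values below are made up for illustration):

```python
import math

# Hypothetical logits for a 3-class classifier (illustrative values only)
logits = [-1.2, 0.3, 2.1]

# id2label normally comes from the model config (model.config.id2label);
# this particular mapping is a made-up example
id2label = {0: "negative", 1: "neutral", 2: "positive"}

# Softmax turns logits into probabilities (optional if you only need argmax)
exps = [math.exp(x) for x in logits]
total = sum(exps)
probs = [e / total for e in exps]

# Argmax picks the highest-scoring class; id2label names it
pred_id = max(range(len(probs)), key=lambda i: probs[i])
print(id2label[pred_id])  # → positive
```

The same two steps (argmax over the logits tensor, then `model.config.id2label[index]`) apply when calling the model directly in transformers.js.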
-
This is great work, but I got an error when I ran it; I'm guessing torch.nn.GELU() should be used here?
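For reference, GELU computes x·Φ(x), where Φ is the standard normal CDF, which is what `torch.nn.GELU()` implements elementwise on tensors. A dependency-free sketch to sanity-check values (pure Python via the erf formula, not the repo's actual code):

```python
import math

def gelu(x: float) -> float:
    # Exact GELU: x * Phi(x), with Phi the standard normal CDF.
    # torch.nn.GELU() computes the same thing elementwise on tensors.
    return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

print(round(gelu(1.0), 4))  # ≈ 0.8413, since Phi(1.0) ≈ 0.8413
print(gelu(0.0))            # 0.0: GELU is zero at the origin
```

Note that `torch.nn.GELU(approximate='tanh')` uses the tanh approximation instead, which differs slightly from the exact erf form.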
-
-
I was checking the memory consumption of RoBERTa and DistilBERT. I found no significant difference in memory usage, although inference time is around **1 sec** for DistilBERT and for RoBERTa is **…
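When comparing models like this, it can help to measure peak allocation during the forward pass rather than overall process memory, which is dominated by shared overhead. A minimal sketch using the standard library, where `run_inference` is a hypothetical stand-in for a model forward pass (for CUDA models, `torch.cuda.max_memory_allocated()` would be the appropriate counter instead, since `tracemalloc` only sees Python-level allocations):

```python
import time
import tracemalloc

def run_inference(batch):
    # Hypothetical stand-in for model(batch); allocates like a forward pass
    return [x * 2.0 for x in batch]

def profile(fn, batch):
    # Track peak Python allocations and wall-clock time for one call
    tracemalloc.start()
    t0 = time.perf_counter()
    fn(batch)
    elapsed = time.perf_counter() - t0
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return elapsed, peak  # seconds, peak bytes allocated during the call

batch = list(range(100_000))
elapsed, peak = profile(run_inference, batch)
print(f"time: {elapsed:.4f}s, peak python alloc: {peak / 1e6:.1f} MB")
```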
-
Hello,
Going through the training material.
Some small ideas for improvements.
#######################
Transformers, what can they do?
https://huggingface.co/learn/nlp-course/en/chapter1/3
A)
Curren…
-
### Describe the Question
Please provide a clear and concise description of what the question is.