nlp-with-transformers / notebooks

Jupyter notebooks for the Natural Language Processing with Transformers book
https://transformersbook.com/
Apache License 2.0

No loss #133

Open angelusualle opened 3 months ago

angelusualle commented 3 months ago

Information

The problem arises in chapter: 2 (Text Classification)

Describe the bug

I ran the


from transformers import Trainer

# model, training_args, compute_metrics, emotions_encoded and tokenizer
# are all defined in earlier cells of the chapter's notebook
trainer = Trainer(model=model, args=training_args,
                  compute_metrics=compute_metrics,
                  train_dataset=emotions_encoded["train"],
                  eval_dataset=emotions_encoded["validation"],
                  tokenizer=tokenizer)
trainer.train();

section and got an error complaining that the model returned no loss:

The model did not return a loss from the inputs, only the following keys: logits. For reference, the inputs it received are input_ids,attention_mask.
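For context, AutoModelForSequenceClassification only returns a loss when a labels tensor actually reaches its forward pass. A minimal check of that behaviour (assuming the distilbert-base-uncased checkpoint and six emotion classes used in the chapter):

import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_ckpt = "distilbert-base-uncased"  # checkpoint assumed from the chapter
tokenizer = AutoTokenizer.from_pretrained(model_ckpt)
model = AutoModelForSequenceClassification.from_pretrained(model_ckpt, num_labels=6)

inputs = tokenizer("I feel great", return_tensors="pt")
# Without labels the forward pass returns only logits, so loss is None
print(model(**inputs).loss)
# With a labels tensor the model also computes the cross-entropy loss
print(model(**inputs, labels=torch.tensor([1])).loss)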

It turns out AutoModelForSequenceClassification expects the target column to be named labels, not label, so I remapped it:

def relabel(batch):
    # Copy the existing "label" column into a new "labels" column
    return {"labels": batch["label"]}

emotions_encoded = emotions_encoded.map(relabel)
emotions_encoded.column_names
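As a side note, map keeps the original label column alongside the new labels column; Datasets' rename_column gives an equivalent fix that replaces it instead (a sketch, assuming emotions_encoded is a datasets DatasetDict):

# Rename "label" to "labels" in every split instead of duplicating the column
emotions_encoded = emotions_encoded.rename_column("label", "labels")
emotions_encoded.column_names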