Open emanuelshalev opened 3 months ago
For inference, you can directly call model.forward() to get the predictions dict if your input is a tensor.
Otherwise, if the input is a test dataset, you can call model.predict(). All our APIs are Hugging Face based and use the standard predict and forward calls.
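To make the two call paths concrete, here is a minimal sketch. The `ToyModel` class, its output keys, and the batch shapes are all hypothetical stand-ins, not the repo's actual model; substitute the real model class and your own data.

```python
# Hedged sketch: illustrates the forward()/predict() split described above
# with a stand-in model. ToyModel and the "logits" key are hypothetical.

class ToyModel:
    """Stand-in mimicking the two inference entry points."""

    def forward(self, batch):
        # Tensor-like input -> predictions dict (here: a dummy doubling
        # stands in for the real forward pass).
        return {"logits": [x * 2 for x in batch]}

    def predict(self, dataset):
        # Dataset input (iterable of batches) -> run forward over each
        # batch and collect the per-batch prediction dicts.
        return [self.forward(batch) for batch in dataset]


model = ToyModel()

# Path 1: a single tensor-like batch goes through model.forward().
preds = model.forward([1.0, 2.0])

# Path 2: a test dataset (iterable of batches) goes through model.predict().
all_preds = model.predict([[1.0], [2.0, 3.0]])
```

With a real Hugging Face style model you would typically also wrap the forward call in a no-grad context and put the model in eval mode before inference.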
The tutorial shows how to run evaluation but is missing a bare-bones example of how to use the model to run inference.