Model uploaded to the Hub. The task was first recognized as Text Generation, but was changed to Translation following the recommendations from #6. The problem now seems to be related to the model itself, as it doesn't run inference locally.
Tried:
Changed the config.json file and added the prefix coming from the language.
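For reference, the config.json change attempted looks roughly like this — a minimal sketch assuming a T5-style seq2seq model and a hypothetical English→Spanish pair (the actual architecture, language pair, and prefix depend on the model):

```json
{
  "architectures": ["T5ForConditionalGeneration"],
  "task_specific_params": {
    "translation_en_to_es": {
      "prefix": "translate English to Spanish: ",
      "max_length": 300,
      "early_stopping": true
    }
  }
}
```

With T5-style models, the same prefix configured here must also be prepended to the input text when running inference locally; an empty output can result when the model never sees the prefix it was trained with.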
The model returns an empty string at inference time, both on the Hub and when running inference.py.