Sri20021 opened this issue 2 years ago
Yes. I am also facing a timeout error.
Hi @Sri20021 @subnkve479,
We have moved away from the public API which we demoed during the workshop. Our models are currently available on ULCA and HuggingFace Spaces.
Hi Sumanth,
I tried to use it through the Hugging Face Space. I can do translations via the link below in a web browser:
https://huggingface.co/spaces/ai4bharat/IndicTrans-Indic2English
But to use the model in my work I need to load it programmatically. I tried the following but am facing errors:
from transformers import pipeline
translator = pipeline('translation', model='ai4bharat/IndicTrans-Indic2English')
I am getting a RepositoryNotFoundError:
OSError: ai4bharat/IndicTrans-Indic2English is not a local folder and is not a valid model identifier listed on 'https://huggingface.co/models'. If this is a private repository, make sure to pass a token having permission to this repo with use_auth_token, or log in with huggingface-cli login and pass use_auth_token=True.
i.e., I am not able to access the ai4bharat IndicTrans model through Hugging Face pipelines. Can you please help me if I am missing something?
Regards, Subbu
@sumanthd17 @GokulNC Can you please resolve my query above?
Thanks, Sumanth. @subnkve479 I am able to run it using this link: https://hf.space/embed/ai4bharat/IndicTrans-English2Indic/api. However, this is limited to single-string translation. @sumanthd17 Is there a way to do batch translation with the Hugging Face Spaces models, like the earlier hosted public API?
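For reference, this is roughly how I call the Space today, one sentence at a time, and just loop over the batch. It is only a minimal sketch: it assumes the Space exposes the standard Gradio REST endpoint under /+/api/predict/ and takes the source text (plus, possibly, a target-language choice) in the data array; the exact route and payload are documented on the /api page linked above.

import requests

# Assumed endpoint: Gradio Spaces of this era served predictions at
# <space-url>/+/api/predict/ -- please verify against the Space's /api page.
API_URL = "https://hf.space/embed/ai4bharat/IndicTrans-English2Indic/+/api/predict/"

def translate_one(sentence, tgt_lang="Hindi"):
    # "data" must match the Space's input components; [text, target language]
    # is an assumption here.
    resp = requests.post(API_URL, json={"data": [sentence, tgt_lang]}, timeout=60)
    resp.raise_for_status()
    return resp.json()["data"][0]

# "Batch" translation by looping, since the Space accepts one string per request.
sentences = ["How are you?", "The weather is nice today."]
print([translate_one(s) for s in sentences])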
We do not have a freely hosted API for machine translation, since it is very costly to deploy it for large-scale public consumption. Since the codebase & models are open-source, we recommend you go ahead and host them yourself for your private purposes.
I am not able to access the ai4bharat IndicTrans model through Hugging Face pipelines
Yes, since the IndicTrans model uses fairseq, it has not yet been converted to the Hugging Face format, so it is not available as a model on the HF Hub. (The HF Space just hosts the fairseq model as-is, for demo purposes.)
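For self-hosting in the meantime, the rough shape is the following. This is a minimal sketch that assumes the inference wrapper shipped with the AI4Bharat/indicTrans repository (inference/engine.py) and an extracted fairseq checkpoint directory such as indic-en; check the repo README and example notebooks for the exact class and method names.

# Minimal self-hosting sketch. Assumptions: the indicTrans repo is cloned with its
# dependencies (fairseq, sentencepiece, indic-nlp-library) installed, and the
# checkpoint archive (e.g. indic-en) has been downloaded and extracted locally.
from indicTrans.inference.engine import Model  # wrapper from the repo, not a pip package

indic2en = Model(expdir="indic-en")  # path to the extracted checkpoint directory

sentences = [
    "मैं कल दिल्ली जा रहा हूँ।",
    "आज मौसम अच्छा है।",
]
# batch_translate takes a list of sentences plus source/target language codes.
print(indic2en.batch_translate(sentences, "hi", "en"))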
@GokulNC Thanks for the reply. I will try to host it using the open-source code. Any information on when the IndicTrans model will be available on HF?
Hi, what are the minimum and recommended hardware specifications to deploy the ai4bharat IndicTrans model?
We have not tested the absolute minimum, but it should work without any issues on 16 GB cards ranging from a K80 to an A100.
Can it be deployed on non-GPU systems?
The batch translation API has been throwing a time-out error since this morning. Could you please fix this issue?