Open arthurwolf opened 6 months ago
I tried to train this model and found that if we use flan-t5-large as the text encoder, GPU memory needs exceed 32 GB. After changing the text encoder to flan-t5-base, it still needs 32 GB.
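For a rough sense of why the text encoder alone eats so much memory, here is a back-of-the-envelope estimate. The parameter counts (~250M for flan-t5-base, ~780M for flan-t5-large) and the 16-bytes-per-parameter rule (fp32 Adam: weights + gradients + two optimizer moments) are assumptions for illustration; activations and the rest of the model add several more GB on top:

```python
def training_mem_gb(n_params, bytes_per_param=16):
    """Rough GPU memory for weights + grads + Adam moments in fp32.

    Ignores activation memory, which adds several more GB in practice.
    """
    return n_params * bytes_per_param / 1024**3

# Approximate parameter counts (assumptions, not exact figures):
FLAN_T5_BASE = 250e6    # ~250M parameters
FLAN_T5_LARGE = 780e6   # ~780M parameters

print(f"base encoder:  ~{training_mem_gb(FLAN_T5_BASE):.1f} GB")   # ~3.7 GB
print(f"large encoder: ~{training_mem_gb(FLAN_T5_LARGE):.1f} GB")  # ~11.6 GB
```

Freezing the text encoder (no gradients or optimizer states) or training it in fp16/bf16 would cut these numbers substantially.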
If I wanted to run this as a service, what route would you recommend? I want to host it somewhere and use it as an API.
Hello.
Amazing work.
What kind of hardware should I expect to need to be able to run the model?
Thank you.