Hi guys, I think this is a kind of enhancement.
At inference time in text2text models,
when the model is initialized, we check the length of the source dataset here.
If someone wants to load a model into RAM/GPU before the source text is provided, the process fails in the subsequent steps.
I propose moving the dataset length initialization into the getter.
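A minimal sketch of what I mean, assuming a dataset wrapper like the one below (class and attribute names are illustrative, not from the actual codebase): the length check moves from `__init__` into a lazy getter, so the model can be loaded before any source text exists.

```python
# Hypothetical sketch of the proposed change: defer the dataset-length
# check from __init__ to a lazy getter. Names are illustrative only.

class Text2TextDataset:
    def __init__(self, source_texts=None):
        # No length check here: source texts may be attached later,
        # after the model is already loaded into RAM/GPU.
        self.source_texts = source_texts
        self._length = None

    @property
    def length(self):
        # Length is computed (and validated) only on first access.
        if self._length is None:
            if self.source_texts is None:
                raise ValueError("source texts not provided yet")
            self._length = len(self.source_texts)
        return self._length

# Model initialization no longer touches the dataset length:
ds = Text2TextDataset()               # OK even with no source text yet
ds.source_texts = ["hello", "world"]  # attach texts later
print(ds.length)                      # 2
```

With this, initialization only fails if someone actually asks for the length before providing the source text, instead of failing unconditionally at load time.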
Thanks!