xhluca / dl-translate

Library for translating between 200 languages. Built on 🤗 transformers.
https://xhluca.github.io/dl-translate/
MIT License

Include an option to use batch_size #9

Closed xhluca closed 3 years ago

xhluca commented 3 years ago

Right now the model will try to run generate on the entire input list at once, which could result in the device running out of memory. To avoid that, we could add a batch_size (or chunk_size) parameter to the translate function.

By default it'd be None, which would attempt to run generate on everything at once. If an integer > 1 is given, the input would be split into batches of that size and the translations generated iteratively.
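
A minimal sketch of the chunking logic being proposed, not the actual dl-translate API: `translate_in_batches` and `translate_fn` are hypothetical names, and `translate_fn` stands in for the existing tokenize + `model.generate()` step.

```python
from typing import Callable, List, Optional


def translate_in_batches(
    sentences: List[str],
    translate_fn: Callable[[List[str]], List[str]],
    batch_size: Optional[int] = None,
) -> List[str]:
    """Translate a list of sentences, optionally in fixed-size chunks.

    batch_size=None keeps the current behavior: translate_fn is called on the
    whole list at once. An integer batch_size splits the input into chunks of
    that size and translates them iteratively, bounding peak device memory.
    """
    if batch_size is None:
        return translate_fn(sentences)

    results: List[str] = []
    for i in range(0, len(sentences), batch_size):
        results.extend(translate_fn(sentences[i : i + batch_size]))
    return results


if __name__ == "__main__":
    # Stand-in "translator" that just uppercases, to show the chunking behavior.
    texts = [f"sentence {i}" for i in range(10)]
    print(translate_in_batches(texts, lambda batch: [t.upper() for t in batch], batch_size=4))
```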