Closed wkkautas closed 2 years ago
@wkkautas Yes, that's what we should do for large models > 2 GB (we should set load_external_data to True). Thank you for this issue.
@wkkautas can you please share the name of the model you were testing when you got this error?
Thank you for the confirmation and fix! I was using xlm-roberta-large.
Thank you for the excellent work!
When using the convert_model command in ghcr.io/els-rd/transformer-deploy:0.5.1 to convert models over 2 GB, an error occurs.
Though editing the following line to set load_external_data=True solves this error, is it the right workaround?
https://github.com/ELS-RD/transformer-deploy/blob/cc781dbe925cccdc309e0a96501dc20b979b4627/src/transformer_deploy/backends/pytorch_utils.py#L168
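For context, here is a minimal sketch of why the flag matters. Protobuf caps a single serialized message at 2 GB, so an ONNX model above that size must keep its weight tensors in external data files, and onnx.load(path, load_external_data=True) is the documented way to load them back alongside the graph. The helper and threshold below are only illustrative, not part of transformer-deploy:

```python
# Protobuf limits a single serialized message to 2 GB (2**31 bytes), which
# is why onnx.load(path, load_external_data=True) is needed for large models:
# their weights live in side files referenced by the graph.
PROTOBUF_LIMIT = 2**31  # bytes

def needs_external_data(model_size_bytes: int) -> bool:
    """Illustrative helper: True when a model exceeds the protobuf limit
    and must therefore store weights as ONNX external data."""
    return model_size_bytes >= PROTOBUF_LIMIT

assert needs_external_data(3 * 1024**3)        # a 3 GB model needs the flag
assert not needs_external_data(500 * 1024**2)  # a 500 MB model fits in one file
```

So setting load_external_data=True at the linked line is the expected behavior for any model over 2 GB rather than a hack.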
Thanks,