Hi, I tried to evaluate codet5p-2b. I loaded the model from Hugging Face and got a CUDA out of memory error, so I tried to shard the model across multiple GPU cards by adding `device_map='auto'` when loading it. But that raised another error: `CodeT5pEncoderDecoderModel does not support device_map='auto' yet`.
The same issue happens when I load my own fine-tuned codet5p-2b models.
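For reference, here is a minimal sketch of the loading code (assuming the standard `transformers` loading API with the `Salesforce/codet5p-2b` checkpoint; your exact arguments may differ). The `device_map` line is what triggers the error:

```python
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

checkpoint = "Salesforce/codet5p-2b"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)

# Loading on a single GPU runs out of memory, so device_map="auto" is added
# to shard the model across cards -- this is the call that raises the error:
model = AutoModelForSeq2SeqLM.from_pretrained(
    checkpoint,
    torch_dtype=torch.float16,
    trust_remote_code=True,  # CodeT5+ 2B ships custom modeling code
    device_map="auto",       # raises: CodeT5pEncoderDecoderModel does not
                             # support `device_map='auto'` yet
)
```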