I am training with a pretrained adapter via deepspeed, passing --pretrain_mm_mlp_adapter /home/srikanth/api-webapp/checkpoints/llava-v1.5-llama-3-8b-pretrain/mm_projector.bin,
but this throws the error "AttributeError: 'PreTrainedTokenizerFast' object has no attribute 'legacy'".
The pretrained adapter was not created with the most recent version of the code. How do I use this earlier pretrained adapter here?
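In case it helps narrow things down: my guess is that the failure happens because the preprocessing code reads a legacy attribute that exists on the slow LlamaTokenizer but not on the fast tokenizer that Llama-3 checkpoints load. A minimal sketch of the workaround I was considering, assuming I can touch the tokenizer before preprocessing runs (the model path below is only a placeholder, not my actual config):

```python
from transformers import AutoTokenizer

# Placeholder base model path; substitute the Llama-3 checkpoint actually used for training.
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Meta-Llama-3-8B", use_fast=True)

# PreTrainedTokenizerFast has no 'legacy' attribute, so give it one before the
# preprocessing code tries to read it. Whether False is the right value here is
# an assumption on my part.
if not hasattr(tokenizer, "legacy"):
    tokenizer.legacy = False
```

I am not sure this is the supported way to keep using the earlier mm_projector.bin, so any guidance would be appreciated.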