Hello, and before getting to the issue, thank you very much for your work!
The paper says the Llama model can be used in addition to MPT, but when I switch to the Llama model at inference time, an error occurs. I suspect it is simply not supported in the modeling code. Could you let me know whether there is updated code for this, or whether I have missed it in the existing code? I would appreciate it!
Thanks for your interest in our work. Our model is based on OpenFlamingo, and since OpenFlamingo does not support Llama as the LLM backbone, our model cannot support Llama either.