Closed Dicomsky closed 1 year ago
We haven't experimented with the multimodal chatbot on the 7B model. However, we are working on accelerating training and reducing GPU memory usage, so please look forward to our updates.
Thank you for your reply!
Any update on this? The 55G GPU requirement for the 13B chatbot is not affordable on many GPU types. Would be glad if you could release a 7B chatbot checkpoint or reduce the memory usage to under 40G for the chatbot experiment. Thanks.
Would you mind releasing the LaVin-7B model for the Multimodal ChatBot task? I'm interested in your work, but I only have 4 3090Ti GPUs, which is not enough to fully train a LaVin-7B model, and LaVin-13B is too large.