0cc4m / KoboldAI


Support for MythoMax-L2-13B-GPTQ #72

Open ghost opened 1 year ago

ghost commented 1 year ago

I couldn't get this model to run, but it would be nice if it were possible, as I prefer KoboldAI over oobabooga. MythoMax-L2-13B has a 4K token context, and the GPTQ quantization runs in roughly 8-10 GB of VRAM, so it's fairly easy to run. It produces long responses and is intended for roleplaying / storywriting.

Alternatively, a tutorial on how to run this with KoboldAI would work for me too.
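
For reference, here is a minimal sketch (not KoboldAI's own loading path) of loading the GPTQ weights directly with AutoGPTQ, just as a sanity check that the quantized model fits in the stated 8-10 GB of VRAM. The repo id `TheBloke/MythoMax-L2-13B-GPTQ` is an assumption.

```python
# Sanity-check sketch: load the GPTQ checkpoint with AutoGPTQ and generate a few tokens.
# Assumes `pip install auto-gptq transformers` and a CUDA GPU with ~8-10 GB free VRAM.
from auto_gptq import AutoGPTQForCausalLM
from transformers import AutoTokenizer

model_id = "TheBloke/MythoMax-L2-13B-GPTQ"  # assumed HF repo id for the quantized model

tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=True)
model = AutoGPTQForCausalLM.from_quantized(
    model_id,
    device="cuda:0",
    use_safetensors=True,
)

prompt = "Once upon a time"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda:0")
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

If this runs within the expected VRAM budget, the remaining question is only how to get KoboldAI to load the same checkpoint.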