KoboldAI / KoboldAI-Client

For GGUF support, see KoboldCPP: https://github.com/LostRuins/koboldcpp
https://koboldai.com
GNU Affero General Public License v3.0

Load custom models on ColabKobold TPU #361

Closed subby2006 closed 10 months ago

subby2006 commented 1 year ago

Is it possible to edit the notebook and load custom models onto ColabKobold TPU? If so, what formats must the model be in? There are a few models listed in the readme that aren't available through the notebook, so I was wondering.

RecoveredApparatus commented 10 months ago

As of right now you can only load GPT-J and OPT models on the TPU Colab; anything such as Llama or Mistral isn't possible.

henk717 commented 10 months ago

This is incorrect for Llama, which can be loaded on Colab in the United version. You load compatible custom models by entering their HF 16-bit name into the model field. However, we cannot provide support for models in the safetensors format, so your model needs to be in the pytorch_model.bin format.
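If you have a model folder locally, you can check whether it ships the required PyTorch weights before pointing the notebook at it. A minimal sketch — `is_tpu_colab_loadable` is a hypothetical helper name for illustration, not part of KoboldAI:

```python
from pathlib import Path

def is_tpu_colab_loadable(model_dir: str) -> bool:
    """Return True if the HF model folder contains pytorch_model*.bin
    weights (required per the comment above); folders that only ship
    *.safetensors weights are not supported on the TPU Colab."""
    d = Path(model_dir)
    # Matches both single-file and sharded checkpoints,
    # e.g. pytorch_model.bin or pytorch_model-00001-of-00002.bin.
    return any(d.glob("pytorch_model*.bin"))
```

If the check fails and you only have safetensors weights, re-saving the model through `transformers` with `save_pretrained(..., safe_serialization=False)` is one way to produce a pytorch_model.bin checkpoint.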

In general, the TPU is a legacy feature for us, and we only try to keep it functional until Colab inevitably bans our UI like it has done to others. For the best experience, use the GPU Colab instead.