cocktailpeanut / dalai

The simplest way to run LLaMA on your local machine
https://cocktailpeanut.github.io/dalai
13.1k stars 1.42k forks

Add support for OpenAssistant LLaMa 30B SFT 6 #433

Open rcbevans opened 1 year ago

rcbevans commented 1 year ago

OpenAssistant provided an XOR patch for the original LLaMA 30B weights that produces a checkpoint of the model they are running on open-assistant.io.
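For context, applying an XOR patch just means combining each byte of the original weight file with the corresponding byte of the patch file using bitwise XOR. The sketch below is a minimal, hypothetical illustration of that idea, not OpenAssistant's actual tooling (their release ships its own conversion script); the function name and the assumption that both files are the same length are mine.

```python
def apply_xor_patch(original_path, patch_path, output_path, chunk_size=1 << 20):
    """XOR two equal-length files byte-by-byte, writing the result to output_path.

    Hypothetical sketch: assumes original and patch files have identical sizes,
    as an XOR-patch release scheme would require.
    """
    with open(original_path, "rb") as orig, \
         open(patch_path, "rb") as patch, \
         open(output_path, "wb") as out:
        while True:
            a = orig.read(chunk_size)
            b = patch.read(chunk_size)
            if not a and not b:
                break
            # XOR each byte pair; zip relies on the equal-length assumption.
            out.write(bytes(x ^ y for x, y in zip(a, b)))
```

A nice property of this scheme is that it is its own inverse: XOR-ing the output with the patch again recovers the original file, which is why it lets OpenAssistant distribute derived weights without redistributing Meta's original LLaMA weights.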

This model seems to perform better than the base LLaMA 30B model, so it would be great to have it as an option that can be easily installed and used via dalai, if possible.