LAION-AI / Open-Assistant

OpenAssistant is a chat-based assistant that understands tasks, can interact with third-party systems, and retrieve information dynamically to do so.
https://open-assistant.io
Apache License 2.0

I would like to run the model oasst-sft-6-llama-30b locally but I don't seem able to #2779

Closed sancassino closed 1 year ago

sancassino commented 1 year ago

For development purposes I would like to run the model oasst-sft-6-llama-30b locally, but I don't seem able to. I believe I can run the Pythia models as a chatbot locally on my computer, but those models are much smaller and not as advanced.

I also want it to be fluent in Spanish, and I can see the Pythia models struggle with Spanish. Since you say you are open source, everything should be open source, right? Whether it is for commercial or educational purposes.

Thank you!

olliestanley commented 1 year ago

It is not legal for OA to release the full LLaMa models due to Meta AI's license on LLaMa. Instead, weight deltas will be released soon; you will just need to wait a little longer.
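For context, the weight-delta idea works roughly like this: publish only the difference between the fine-tuned weights and the licensed base weights, so that only people who already have legal access to the base model can reconstruct the fine-tune. A minimal numpy sketch of the concept (the tensor names and shapes here are illustrative, not the actual release format):

```python
import numpy as np

# Illustrative tensors standing in for one layer's weights; real models
# consist of many named tensors of much larger shapes.
rng = np.random.default_rng(0)
base = rng.standard_normal((4, 4)).astype(np.float32)        # stand-in for base LLaMa weights
finetuned = base + 0.01 * rng.standard_normal((4, 4)).astype(np.float32)  # stand-in for the fine-tune

# Publisher side: release only the difference, never the base weights.
delta = finetuned - base

# User side: combine your own copy of the base weights with the delta.
reconstructed = base + delta

assert np.allclose(reconstructed, finetuned)
```

The base weights never appear in the published artifact, which is what makes this compatible with the LLaMa license terms described above.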

sancassino commented 1 year ago

Oh okay, thank you for your answer. If I may ask, as I am just starting out with development/machine learning, will you provide a README or step-by-step guide on how to work with these weight deltas and run the chatbot locally? I also feel that, for the Pythia models, it would be great if you offered more help with running them locally. :)

olliestanley commented 1 year ago

> Oh okay, thank you for your answer. If I may ask, as I am just starting out with development/machine learning, will you provide a README or step-by-step guide on how to work with these weight deltas and run the chatbot locally? I also feel that, for the Pythia models, it would be great if you offered more help with running them locally. :)

The steps for running LLaMa locally once the deltas are released will be the same as for running any other LLaMa model locally, so we will likely link to existing instructions instead of maintaining our own. If you have sufficient hardware, you can follow the instructions of an existing solution like this to run the Pythia model locally.

Daryl149 commented 1 year ago

Looks like they were temporarily available, but appear to have been taken offline? https://github.com/LAION-AI/Open-Assistant/pull/2829#discussion_r1174308331

olliestanley commented 1 year ago

> Looks like they were temporarily available, but appear to have been taken offline? #2829 (comment)

We released a version yesterday, but the decode process didn't work for everyone, so we took it offline and uploaded a new version today with a corrected decode process. It is available here

Daryl149 commented 1 year ago

Thanks! Currently converting :D