mshumer / gpt-author

MIT License

Is it possible to run other LLMs such as LLaMA locally? #12

Open colinljy opened 1 year ago

colinljy commented 1 year ago

Hi, just wondering if it is possible to do something like this on a local machine, maybe with LLaMA or another LLM that runs offline, without using an API call?

fuleinist commented 9 months ago

Yes, it's possible, but it would require a lot of refactoring. It could be easier to connect to a local endpoint service instead, e.g. oobabooga.
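For reference, here's a minimal sketch of what that could look like, assuming the local server (e.g. oobabooga's OpenAI-compatible API extension) is running and exposes an OpenAI-style endpoint. The URL, port, and model name below are placeholders, not values taken from gpt-author itself; adjust them to match your setup.

```python
# Hedged sketch: point the OpenAI Python client (v1+) at a local
# OpenAI-compatible server instead of api.openai.com.
from openai import OpenAI

client = OpenAI(
    base_url="http://127.0.0.1:5000/v1",  # assumed local endpoint; check your server's port
    api_key="not-needed",                 # local servers typically ignore the key
)

response = client.chat.completions.create(
    model="local-model",  # placeholder; whichever model your local server has loaded
    messages=[
        {"role": "user", "content": "Write the opening paragraph of a fantasy novel."},
    ],
)
print(response.choices[0].message.content)
```

With an approach like this, the main work in gpt-author would be routing its existing OpenAI calls through a client configured with the local base URL, rather than rewriting the generation logic itself.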

https://www.reddit.com/r/LocalLLaMA/comments/15fxron/best_llama2_model_for_storytelling/