[Open] colinljy opened this issue 1 year ago
Hi, just wondering if it is possible to do something like this on a local machine, maybe with LLaMA or another LLM that runs offline, without making an API call?

---

Yes, it's possible, but it would require a lot of refactoring. An easier route could be connecting to another local endpoint service, e.g. oobabooga.

https://www.reddit.com/r/LocalLLaMA/comments/15fxron/best_llama2_model_for_storytelling/
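For anyone landing here, a minimal sketch of the local-endpoint idea, assuming oobabooga's text-generation-webui is running with its OpenAI-compatible API enabled (recent versions serve it at `http://127.0.0.1:5000/v1` when launched with `--api`; the port, model name, and prompt below are assumptions, not taken from this project):

```python
# Sketch: talk to a local text-generation-webui server through the
# openai client (openai>=1.0) by overriding base_url. The server
# answers with whatever model is currently loaded in the web UI.
from openai import OpenAI

client = OpenAI(
    base_url="http://127.0.0.1:5000/v1",  # local endpoint instead of api.openai.com
    api_key="not-needed",                 # local servers typically ignore the key
)

response = client.chat.completions.create(
    model="local-model",  # placeholder; the loaded model is used regardless
    messages=[{"role": "user", "content": "Write the opening line of a story."}],
)
print(response.choices[0].message.content)
```

Because the server speaks the OpenAI wire format, code already written against the `openai` client usually only needs the `base_url` swapped, which would avoid most of the refactoring mentioned above.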