Closed: zmccormick7 closed this issue 5 months ago
I'm not sure what the best way to do this is, because there are many ways to run LLMs locally, but Llama 3-8B should be a great choice for AutoContext.
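As one possible approach, here is a minimal sketch of calling a locally served Llama 3-8B through Ollama's HTTP API, using only the Python standard library. The endpoint URL, model tag, and the idea of using it for AutoContext summaries are assumptions for illustration, not something specified in this issue.

```python
import json
import urllib.request

# Assumptions: Ollama is running locally on its default port and has
# pulled the "llama3:8b" model (`ollama pull llama3:8b`).
OLLAMA_URL = "http://localhost:11434/api/generate"
MODEL = "llama3:8b"

def build_request(prompt: str) -> urllib.request.Request:
    """Build a non-streaming generation request for the local Ollama server."""
    payload = json.dumps({
        "model": MODEL,
        "prompt": prompt,
        "stream": False,  # return one JSON object instead of a stream
    }).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )

def generate(prompt: str) -> str:
    """Send the request and return the model's text response."""
    with urllib.request.urlopen(build_request(prompt)) as resp:
        return json.load(resp)["response"]
```

Any other local runner with an HTTP API (llama.cpp's server, vLLM, LM Studio) could be swapped in the same way by changing the URL and payload shape.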