Closed lubibama closed 1 year ago
You can implement BaseLLM or BaseChatModel yourself.
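As a rough illustration of that suggestion, here is a minimal, self-contained sketch of implementing a custom model class. The interface name `BaseLLM`, the `predict` method, and the `GlmLLM` class are assumptions taken from the wording of this thread; the real library's types and signatures may differ, and a real implementation would call the model's HTTP endpoint instead of returning a stub.

```java
// Hedged sketch: BaseLLM/predict mirror the names used in this thread,
// not a confirmed library API.
interface BaseLLM {
    String predict(String prompt);
}

// Hypothetical custom model wrapper (e.g. for glm).
class GlmLLM implements BaseLLM {
    private final String baseUrl;

    GlmLLM(String baseUrl) {
        this.baseUrl = baseUrl;
    }

    @Override
    public String predict(String prompt) {
        // A real implementation would POST the prompt to the model server
        // at baseUrl and return the completion text. A stub response keeps
        // this sketch self-contained and runnable.
        return "[glm@" + baseUrl + "] response to: " + prompt;
    }
}

public class CustomModelSketch {
    public static void main(String[] args) {
        BaseLLM llm = new GlmLLM("http://localhost:8000");
        System.out.println(llm.predict("What is the capital of China?"));
    }
}
```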
@lubibama Sorry, I don't have an environment for glm or llama2, which prevents me from testing. Could you check whether Ollama meets your requirements?
Ollama (running llama2) is now supported; see OllamaExample:
```java
public static void main(String[] args) {
    var llm = Ollama.builder()
            .baseUrl("http://localhost:11434")
            .model("llama2")
            .temperature(0f)
            .build()
            .init();

    var result = llm.predict("What is the capital of China?");
    // The capital of China is Beijing.
    System.out.println(result);
}
```
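Under the hood, a client like the one above talks to Ollama's local REST API (`POST /api/generate` on port 11434). The sketch below builds that request body with only the JDK; it is an illustration, not the library's actual code, and the send step is left commented out because it needs a running Ollama server.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class OllamaRawRequest {
    // Build the JSON body for Ollama's /api/generate endpoint.
    // stream=false asks for a single complete response instead of chunks.
    static String generateBody(String model, String prompt) {
        return String.format(
                "{\"model\":\"%s\",\"prompt\":\"%s\",\"stream\":false}",
                model, prompt);
    }

    public static void main(String[] args) {
        String body = generateBody("llama2", "What is the capital of China?");
        System.out.println(body);

        // Requires a running Ollama server at localhost:11434; uncomment to try:
        // HttpClient client = HttpClient.newHttpClient();
        // HttpRequest req = HttpRequest.newBuilder()
        //         .uri(URI.create("http://localhost:11434/api/generate"))
        //         .header("Content-Type", "application/json")
        //         .POST(HttpRequest.BodyPublishers.ofString(body))
        //         .build();
        // HttpResponse<String> resp =
        //         client.send(req, HttpResponse.BodyHandlers.ofString());
        // System.out.println(resp.body());
    }
}
```

Pointing `baseUrl` at a different server that speaks the same protocol, or swapping the request shape for another model's HTTP API, is one way to adapt this to other open-source models.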
How can other open-source models, such as glm/llama2, be supported? Do you have plans for this?