zhengxiang5965 opened this issue 1 year ago
Hi, @zhengxiang5965, can we make sure their model's license is good for use? A license under Apache-2.0 or MIT is more flexible for us.
Hi! I'm a bot running with LocalAI (a crazy experiment of @mudler) - please beware that I might hallucinate sometimes!
_but.... I can also be funny or helpful :smilecat: and I can provide generally good tips or places to look in the documentation or in the code based on what you wrote in the issue.
Don't engage in conversation with me, I don't support (yet) replying!
I can see that you are having issues with the LocalAI open source project. I am sorry to hear that you are experiencing difficulties. Can you please provide more information about the issue you are facing? What specific error message are you getting? Have you tried any troubleshooting steps so far?
Hi @Aisuko, if LocalAI encounters a fragmented (sharded) model, how can it load the files directly? Currently, the documentation only provides examples for loading a single model file.
ChatGLM2-6B ships as multiple model shard files.
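For context on the question above: Hugging Face-style checkpoints (including ChatGLM2-6B) handle sharding with an index file (e.g. `pytorch_model.bin.index.json`) whose `weight_map` maps each tensor name to the shard file that contains it, and the loader merges the shards back into one state dict. A minimal sketch of that pattern, with hypothetical file and tensor names, using JSON stand-ins instead of real tensor files:

```python
import json
import os
import tempfile

def load_sharded_state_dict(model_dir, index_name="model.index.json"):
    """Merge a sharded checkpoint: the index maps tensor names to shard files."""
    with open(os.path.join(model_dir, index_name)) as f:
        weight_map = json.load(f)["weight_map"]
    state_dict = {}
    # Open each distinct shard once and collect the tensors it holds.
    for shard_file in sorted(set(weight_map.values())):
        with open(os.path.join(model_dir, shard_file)) as f:
            shard = json.load(f)  # a real loader would use torch.load / safetensors here
        state_dict.update(shard)
    return state_dict

# Toy demo: two "shards" plus an index file in a temporary directory.
with tempfile.TemporaryDirectory() as d:
    with open(os.path.join(d, "model.index.json"), "w") as f:
        json.dump({"weight_map": {"layer.0.w": "shard-0.json",
                                  "layer.1.w": "shard-1.json"}}, f)
    with open(os.path.join(d, "shard-0.json"), "w") as f:
        json.dump({"layer.0.w": [1, 2]}, f)
    with open(os.path.join(d, "shard-1.json"), "w") as f:
        json.dump({"layer.1.w": [3, 4]}, f)
    merged = load_sharded_state_dict(d)
    print(sorted(merged))  # ['layer.0.w', 'layer.1.w']
```

In practice `transformers.AutoModel.from_pretrained(<model_dir>)` does this merge transparently when it finds the index file, which is why pointing a backend at the checkpoint directory (rather than a single file) is usually the way to load sharded models.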