mudler / LocalAI

:robot: The free, Open Source alternative to OpenAI, Claude and others. Self-hosted and local-first. Drop-in replacement for OpenAI, running on consumer-grade hardware. No GPU required. Runs gguf, transformers, diffusers and many more models architectures. Features: Generate Text, Audio, Video, Images, Voice Cloning, Distributed inference
https://localai.io
MIT License

Can we support the chatglm2-6b LLM? #737

Open zhengxiang5965 opened 1 year ago

zhengxiang5965 commented 1 year ago

ChatGLM2-6B is distributed as multiple sharded model files. [image]

Aisuko commented 1 year ago

Hi, @zhengxiang5965, can we make sure their model's license is good for use? The License under Apache-2.0 or MIT is more flexible for us.

localai-bot commented 1 year ago

:warning::warning::warning::warning::warning:

Hi! I'm a bot running with LocalAI ( a crazy experiment of @mudler ) - please beware that I might hallucinate sometimes!

But... I can also be funny or helpful :smile_cat:, and I can generally provide good tips or point out places to look in the documentation or code, based on what you wrote in the issue.

Don't engage in conversation with me, I don't support (yet) replying!

:warning::warning::warning::warning::warning:

I can see that you are having issues with the LocalAI open source project. I am sorry to hear that you are experiencing difficulties. Can you please provide more information about the issue you are facing? What specific error message are you getting? Have you tried any troubleshooting steps so far?


tianxiemaochiyu commented 1 year ago

> Hi, @zhengxiang5965, can we make sure their model's license is good for use? The License under Apache-2.0 or MIT is more flexible for us.

Hi @Aisuko, if LocalAI encounters sharded model files, how can it load them directly? Currently the documentation only seems to provide examples for loading a single model file.
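For context on the sharding question: Hugging Face-format checkpoints like ChatGLM2-6B ship an index JSON (e.g. `pytorch_model.bin.index.json`) alongside the shard files, mapping every tensor name to the shard that contains it. Loaders such as `transformers.AutoModel.from_pretrained` read this index and pull in the shards transparently, so a transformers-based backend usually only needs the model directory, not the individual files. The snippet below is a minimal sketch of how such an index is structured; the tensor names and sizes are illustrative assumptions, not copied from the real checkpoint:

```python
import json

# Hypothetical excerpt of a sharded-checkpoint index file
# ("pytorch_model.bin.index.json"); tensor names are illustrative.
# weight_map tells the loader which shard file holds each tensor,
# so the directory can be loaded as one logical model.
index = {
    "metadata": {"total_size": 12487168064},
    "weight_map": {
        "transformer.embedding.word_embeddings.weight": "pytorch_model-00001-of-00007.bin",
        "transformer.encoder.layers.0.self_attention.query_key_value.weight": "pytorch_model-00001-of-00007.bin",
        "transformer.output_layer.weight": "pytorch_model-00007-of-00007.bin",
    },
}

def shards_needed(index: dict, tensor_names: list[str]) -> set[str]:
    """Return the set of shard files required to load the given tensors."""
    return {index["weight_map"][name] for name in tensor_names}

# A loader resolves each tensor to its shard via the index:
print(shards_needed(index, ["transformer.output_layer.weight"]))
# {'pytorch_model-00007-of-00007.bin'}
```

In other words, "fragmented" files are not loaded one by one by the user; the index file is the single entry point and the backend resolves the shards itself.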