Mintplex-Labs / anything-llm

The all-in-one Desktop & Docker AI application with built-in RAG, AI agents, and more.
https://anythingllm.com
MIT License

[FEAT]: Enable multi-file model import #2396

Status: Open · opened by mtomas7 2 months ago

mtomas7 commented 2 months ago

What would you like to see?

These days, some organizations split their model files (presumably to make downloads easier); e.g., Qwen2.5 is split into 3.5 GB parts. I could not import this kind of model; perhaps I just didn't know how?

Thank you!
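For context, split GGUF shards typically follow the llama.cpp naming convention `<name>-00001-of-00003.gguf`. A minimal sketch of how an importer could group such filenames and check that every shard of a model is present (the function name and filenames below are hypothetical, not from AnythingLLM or the issue):

```python
import re
from collections import defaultdict

# Matches the llama.cpp split-GGUF naming convention:
# "<base>-00001-of-00003.gguf"
SHARD_RE = re.compile(r"^(?P<base>.+)-(?P<idx>\d{5})-of-(?P<total>\d{5})\.gguf$")

def find_complete_shard_sets(filenames):
    """Return {base_name: ordered shard list} for complete shard sets only."""
    groups = defaultdict(dict)
    for name in filenames:
        m = SHARD_RE.match(name)
        if m:
            groups[m.group("base")][int(m.group("idx"))] = name
    complete = {}
    for base, shards in groups.items():
        # Read the declared shard count from any member of the group.
        total = int(SHARD_RE.match(next(iter(shards.values()))).group("total"))
        if set(shards) == set(range(1, total + 1)):
            complete[base] = [shards[i] for i in sorted(shards)]
    return complete

# Hypothetical directory listing: one complete set, one incomplete set.
files = [
    "qwen2.5-7b-instruct-q4_k_m-00001-of-00002.gguf",
    "qwen2.5-7b-instruct-q4_k_m-00002-of-00002.gguf",
    "other-model-00001-of-00003.gguf",  # shards 2 and 3 are missing
]
print(find_complete_shard_sets(files))
```

Only the complete Qwen2.5 set is returned; the incomplete `other-model` set is skipped, which is the behavior an importer would want before attempting a load.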

timothycarambat commented 2 months ago

Cannot be supported until https://github.com/ollama/ollama/issues/5245 is closed & merged

mtomas7 commented 2 months ago

For the time being, we can use this workaround: https://github.com/ollama/ollama/issues/5245#issuecomment-2305577747
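Assuming the workaround amounts to merging the shards back into a single file before importing, llama.cpp ships a `gguf-split` tool with a merge mode that can do this, provided the shards were produced in its split format. A hedged sketch (binary name and filenames are illustrative, not taken from the linked comment):

```shell
# Requires a llama.cpp build that includes the gguf-split tool.
# Point the merge at the first shard; the tool finds the remaining
# shards via the -NNNNN-of-NNNNN naming convention and writes one
# combined .gguf that can then be imported as a single-file model.
llama-gguf-split --merge \
  qwen2.5-7b-instruct-q4_k_m-00001-of-00002.gguf \
  qwen2.5-7b-instruct-q4_k_m.gguf
```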