Closed czhcc closed 4 months ago
To use your custom Embedding model when creating a knowledge base in Dify version 0.6.8, follow these steps:
Deploy LocalAI:
$ git clone https://github.com/go-skynet/LocalAI
$ cd LocalAI/examples/langchain-chroma
$ wget https://huggingface.co/skeskinen/ggml/resolve/main/all-MiniLM-L6-v2/ggml-model-q4_0.bin -O models/bert
$ wget https://gpt4all.io/models/ggml-gpt4all-j.bin -O models/ggml-gpt4all-j
Rename the .env.example file to .env, then build and start the containers:
$ mv .env.example .env
$ docker-compose up -d --build
$ docker logs -f langchain-chroma-api-1
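Once the containers are running, you can sanity-check the embedding endpoint before wiring it into Dify. LocalAI exposes an OpenAI-compatible API, so a request like the one below should return an embedding vector; the model name text-embedding-ada-002 is the alias this example setup maps to the downloaded bert model, so adjust it if your models/ directory is configured differently:

```shell
# Sketch: query LocalAI's OpenAI-compatible embeddings endpoint.
# Assumes LocalAI listens on 127.0.0.1:8080 and that
# "text-embedding-ada-002" is a configured model alias.
curl http://127.0.0.1:8080/v1/embeddings \
  -H "Content-Type: application/json" \
  -d '{
    "model": "text-embedding-ada-002",
    "input": "A test sentence for the embedding model"
  }'
```

If this returns a JSON body with an "embedding" array rather than an error, the model is being served correctly and the same URL and model name can be entered in Dify.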
Integrate the Models into Dify:
Go to Settings > Model Providers > LocalAI in Dify. Enter the model name (e.g. text-embedding-ada-002) and the server URL http://127.0.0.1:8080 (or the appropriate host domain if Dify itself runs in Docker).
Set the Default Model:
Go to Settings > Model Provider and select your LocalAI embedding model as the default Embedding model.
Use the Model in Your Application:
When creating a knowledge base, choose your custom Embedding model in the index settings.
By following these steps, you can integrate and use your custom Embedding model in Dify version 0.6.8 [1][2].
To continue the conversation, mention @dosu.
Self Checks
Dify version
0.6.8
Cloud or Self Hosted
Self Hosted (Source)
Steps to reproduce
I am running locally and have already created an Embedding model in LocalAI, but I cannot select it when creating a knowledge base. How can I use my own Embedding model when creating a knowledge base?
✔️ Expected Behavior
No response
❌ Actual Behavior
No response