Open LotusPhilip opened 2 weeks ago
@ZiTao-Li @FredericW @xieyxclack Please check if RAG can be supported by ollama.
I believe that's compatible. You can refer to this to compose your embedding model config, and replace it with the dashscope one.
I have already tried with ollama, but it failed with:
RuntimeError: Model Wrapper [OllamaEmbeddingWrapper] doesn't need to format the input. Please try to use the model wrapper directly.
So do you mean this example is better achieved with Dashscope?
My conjecture is that the OllamaEmbeddingWrapper is probably used as the text-generation model in your config. (The RAG example requires two models: an embedding model and a text-generation model.) Maybe you can double-check this and let us know if there is still a problem.
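To illustrate the two-model setup described above, here is a minimal sketch of what the config list might look like. The `config_name` and `model_name` values are hypothetical, and the exact `model_type` strings for the ollama wrappers may differ by AgentScope version, so please check the docs for your release:

```python
# Hypothetical AgentScope-style model configs for the RAG example.
# The config_name/model_name values below are placeholders; the
# model_type identifiers are assumptions to be checked against your
# AgentScope version's documentation.
model_configs = [
    {
        # Text-generation model: this is the one the dialog agent
        # should use for formatting and answering.
        "config_name": "my_ollama_chat",       # hypothetical name
        "model_type": "ollama_chat",
        "model_name": "llama3",
    },
    {
        # Embedding model: used only to embed knowledge chunks and
        # queries; it must NOT be assigned as the agent's chat model,
        # which is what triggers the RuntimeError above.
        "config_name": "my_ollama_embedding",  # hypothetical name
        "model_type": "ollama_embedding",
        "model_name": "nomic-embed-text",
    },
]
```

The key point is that the agent's `model_config_name` should point at the chat config, while only the knowledge/indexing component references the embedding config.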
Thanks for your help. Actually, I introduced LlamaIndexAgent as the dialog agent. However, I still have no idea how to make a dialog agent answer questions based on the embedded knowledge. Are there any tutorials or examples?
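While waiting for a pointer to the official example, the general RAG flow the agent performs can be sketched in plain Python. This is not the AgentScope/LlamaIndexAgent API, just the underlying pattern: embed the knowledge chunks once, retrieve the chunk most similar to the question, and prepend it to the prompt the chat model sees. The `embed` function here is a toy stand-in for a real ollama embedding call:

```python
import math

def embed(text: str) -> list[float]:
    # Toy bag-of-letters "embedding"; a real setup would call the
    # ollama embedding model here instead.
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

# 1. Index the knowledge base once with the embedding model.
knowledge = [
    "AgentScope supports multiple model backends.",
    "Ollama serves local models over an HTTP API.",
]
index = [(chunk, embed(chunk)) for chunk in knowledge]

# 2. At question time, retrieve the best chunk and augment the prompt.
def build_prompt(question: str) -> str:
    q_vec = embed(question)
    best_chunk = max(index, key=lambda item: cosine(q_vec, item[1]))[0]
    # The text-generation model (ollama_chat) receives this prompt.
    return f"Context: {best_chunk}\nQuestion: {question}"

prompt = build_prompt("How does ollama serve models?")
```

In the real example, steps 1 and 2 are handled by the knowledge/index component (configured with the embedding model), and the augmented prompt is what the dialog agent's chat model finally answers.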
Can RAG agent example be achieved with ollama embedding?