Closed remisharrock closed 3 months ago
I will set two default models for Ollama in the default config, but you are free to change them as below:

```ts
model: 'nous-hermes2',
embedModel: 'all-minilm'
```

For example:

```ts
const ai = axAI('ollama', { model: 'nous-hermes2', embedModel: 'mxbai-embed-large' });
```
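The default-plus-override behavior described above can be sketched as a plain options merge. The `OllamaConfig` interface and `withDefaults` helper below are illustrative assumptions, not the library's actual types:

```typescript
// Hypothetical shape of the Ollama options discussed in this thread.
interface OllamaConfig {
  model: string;      // chat/completion model
  embedModel: string; // embedding model
}

// Defaults along the lines of what the maintainer mentions setting.
const ollamaDefaults: OllamaConfig = {
  model: 'nous-hermes2',
  embedModel: 'all-minilm',
};

// Merge user overrides over the defaults, as a caller would when passing
// options like { embedModel: 'mxbai-embed-large' } to axAI('ollama', ...).
function withDefaults(overrides: Partial<OllamaConfig>): OllamaConfig {
  return { ...ollamaDefaults, ...overrides };
}

const config = withDefaults({ embedModel: 'mxbai-embed-large' });
console.log(config.model);      // default kept: 'nous-hermes2'
console.log(config.embedModel); // overridden: 'mxbai-embed-large'
```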
I'm submitting a ...

- [ ] bug report
- [ ] feature request
- [ ] question about the decisions made in the repository
- [x] question about how to use this project
Summary

I modified the example `vectordb.ts` to use Ollama:

but got this error:
Other information (e.g. detailed explanation, stack traces, related issues, suggestions how to fix, links for us to have context, eg. StackOverflow, personal fork, etc.)
Indeed, in Ollama I have only one model installed: 'llama3:latest'. Should I install an embedding model, as explained here: https://ollama.com/blog/embedding-models?
But then, how do I configure this model here:
Thanks for the help!