ronneldavis opened 2 weeks ago
I found some similar issues that might help you:
- "Currently, it is not possible to pass headers when creating a RemoteRunnable": this issue is closed, and the solution is now documented here [1].
- "Custom headers for Ollama / ChatOllama": this issue was closed by a pull request that allows custom headers on Ollama / ChatOllama / OllamaEmbeddings [2].
To set custom headers in `ChatOllama` using LangChain.js, pass a `headers` object through the `clientOptions` option. Here is an example:
```typescript
import { ChatOllama } from "@langchain/ollama";

const model = new ChatOllama({
  model: "your-model-name",
  maxTokens: 1024,
  clientOptions: {
    headers: {
      "X-Api-Key": process.env.OLLAMA_API_KEY,
      "Custom-Header": "YourCustomHeaderValue",
    },
  },
});

const res = await model.invoke("Your query here");
console.log(res);
```
In this code snippet, replace `"your-model-name"` with the actual model name you are using, and add any custom headers you need in the `headers` object. This ensures that your custom headers are included in the requests made by the `ChatOllama` instance [3][4].
It appears the Ollama SDK doesn't support passing headers through to its requests. I've sent a message to the Ollama team asking about the best way to go about this, and will update you when I hear back.
Thanks @bracesproul! Looking forward to hearing back from you 👍🏼
@bracesproul I have been able to create a PR for ollama-js that adds optional headers: https://github.com/ollama/ollama-js/pull/138
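Until header support lands in the SDK, one possible workaround is to inject headers via a custom `fetch` implementation, since ollama-js accepts a `fetch` option in its client config. This is a hedged sketch, not a confirmed API: the `withHeaders` helper is hypothetical, and whether `ChatOllama` forwards `clientOptions.fetch` to the underlying ollama-js client should be verified against your installed versions.

```typescript
// A fetch-compatible function type (matches the shape ollama-js expects
// for its `fetch` config option, per its Config interface).
type FetchLike = (input: any, init?: any) => Promise<any>;

// Hypothetical helper: wraps a fetch implementation so that every
// outgoing request carries `extraHeaders`, merged over any headers the
// caller already set on that request.
function withHeaders(
  baseFetch: FetchLike,
  extraHeaders: Record<string, string>
): FetchLike {
  return (input, init = {}) => {
    // Merge: request-specific headers first, custom headers win on conflict.
    const headers = { ...(init.headers ?? {}), ...extraHeaders };
    return baseFetch(input, { ...init, headers });
  };
}

// Hypothetical usage (assumes clientOptions.fetch is passed through):
// const model = new ChatOllama({
//   model: "llama3",
//   clientOptions: {
//     fetch: withHeaders(fetch, {
//       "X-Api-Key": process.env.OLLAMA_API_KEY ?? "",
//     }),
//   },
// });
```

Note that plain-object merging only handles header literals; if your code passes a `Headers` instance, you would need to normalize it first.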
Description
I am trying to set headers in ChatOllama. I see that this had been fixed for the community library, but since ChatOllama was split out into its own package, I no longer see an option to set custom headers.