szymonrucinski opened 5 months ago
Same error here. The documentation seems to be outdated. You need to provide an access token, even though it is not used. You can just set it to a dummy value and it will work.
```json
"endpoints": [
  {
    "url": "http://127.0.0.1:8080",
    "type": "llamacpp",
    "accessToken": "abc"
  }
]
```
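Since the field is required by the schema even though the local server never checks it, a quick sanity check on the config can catch the omission before startup. This is a minimal sketch, not part of chat-ui itself; the `check_endpoints` helper is hypothetical and only illustrates the rule described above (every `llamacpp` endpoint must carry a non-empty `accessToken`, even a dummy one).

```python
import json

def check_endpoints(config_json: str) -> list[str]:
    """Hypothetical helper: return a list of problems found in an
    endpoints array, flagging llamacpp entries missing accessToken."""
    problems = []
    for i, ep in enumerate(json.loads(config_json)):
        # chat-ui appears to require accessToken even for local servers
        # that ignore it, so a dummy value like "abc" is enough.
        if ep.get("type") == "llamacpp" and not ep.get("accessToken"):
            problems.append(f"endpoint {i} ({ep.get('url')}) is missing accessToken")
    return problems

config = """
[
  {"url": "http://127.0.0.1:8080", "type": "llamacpp", "accessToken": "abc"}
]
"""
print(check_endpoints(config))  # → []
```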
See this line.
This seems like a minor bug to me.
I think I hardcoded my bearer token; fixing this would make a good first PR.
I keep getting this error after adding a LLAMA-CPP inference endpoint locally. Adding this line causes the error, and I'm not sure how to fix it.
Full Config: