Closed: chris-chipstack closed this issue 1 month ago
Try adding your Hugging Face token: huggingfacehub_api_token = "hf_..."
from langchain_huggingface import HuggingFaceEndpoint

llm = HuggingFaceEndpoint(
    repo_id=repo_id,  # a model id on the Hugging Face Hub
    max_length=128,
    temperature=0.5,
    huggingfacehub_api_token="hf_...",
)
To get your HF token: https://huggingface.co/settings/tokens
@elfailali The point is that when using a locally hosted model there is no reason to require a token, since there is no dependency on Hugging Face.
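The complaint can be sketched in plain Python. This is a hypothetical illustration of the two validation behaviors, not the actual LangChain validator code: `naive_validate` always demands a token, while `fixed_validate` skips the check when the endpoint URL points at a self-hosted server.

```python
import os


def naive_validate(endpoint_url=None, token=None):
    """Old behavior: unconditionally require a token (argument or env var)."""
    token = token or os.environ.get("HUGGINGFACEHUB_API_TOKEN")
    if token is None:
        raise ValueError("Hugging Face API token is required")
    return token


def fixed_validate(endpoint_url=None, token=None):
    """Fixed behavior: only require a token when hitting the hosted HF API."""
    token = token or os.environ.get("HUGGINGFACEHUB_API_TOKEN")
    hits_hf_api = endpoint_url is None or "huggingface.co" in endpoint_url
    if hits_hf_api and token is None:
        raise ValueError("Hugging Face API token is required")
    return token
```

With no token set, `fixed_validate(endpoint_url="http://localhost:8080")` succeeds while `naive_validate` raises, which is the difference the issue is about.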
https://github.com/langchain-ai/langchain/pull/22365 fixed this though, thank you @mirkenstein
Description
I am using a self-hosted model on a text-generation-inference container and was looking to update from the deprecated HuggingFaceTextGenInference class. Unfortunately, the HuggingFaceEndpoint validator errors out even though I am hitting a self-hosted container. The deprecated class works, since it doesn't naively check environment variables.

System Info