langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications
https://python.langchain.com
MIT License

HuggingFaceEndpoint requires a HuggingFace API key even when using self hosted models #19685

Closed chris-chipstack closed 1 month ago

chris-chipstack commented 4 months ago

Checked other resources

Example Code

import os

from langchain_community.llms import HuggingFaceEndpoint

llm = HuggingFaceEndpoint(
    endpoint_url=os.environ.get("TGI_API_URL"),
    streaming=True,
)

Error Message and Stack Trace (if applicable)

  File "/home/chris/Code/server/app/clients/llm_client.py", line 90, in get_text_generation_inference_llm_client
    llm=HuggingFaceEndpoint(
        ^^^^^^^^^^^^^^^^^^^^
  File "/home/chris/Code/server/venv/lib/python3.12/site-packages/langchain_core/load/serializable.py", line 120, in __init__
    super().__init__(**kwargs)
  File "/home/chris/Code/server/venv/lib/python3.12/site-packages/pydantic/v1/main.py", line 341, in __init__
    raise validation_error
pydantic.v1.error_wrappers.ValidationError: 1 validation error for HuggingFaceEndpoint
__root__
  Could not authenticate with huggingface_hub. Please check your API token. (type=value_error)

Description

I am using a self-hosted model served by a text-generation-inference container, and I was looking to migrate away from HuggingFaceTextGenInference since it is marked as deprecated. Unfortunately, HuggingFaceEndpoint's validator errors out even though I am hitting a self-hosted container. The deprecated class works here because it does not naively check the environment for Hugging Face credentials.
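For context, the validator that raises this error behaves roughly like the sketch below. This is a simplified stand-in, not the actual langchain_community source (the real validator attempts a huggingface_hub login); the point it illustrates is that a token is demanded regardless of whether endpoint_url points at a self-hosted server.

```python
import os


def validate_environment(values: dict) -> dict:
    """Simplified sketch of the root validator's behavior (hypothetical,
    for illustration): it looks for a token in kwargs or the environment
    and raises without ever considering endpoint_url."""
    token = values.get("huggingfacehub_api_token") or os.environ.get(
        "HUGGINGFACEHUB_API_TOKEN"
    )
    # Stand-in for the real huggingface_hub authentication call.
    if not token:
        raise ValueError(
            "Could not authenticate with huggingface_hub. "
            "Please check your API token."
        )
    return values
```

So even with a reachable self-hosted endpoint, construction fails whenever no token can be found.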

System Info

System Information
------------------
> OS:  Linux
> OS Version:  #34-Ubuntu SMP PREEMPT_DYNAMIC Mon Feb  5 18:29:21 UTC 2024
> Python Version:  3.12.2 (main, Mar 14 2024, 15:39:50) [GCC 11.4.0]

Package Information
-------------------
> langchain_core: 0.1.33
> langchain: 0.1.12
> langchain_community: 0.0.29
> langsmith: 0.1.31
> langchain_text_splitters: 0.0.1

Packages not installed (Not Necessarily a Problem)
--------------------------------------------------
The following packages were not found:

> langgraph
> langserve
elfailali commented 1 month ago

Try adding your Hugging Face token, huggingfacehub_api_token="hf_...":

llm = HuggingFaceEndpoint(
    repo_id=repo_id,
    max_length=128,
    temperature=0.5,
    huggingfacehub_api_token="hf_...",
)

To get your HF token: https://huggingface.co/settings/tokens

chris-chipstack commented 1 month ago

@elfailali The point is that when using a locally hosted model, there is no reason I should need a token, since there is no dependency on Hugging Face.

https://github.com/langchain-ai/langchain/pull/22365 fixed this, though. Thank you @mirkenstein