Closed prateekvyas1996 closed 2 months ago
Hey @prateekvyas1996! I'm here to help you out. I'm a bot designed to assist with bugs, questions, and contributions while you wait for a human maintainer. Let's tackle this issue together!
To resolve the error "Failed to resolve model_id: Could not find model id for inference server," ensure that the `endpoint_url` parameter is correctly specified when creating the `HuggingFaceEndpoint` instance. The `endpoint_url` should include a valid model ID that exists on the Hugging Face inference server.

Here is an example of how to instantiate `HuggingFaceEndpoint`:
```python
from langchain_community.llms import HuggingFaceEndpoint

llm = HuggingFaceEndpoint(
    endpoint_url="https://api-inference.huggingface.co/models/mistralai/Mistral-7B-Instruct-v0.2",
    task="text-generation",
    huggingfacehub_api_token="your-api-token",  # replace with your actual token
    model_kwargs={},  # optional extra model parameters
)
```
Make sure that the `endpoint_url` you pass to the `build` method includes the actual model ID. If you do not provide a valid model ID in the `endpoint_url`, it could be causing the error.
Example usage:

```python
component = HuggingFaceEndpointsComponent()
result = component.build(
    input_value="Your input text here",
    endpoint_url="https://api-inference.huggingface.co/models/mistralai/Mistral-7B-Instruct-v0.2",
    task="text-generation",
    huggingfacehub_api_token="your-api-token",
)
```
Ensure that `"your-api-token"` is replaced with your actual API token.
Additionally, verify that your Hugging Face API token has the necessary permissions to access the model endpoint. If the token lacks the required permissions, you might encounter this error.
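One common slip worth ruling out first: Hugging Face user access tokens start with the `hf_` prefix, so a leftover placeholder like `"your-api-token"` will never authenticate. A minimal local format check (illustrative only; it does not verify the token's permissions against the server):

```python
def looks_like_hf_token(token: str) -> bool:
    # Hugging Face user access tokens begin with "hf_"; this only checks
    # the format, not whether the token is valid or has endpoint access.
    return token.startswith("hf_") and len(token) > len("hf_")

print(looks_like_hf_token("your-api-token"))  # prints False
```

To verify actual access, try the token against the model page or the `huggingface_hub` client directly.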
If the issue persists, you can also check the `model_kwargs` parameter to ensure that any additional configuration options are correctly set. This can help resolve model ID issues by providing additional parameters to the model [1][2].
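For example, generation options can be passed through `model_kwargs` as a plain dict. The keys below are illustrative; which keys are accepted depends on the model and the task:

```python
# Illustrative extra options for a text-generation task; supported keys
# depend on the model and the inference server, so treat these as examples.
model_kwargs = {
    "max_new_tokens": 256,
    "temperature": 0.7,
}
```

This dict is then passed as `model_kwargs=model_kwargs` when constructing the component.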
To continue talking to Dosu, mention @dosu.
Hello @prateekvyas1996, Sorry for the delay. Did you try using the new version? Does the error still persist?
Hi @prateekvyas1996, I hope you're doing well. Just a friendly reminder that if we do not hear back from you within the next 3 days, we will close this issue. If you need more time or further assistance, please let us know.
Thank you for your understanding!
Thank you for your contribution! This issue will be closed. If you have any questions or encounter another problem, please open a new issue and we will be ready to assist you.
Got error:

> Error Building Component. Error building vertex Hugging Face API: Failed to resolve model_id: Could not find model id for inference server: https://api-inference.huggingface.co/models/mistralai/Mistral-7B-Instruct-v0.2. Make sure that your Hugging Face token has access to the endpoint.

I put the following arguments:

- Endpoint url: https://api-inference.huggingface.co/models/mistralai/Mistral-7B-Instruct-v0.2
- Task: text-generation
- API token:
Still getting errors.
Originally posted by @prateekvyas1996 in https://github.com/langflow-ai/langflow/discussions/1743#discussioncomment-9662722