langchain-ai / langchain

šŸ¦œšŸ”— Build context-aware reasoning applications
https://python.langchain.com

ChatHuggingFace does not implement `bind_tools` #21352

Open Travis-Barton opened 4 months ago

Travis-Barton commented 4 months ago


Example Code

from langchain_community.llms import HuggingFaceEndpoint
from langchain_community.chat_models import ChatHuggingFace

llm = HuggingFaceEndpoint(
    repo_id="meta-llama/Meta-Llama-3-8B-Instruct",
    task="text-generation",
    max_new_tokens=1000,
    top_k=30,
    temperature=0.1,
    repetition_penalty=1.03,
    huggingfacehub_api_token=get_secrets()['llm_api_keys']['HUGGINGFACE_API_TOKEN'],  # get_secrets() is the reporter's own secrets helper
)
chat_model = ChatHuggingFace(llm=llm)  # returned from a helper function in the original snippet

Error Message and Stack Trace (if applicable)

Traceback (most recent call last):
  File "/Applications/PyCharm.app/Contents/plugins/python/helpers/pydev/pydevconsole.py", line 364, in runcode
    coro = func()
  File "", line 15, in
  File "/Users/travisbarton/opt/anaconda3/envs/TriviaGPT_dashboards_and_cloud_functions/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 912, in bind_tools
    raise NotImplementedError()
NotImplementedError

Description

The NotImplementedError leaves us with no way to use this chat model and return structured output. It also breaks our back end, which requires the model to be a ChatModel. Is there any plan to implement bind_tools for ChatHuggingFace?
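A minimal sketch of the kind of call that hits this error, assuming a toy pydantic tool schema (`WeatherQuery` below is a hypothetical placeholder, not from the original report):

```python
from langchain_community.llms import HuggingFaceEndpoint
from langchain_community.chat_models import ChatHuggingFace
from langchain_core.pydantic_v1 import BaseModel, Field


class WeatherQuery(BaseModel):
    """Hypothetical tool schema, used only to illustrate the failure."""

    city: str = Field(description="City to look up")


llm = HuggingFaceEndpoint(
    repo_id="meta-llama/Meta-Llama-3-8B-Instruct",
    task="text-generation",
)
chat = ChatHuggingFace(llm=llm)

# ChatHuggingFace in langchain_community 0.0.36 does not override bind_tools,
# so this falls through to BaseChatModel.bind_tools and raises NotImplementedError.
chat.bind_tools([WeatherQuery])
```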

System Info

System Information

OS: Darwin
OS Version: Darwin Kernel Version 23.4.0: Fri Mar 15 00:12:49 PDT 2024; root:xnu-10063.101.17~1/RELEASE_ARM64_T6020
Python Version: 3.10.14 (main, Mar 21 2024, 11:24:58) [Clang 14.0.6 ]

Package Information

langchain_core: 0.1.50
langchain: 0.1.17
langchain_community: 0.0.36
langsmith: 0.1.53
langchain_anthropic: 0.1.11
langchain_groq: 0.1.3
langchain_openai: 0.1.5
langchain_text_splitters: 0.0.1

Packages not installed (Not Necessarily a Problem)

The following packages were not found:

langgraph
langserve

shakibaT commented 4 months ago

I have the same error:

NotImplementedError                       Traceback (most recent call last)
Cell In[113], line 69
     34 primary_assistant_prompt = ChatPromptTemplate.from_messages(
     35     [
     36         (
    (...)
     46     ]
     47 ).partial(time=datetime.now())
     49 part_1_tools = [
     50     TavilySearchResults(max_results=1),
     51     fetch_user_flight_information,
    (...)
     67     cancel_excursion,
     68 ]
---> 69 part_1_assistant_runnable = primary_assistant_prompt | llm.bind_tools(part_1_tools)

File c:\nlp\GenAI_course\langGraph\venv\Lib\site-packages\langchain_core\language_models\chat_models.py:912, in BaseChatModel.bind_tools(self, tools, **kwargs)
    907 def bind_tools(
    908     self,
    909     tools: Sequence[Union[Dict[str, Any], Type[BaseModel], Callable, BaseTool]],
    910     **kwargs: Any,
    911 ) -> Runnable[LanguageModelInput, BaseMessage]:
--> 912     raise NotImplementedError()

NotImplementedError:

saptarshi091 commented 3 months ago

I think this has been solved. Try updating to the latest version.
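For anyone landing here, a rough sketch of what the updated path might look like, assuming the newer langchain-huggingface partner package (where ChatHuggingFace does implement bind_tools); the GetWeather schema and the invocation below are hypothetical examples, not from this thread:

```python
from langchain_huggingface import ChatHuggingFace, HuggingFaceEndpoint
from langchain_core.pydantic_v1 import BaseModel, Field


class GetWeather(BaseModel):
    """Hypothetical tool schema for illustration."""

    city: str = Field(description="City to look up")


llm = HuggingFaceEndpoint(
    repo_id="meta-llama/Meta-Llama-3-8B-Instruct",
    task="text-generation",
    max_new_tokens=512,
)
chat = ChatHuggingFace(llm=llm)

# The partner package overrides bind_tools, so this should no longer raise
# NotImplementedError (tool-call quality still depends on the model and endpoint).
chat_with_tools = chat.bind_tools([GetWeather])
response = chat_with_tools.invoke("What is the weather in Paris?")
print(response.tool_calls)
```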