Open AtmehEsraa opened 2 months ago
Hello,
I notice you are importing `ChatOllama` from `langchain_ollama`, which is correct, but later you import and use `OllamaFunctions` from `langchain_experimental`. `OllamaFunctions` is deprecated, and you should be seeing warning messages directing you to `langchain_ollama.ChatOllama`.

Could you please use `ChatOllama`, and if this does not solve the problem, provide a minimal reproducible example that we can use to debug? This should be a short snippet that someone can copy/paste and immediately run (e.g., any tools necessary to trigger the bug should be defined, and superfluous abstractions should be cut out).
@ccurme I want to use the customer support agent example, but I am facing a lot of issues. The first one is that the tool calling enters a loop without returning a response.
Checked other resources
Example Code
```python
from datetime import datetime
import shutil
import uuid

from langchain_anthropic import ChatAnthropic
from langchain_community.tools.tavily_search import TavilySearchResults
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.runnables import Runnable, RunnableConfig
from langchain_experimental.llms.ollama_functions import OllamaFunctions
from langchain_ollama import ChatOllama
from langchain_openai import ChatOpenAI


class Assistant:
    def __init__(self, runnable: Runnable):  # was `def init`, which Python never calls
        self.runnable = runnable


llm = OllamaFunctions(model="llama3:8b-instruct-fp16", format="json", temperature=0)

primary_assistant_prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "You are a helpful customer support assistant for Swiss Airlines. "
            "Use the provided tools to search for flights, company policies, "
            "and other information to assist the user's queries. "
            "When searching, be persistent. Expand your query bounds if the "
            "first search returns no results. "
            "If a search comes up empty, expand your search before giving up."
            "\n\nCurrent user:\n\n{user_info}\n"
            "\nCurrent time: {time}.",
        ),
        ("placeholder", "{messages}"),
    ]
).partial(time=datetime.now())

part_1_tools = [
    fetch_user_flight_information,
    search_flights,
    update_ticket_to_new_flight,
    cancel_ticket,
    book_ticket,
    book_hotel,
    find_flights,
]
part_1_assistant_runnable = primary_assistant_prompt | llm.bind_tools(part_1_tools)

tutorial_questions = [
    "Is my flight from Amsterdam to Oslo on time?",
    "Can you tell me the flight status for my trip to Oslo?",
    "What time is my flight from Amsterdam scheduled to depart?",
    "When is my flight supposed to arrive in Oslo?",
]

# Copy over the backup file so we can restart from the original place in each section
shutil.copy(backup_file, db)

thread_id = str(uuid.uuid4())
configjson = {
    "configurable": {
        "passenger_id": "3952 666242",
        "thread_id": thread_id,
        "flight_id": 30575,
        "ticket_no": "",
    }
}

_printed = set()
for question in tutorial_questions:
    thread_id = str(uuid.uuid4())
    configjson["configurable"]["thread_id"] = thread_id
    events = part_1_graph.stream(
        {"messages": ("user", question)}, configjson, stream_mode="values"
    )
```
Error Message and Stack Trace (if applicable)

```
ValueError: Received unsupported message type for Ollama.
```

Description
System Info
```shell
%pip install "git+https://github.com/langchain-ai/langchain.git#egg=langchain-experimental&subdirectory=libs/experimental"
%pip install -U langchain_ollama langgraph==0.1.19 langchain-community langchain-anthropic tavily-python pandas openai langchain_openai
!curl -fsSL https://ollama.com/install.sh | sh
```