antoremin closed this 1 day ago
Switching from Tavily to Google search didn't solve the issue. I'm getting errors like this:
```
CancelledError()

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/langgraph/pregel/__init__.py", line 1502, in astream
    async for _ in runner.atick(
  File "/usr/local/lib/python3.11/site-packages/langgraph/pregel/runner.py", line 130, in atick
    await arun_with_retry(t, retry_policy, stream=self.use_astream)
  File "/usr/local/lib/python3.11/site-packages/langgraph/pregel/retry.py", line 102, in arun_with_retry
    await task.proc.ainvoke(task.input, config)
  File "/usr/local/lib/python3.11/site-packages/langgraph/utils/runnable.py", line 452, in ainvoke
    input = await asyncio.create_task(coro, context=context)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langgraph/utils/runnable.py", line 235, in ainvoke
    ret = await asyncio.create_task(coro, context=context)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/deps/data-enrichment/src/enrichment_agent/graph.py", line 62, in call_agent_model
    response = cast(AIMessage, await model.ainvoke(messages))
                               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langchain_core/runnables/base.py", line 5349, in ainvoke
    return await self.bound.ainvoke(
           ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 305, in ainvoke
    llm_result = await self.agenerate_prompt(
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 794, in agenerate_prompt
    return await self.agenerate(
           ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 720, in agenerate
    results = await asyncio.gather(
              ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langchain_core/language_models/chat_models.py", line 924, in _agenerate_with_cache
    result = await self._agenerate(
             ^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/langchain_anthropic/chat_models.py", line 805, in _agenerate
    data = await self._async_client.messages.create(**payload)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/anthropic/resources/messages.py", line 1811, in create
    return await self._post(
           ^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/anthropic/_base_client.py", line 1838, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/anthropic/_base_client.py", line 1532, in request
    return await self._request(
           ^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/anthropic/_base_client.py", line 1552, in _request
    self._platform = await asyncify(get_platform)()
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/anthropic/_utils/_sync.py", line 69, in wrapper
    return await anyio.to_thread.run_sync(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/anyio/to_thread.py", line 56, in run_sync
    return await get_async_backend().run_sync_in_worker_thread(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 2356, in run_sync_in_worker_thread
    await cls.checkpoint()
  File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 2264, in checkpoint
    await sleep(0)
  File "/usr/local/lib/python3.11/asyncio/tasks.py", line 640, in sleep
    await __sleep0()
  File "/usr/local/lib/python3.11/asyncio/tasks.py", line 634, in __sleep0
    yield
asyncio.exceptions.CancelledError
```
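For what it's worth, one way to debug transient `CancelledError`s like this is to bound and retry the model call inside the graph node. The sketch below is a hypothetical wrapper, not the fix that closed this issue: it assumes only that `model` exposes LangChain's `ainvoke` coroutine interface (as in the traceback's `call_agent_model`), and the function name, timeout, and backoff values are all made up for illustration.

```python
import asyncio
from typing import Any


async def ainvoke_with_retry(
    model: Any,
    messages: list,
    attempts: int = 3,
    timeout: float = 60.0,
) -> Any:
    """Retry an async model call on cancellation or timeout.

    Sketch only: `model` is assumed to expose LangChain's `ainvoke`
    coroutine interface; the names here are illustrative.
    """
    last_exc: BaseException | None = None
    for attempt in range(attempts):
        try:
            # wait_for cancels the inner call if it runs past `timeout`
            return await asyncio.wait_for(model.ainvoke(messages), timeout=timeout)
        except (asyncio.CancelledError, asyncio.TimeoutError) as exc:
            # Caution: catching CancelledError also swallows deliberate
            # task cancellation, so treat this as a debugging aid only.
            last_exc = exc
            await asyncio.sleep(2 ** attempt)  # simple exponential backoff
    raise RuntimeError(f"model call failed after {attempts} attempts") from last_exc
```

In `call_agent_model` this would replace the bare `await model.ainvoke(messages)`; since swallowing `CancelledError` can mask a graph shutdown or client disconnect, it is useful mainly to distinguish transient cancellations from reproducible ones.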
Hi @antoremin, I am Khushi, a 4th-year CS student at UofT. I'm working with my teammates @anushak18, @ashvini8, and @ssumaiyaahmed, who are also 4th-year CS students at UofT. We would like to take the initiative to work on this issue and contribute to LangChain. We're eager to help resolve the streaming errors caused by larger outputs and to share our findings.
Closing since the issue is now resolved. Feel free to reopen if it reappears!