Open: minump opened this issue 8 months ago
👉 Follow the bot's progress in real time on LangSmith. (Failed to generate sharable URL; cannot find this run on LangSmith. RunID: 9feb46e4-446e-4982-9b6c-4e91e1428326.)
Error in handle_issue_opened: GitHubAction._run() missing 1 required positional argument: 'instructions'
Traceback (most recent call last):
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/github_webhook_handlers.py", line 102, in handle_issue_opened
result = bot.run(prompt)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/ml4bio_agent.py", line 44, in run
result = self.agent.with_config({"run_name": "ML4BIO Plan & Execute Agent"}).invoke({"input": f"{input}"}, {
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/schema/runnable/base.py", line 2316, in invoke
return self.bound.invoke(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 84, in invoke
return self(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 306, in __call__
raise e
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 300, in __call__
self._call(inputs, run_manager=run_manager)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/lib/python3.10/site-packages/langchain_experimental/plan_and_execute/agent_executor.py", line 56, in _call
response = self.executor.step(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/lib/python3.10/site-packages/langchain_experimental/plan_and_execute/executors/base.py", line 37, in step
response = self.chain.run(**inputs, callbacks=callbacks)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 506, in run
return self(kwargs, callbacks=callbacks, tags=tags, metadata=metadata)[
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 306, in __call__
raise e
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 300, in __call__
self._call(inputs, run_manager=run_manager)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/agents/agent.py", line 1141, in _call
next_step_output = self._take_next_step(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/agents/agent.py", line 991, in _take_next_step
observation = tool.run(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/tools/base.py", line 364, in run
raise e
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/tools/base.py", line 336, in run
self._run(*tool_args, run_manager=run_manager, **tool_kwargs)
TypeError: GitHubAction._run() missing 1 required positional argument: 'instructions'
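This failure mode likely means the agent's parsed action input contained no `instructions` key, so the tool's required positional parameter was never filled. A minimal stand-alone sketch (the `GitHubAction` body and the `PatchedGitHubAction` fix are stand-ins, not the repo's actual code):

```python
# Stand-in reproducing the failure: BaseTool.run() forwards the parsed
# action input to _run(), and with no 'instructions' key the required
# positional parameter stays unfilled.
class GitHubAction:
    def _run(self, instructions, run_manager=None):
        return f"ran: {instructions}"

tool = GitHubAction()
try:
    tool._run(run_manager=None)  # agent supplied no 'instructions'
except TypeError as err:
    msg = str(err)
    print(msg)  # ...missing 1 required positional argument: 'instructions'

# One hypothetical defensive fix: give the parameter a default so an
# empty action input degrades gracefully instead of crashing the run.
class PatchedGitHubAction(GitHubAction):
    def _run(self, instructions="", run_manager=None):
        return super()._run(instructions or "(no instructions provided)",
                            run_manager=run_manager)

print(PatchedGitHubAction()._run(run_manager=None))
```

The deeper fix would be constraining the tool's args schema so the planner always emits an `instructions` field.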
Thanks for opening a new issue! I'll now try to finish this implementation and open a PR for you to review.
You can monitor the LangSmith trace here.
Feel free to comment in this thread to give me additional instructions, or I'll tag you in a comment if I get stuck. If I think I'm successful I'll 'request your review' on the resulting PR. Just watch for emails while I work.
👉 Follow the bot's progress in real time on LangSmith. (Failed to generate sharable URL; cannot find this run on LangSmith. RunID: 2930a61c-ff42-430c-b311-f3779dc56e60.)
Error in handle_issue_opened: 'NoneType' object is not subscriptable
Traceback (most recent call last):
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/github_webhook_handlers.py", line 110, in handle_issue_opened
ray.get(post_comment.remote(issue_or_pr=issue, text=str(result['output']), time_delay_s=0))
TypeError: 'NoneType' object is not subscriptable
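Here `bot.run(prompt)` returned `None` (the agent errored out), and line 110 subscripts `result['output']` unconditionally. A guard in the handler would keep the webhook alive; `comment_text` below is a hypothetical helper, not code from `github_webhook_handlers.py`:

```python
# Guard against a failed agent run: bot.run() can return None, so
# str(result['output']) raises. Fall back to a fixed message instead.
def comment_text(result):
    if not isinstance(result, dict) or "output" not in result:
        return "Agent run failed before producing any output."
    return str(result["output"])

print(comment_text(None))                     # fallback message
print(comment_text({"output": "PR opened"}))  # PR opened
```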
👉 Follow the bot's progress in real time on LangSmith. (Failed to generate sharable URL; cannot find this run on LangSmith. RunID: ef5765e1-5b4a-43e1-bc83-6b3806d6efd4.)
Error in handle_issue_opened: load_chat_planner() got an unexpected keyword argument 'callbacks'
Traceback (most recent call last):
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/github_webhook_handlers.py", line 98, in handle_issue_opened
bot = WorkflowAgent(run_id_in_metadata=langsmith_run_id, image_name=image_name)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/ml4bio_agent.py", line 41, in __init__
self.agent = self.make_agent()
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/ml4bio_agent.py", line 55, in make_agent
planner = load_chat_planner(self.llm, system_prompt=hub.pull("kastanday/ml4bio-rnaseq-planner").format( user_info=get_user_info_string),
TypeError: load_chat_planner() got an unexpected keyword argument 'callbacks'
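`load_chat_planner` in `langchain_experimental` takes only the LLM and a `system_prompt`; it has no `callbacks` parameter, so the keyword raises. One generic defensive pattern, shown with a stand-in function (not the real import), is to filter kwargs down to what the callee accepts; in LangChain, callbacks are normally supplied at invoke time instead:

```python
import inspect

# Stand-in mirroring load_chat_planner(llm, system_prompt=...): no callbacks.
def load_chat_planner(llm, system_prompt="You are a planner."):
    return {"llm": llm, "system_prompt": system_prompt}

# Defensive helper: silently drop kwargs the callee does not accept.
def call_with_supported_kwargs(fn, *args, **kwargs):
    accepted = inspect.signature(fn).parameters
    return fn(*args, **{k: v for k, v in kwargs.items() if k in accepted})

planner = call_with_supported_kwargs(
    load_chat_planner, "fake-llm",
    system_prompt="plan the RNA-seq workflow",
    callbacks=["callback_handler"],  # dropped instead of raising TypeError
)
print(planner["system_prompt"])  # plan the RNA-seq workflow
```

The simpler direct fix is just to delete the `callbacks=` argument from the call site and pass the handler in the config when the chain is invoked.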
Error in handle_issue_opened: load_chat_planner() got an unexpected keyword argument 'callbacks'
Traceback (most recent call last):
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/github_webhook_handlers.py", line 98, in handle_issue_opened
bot = WorkflowAgent(run_id_in_metadata=langsmith_run_id, image_name=image_name)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/ml4bio_agent.py", line 41, in __init__
self.agent = self.make_agent()
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/ml4bio_agent.py", line 55, in make_agent
planner = load_chat_planner(self.llm, system_prompt=hub.pull("kastanday/ml4bio-rnaseq-planner").format( user_info=get_user_info_string))
TypeError: load_chat_planner() got an unexpected keyword argument 'callbacks'
Error in handle_issue_opened: load_chat_planner() got an unexpected keyword argument 'callbacks'
Traceback (most recent call last):
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/github_webhook_handlers.py", line 98, in handle_issue_opened
bot = WorkflowAgent(run_id_in_metadata=langsmith_run_id, image_name=image_name)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/ml4bio_agent.py", line 41, in __init__
self.agent = self.make_agent()
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/ml4bio_agent.py", line 55, in make_agent
planner = load_chat_planner(self.llm, system_prompt=hub.pull("kastanday/ml4bio-rnaseq-planner").format(user_info = get_user_info_string))
TypeError: load_chat_planner() got an unexpected keyword argument 'callbacks'
Error in handle_issue_opened: GitHubAction._run() missing 1 required positional argument: 'instructions'
Traceback (most recent call last):
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/github_webhook_handlers.py", line 99, in handle_issue_opened
result = bot.run(prompt)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/ml4bio_agent.py", line 44, in run
result = self.agent.with_config({"run_name": "ML4BIO Plan & Execute Agent"}).invoke({"input": f"{input}"}, {
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/schema/runnable/base.py", line 2316, in invoke
return self.bound.invoke(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 84, in invoke
return self(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 306, in __call__
raise e
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 300, in __call__
self._call(inputs, run_manager=run_manager)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/lib/python3.10/site-packages/langchain_experimental/plan_and_execute/agent_executor.py", line 56, in _call
response = self.executor.step(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/lib/python3.10/site-packages/langchain_experimental/plan_and_execute/executors/base.py", line 37, in step
response = self.chain.run(**inputs, callbacks=callbacks)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 506, in run
return self(kwargs, callbacks=callbacks, tags=tags, metadata=metadata)[
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 306, in __call__
raise e
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 300, in __call__
self._call(inputs, run_manager=run_manager)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/agents/agent.py", line 1141, in _call
next_step_output = self._take_next_step(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/agents/agent.py", line 991, in _take_next_step
observation = tool.run(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/tools/base.py", line 364, in run
raise e
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/tools/base.py", line 336, in run
self._run(*tool_args, run_manager=run_manager, **tool_kwargs)
TypeError: GitHubAction._run() missing 1 required positional argument: 'instructions'
👉 Follow the bot's progress in real time on LangSmith. (Failed to generate sharable URL; cannot find this run on LangSmith. RunID: 7afec1b8-8840-41b7-b16b-fbe0f274bb1f.)
Thanks for opening a new issue! I'll now try to finish this implementation and open a PR for you to review.
You can monitor the LangSmith trace here.
Feel free to comment in this thread to give me additional instructions, or I'll tag you in a comment if I get stuck. If I think I'm successful I'll 'request your review' on the resulting PR. Just watch for emails while I work.
👉 Follow the bot's progress in real time on LangSmith. (Failed to generate sharable URL; cannot find this run on LangSmith. RunID: 761f5409-5434-4f9a-9f95-b93e1597be8a.)
Error in handle_issue_opened: WorkflowAgent.custom_load_agent_executor() got multiple values for argument 'verbose'
Traceback (most recent call last):
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/github_webhook_handlers.py", line 98, in handle_issue_opened
bot = WorkflowAgent(run_id_in_metadata=langsmith_run_id, image_name=image_name)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/ml4bio_agent.py", line 59, in __init__
self.agent = self.make_agent()
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/ml4bio_agent.py", line 115, in make_agent
executor = self.custom_load_agent_executor(self.llm, tools, verbose=True, callbacks=[self.callback_handler], handle_parsing_errors=True)
TypeError: WorkflowAgent.custom_load_agent_executor() got multiple values for argument 'verbose'
👉 Follow the bot's progress in real time on LangSmith. (Failed to generate sharable URL; cannot find this run on LangSmith. RunID: e10763d4-7bd1-417d-af01-521c4bb0bed7.)
Error in handle_issue_opened: GitHubAction._run() missing 1 required positional argument: 'instructions'
Traceback (most recent call last):
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/github_webhook_handlers.py", line 99, in handle_issue_opened
result = bot.run(prompt)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/ml4bio_agent.py", line 62, in run
result = self.agent.with_config({"run_name": "ML4BIO Plan & Execute Agent"}).invoke({"input": f"{input}"}, {
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/schema/runnable/base.py", line 2316, in invoke
return self.bound.invoke(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 84, in invoke
return self(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 306, in __call__
raise e
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 300, in __call__
self._call(inputs, run_manager=run_manager)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/lib/python3.10/site-packages/langchain_experimental/plan_and_execute/agent_executor.py", line 56, in _call
response = self.executor.step(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/lib/python3.10/site-packages/langchain_experimental/plan_and_execute/executors/base.py", line 37, in step
response = self.chain.run(**inputs, callbacks=callbacks)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 506, in run
return self(kwargs, callbacks=callbacks, tags=tags, metadata=metadata)[
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 306, in __call__
raise e
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 300, in __call__
self._call(inputs, run_manager=run_manager)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/agents/agent.py", line 1141, in _call
next_step_output = self._take_next_step(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/agents/agent.py", line 991, in _take_next_step
observation = tool.run(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/tools/base.py", line 364, in run
raise e
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/tools/base.py", line 336, in run
self._run(*tool_args, run_manager=run_manager, **tool_kwargs)
TypeError: GitHubAction._run() missing 1 required positional argument: 'instructions'
Error in handle_issue_opened: unhashable type: 'ClickTool'
Traceback (most recent call last):
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/github_webhook_handlers.py", line 98, in handle_issue_opened
bot = WorkflowAgent(run_id_in_metadata=langsmith_run_id, image_name=image_name)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/ml4bio_agent.py", line 59, in __init__
self.agent = self.make_agent()
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/ml4bio_agent.py", line 108, in make_agent
tools = load_tools(get_tools(), callbacks=[self.callback_handler])
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/agents/load_tools.py", line 463, in load_tools
elif name in _BASE_TOOLS:
TypeError: unhashable type: 'ClickTool'
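`load_tools()` expects tool *names* (strings) and probes a registry dict, so `name in _BASE_TOOLS` tries to hash whatever was passed; a tool instance built on pydantic v1 is unhashable because the model defines `__eq__` without `__hash__`. A stand-in reproduction (registry contents and class body are illustrative):

```python
# Stand-in registry and tool; pydantic v1 models behave like ClickTool
# here: defining __eq__ implicitly sets __hash__ to None.
_BASE_TOOLS = {"requests_get": "loader", "llm-math": "loader"}

class ClickTool:
    def __eq__(self, other):
        return isinstance(other, ClickTool)
    __hash__ = None

def load_tools(tool_names):
    return [name for name in tool_names if name in _BASE_TOOLS]

try:
    load_tools([ClickTool()])  # instance where a string name belongs
except TypeError as err:
    msg = str(err)
    print(msg)  # unhashable type: 'ClickTool'

# Fix: get_tools() already returns instantiated tools, so hand the
# instances straight to the agent and skip the name-based loader.
tools = [ClickTool()]
```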
👉 Follow the bot's progress in real time on LangSmith. (Failed to generate sharable URL; cannot find this run on LangSmith. RunID: 631257c7-96d7-4204-a43a-115f8869191e.)
Error in handle_issue_opened: GitHubAction._run() got an unexpected keyword argument 'file_path'
Traceback (most recent call last):
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/github_webhook_handlers.py", line 99, in handle_issue_opened
result = bot.run(prompt)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/ml4bio_agent.py", line 62, in run
result = self.agent.with_config({"run_name": "ML4BIO Plan & Execute Agent"}).invoke({"input": f"{input}"}, {
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/schema/runnable/base.py", line 2316, in invoke
return self.bound.invoke(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 84, in invoke
return self(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 306, in __call__
raise e
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 300, in __call__
self._call(inputs, run_manager=run_manager)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/lib/python3.10/site-packages/langchain_experimental/plan_and_execute/agent_executor.py", line 56, in _call
response = self.executor.step(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/lib/python3.10/site-packages/langchain_experimental/plan_and_execute/executors/base.py", line 37, in step
response = self.chain.run(**inputs, callbacks=callbacks)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 506, in run
return self(kwargs, callbacks=callbacks, tags=tags, metadata=metadata)[
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 306, in __call__
raise e
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 300, in __call__
self._call(inputs, run_manager=run_manager)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/agents/agent.py", line 1141, in _call
next_step_output = self._take_next_step(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/agents/agent.py", line 991, in _take_next_step
observation = tool.run(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/tools/base.py", line 364, in run
raise e
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/tools/base.py", line 336, in run
self._run(*tool_args, run_manager=run_manager, **tool_kwargs)
TypeError: GitHubAction._run() got an unexpected keyword argument 'file_path'
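The inverse of the earlier `instructions` failure: here the planner emitted a structured action input such as `{"file_path": ...}`, but the tool's `_run` accepts no such keyword. A stand-in sketch; `TolerantGitHubAction` is a hypothetical workaround, not the repo's code:

```python
# Stand-in: _run accepts only 'instructions', so a structured action
# input with other keys raises.
class GitHubAction:
    def _run(self, instructions="", run_manager=None):
        return f"ran: {instructions}"

try:
    GitHubAction()._run(file_path="README.md")
except TypeError as err:
    msg = str(err)
    print(msg)  # ...got an unexpected keyword argument 'file_path'

# Hypothetical tolerant variant: fold unrecognized fields into the
# instruction string rather than crashing mid-plan.
class TolerantGitHubAction(GitHubAction):
    def _run(self, instructions="", run_manager=None, **extra):
        if extra:
            instructions = f"{instructions} {extra}".strip()
        return super()._run(instructions, run_manager=run_manager)

print(TolerantGitHubAction()._run(file_path="README.md"))
```

A cleaner long-term fix is declaring an explicit args schema on the tool so the planner knows which fields exist.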
👉 Follow the bot's progress in real time on LangSmith. (Failed to generate sharable URL; cannot find this run on LangSmith. RunID: 7a9e9822-083a-4d28-b772-979a113fbe4f.)
Error in handle_issue_opened: 1 validation error for GitHubAPIWrapper: callbacks: extra fields not permitted (type=value_error.extra)
Traceback (most recent call last):
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/github_webhook_handlers.py", line 98, in handle_issue_opened
bot = WorkflowAgent(run_id_in_metadata=langsmith_run_id, image_name=image_name)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/ml4bio_agent.py", line 59, in __init__
self.agent = self.make_agent()
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/ml4bio_agent.py", line 108, in make_agent
tools = get_tools(callback=self.callback_handler)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/tools.py", line 59, in get_tools
github = GitHubAPIWrapper(callbacks=[callback]) # type: ignore
File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for GitHubAPIWrapper
callbacks
extra fields not permitted (type=value_error.extra)
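`GitHubAPIWrapper` is a pydantic model that forbids unknown fields, so `callbacks=...` fails validation at construction. A plain-Python stand-in of that behavior (field names are illustrative, not the wrapper's real schema):

```python
# Stand-in for a pydantic model with extra fields forbidden: unknown
# constructor kwargs are rejected instead of silently stored.
class GitHubAPIWrapper:
    _fields = {"github_repository", "github_app_id"}  # illustrative

    def __init__(self, **data):
        extra = set(data) - self._fields
        if extra:
            raise ValueError(f"extra fields not permitted: {sorted(extra)}")
        self.__dict__.update(data)

try:
    GitHubAPIWrapper(callbacks=["callback_handler"])
except ValueError as err:
    msg = str(err)
    print(msg)

# Fix: construct the wrapper without callbacks; attach callback
# handlers to the tools or the agent executor instead.
wrapper = GitHubAPIWrapper(github_repository="owner/repo")
```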
👉 Follow the bot's progress in real time on LangSmith. (Failed to generate sharable URL; cannot find this run on LangSmith. RunID: 8f20446f-419b-4160-b3f7-d7b614176c01.)
Error in handle_issue_opened: Must provide an 'engine' or 'deployment_id' parameter to create a <class 'openai.api_resources.chat_completion.ChatCompletion'>
Traceback (most recent call last):
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/github_webhook_handlers.py", line 99, in handle_issue_opened
result = bot.run(prompt)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/ml4bio_agent.py", line 62, in run
result = self.agent.with_config({"run_name": "ML4BIO Plan & Execute Agent"}).invoke({"input": f"{input}"}, {
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/schema/runnable/base.py", line 2316, in invoke
return self.bound.invoke(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 84, in invoke
return self(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 306, in __call__
raise e
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 300, in __call__
self._call(inputs, run_manager=run_manager)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/lib/python3.10/site-packages/langchain_experimental/plan_and_execute/agent_executor.py", line 56, in _call
response = self.executor.step(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/lib/python3.10/site-packages/langchain_experimental/plan_and_execute/executors/base.py", line 37, in step
response = self.chain.run(**inputs, callbacks=callbacks)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 506, in run
return self(kwargs, callbacks=callbacks, tags=tags, metadata=metadata)[
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 306, in __call__
raise e
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 300, in __call__
self._call(inputs, run_manager=run_manager)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/agents/agent.py", line 1141, in _call
next_step_output = self._take_next_step(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/agents/agent.py", line 991, in _take_next_step
observation = tool.run(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/tools/base.py", line 364, in run
raise e
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/tools/base.py", line 336, in run
self._run(*tool_args, run_manager=run_manager, **tool_kwargs)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/tools/vectorstore/tool.py", line 55, in _run
return chain.run(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 501, in run
return self(args[0], callbacks=callbacks, tags=tags, metadata=metadata)[
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 306, in __call__
raise e
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 300, in __call__
self._call(inputs, run_manager=run_manager)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/retrieval_qa/base.py", line 139, in _call
answer = self.combine_documents_chain.run(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 506, in run
return self(kwargs, callbacks=callbacks, tags=tags, metadata=metadata)[
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 306, in __call__
raise e
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 300, in __call__
self._call(inputs, run_manager=run_manager)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/combine_documents/base.py", line 119, in _call
output, extra_return_dict = self.combine_docs(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/combine_documents/stuff.py", line 171, in combine_docs
return self.llm_chain.predict(callbacks=callbacks, **inputs), {}
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/llm.py", line 257, in predict
return self(kwargs, callbacks=callbacks)[self.output_key]
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 306, in __call__
raise e
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/base.py", line 300, in __call__
self._call(inputs, run_manager=run_manager)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/llm.py", line 93, in _call
response = self.generate([inputs], run_manager=run_manager)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chains/llm.py", line 103, in generate
return self.llm.generate_prompt(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chat_models/base.py", line 469, in generate_prompt
return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chat_models/base.py", line 359, in generate
raise e
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chat_models/base.py", line 349, in generate
self._generate_with_cache(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chat_models/base.py", line 501, in _generate_with_cache
return self._generate(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chat_models/openai.py", line 403, in _generate
response = self.completion_with_retry(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chat_models/openai.py", line 282, in completion_with_retry
return _completion_with_retry(**kwargs)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/lib/python3.10/site-packages/tenacity/__init__.py", line 289, in wrapped_f
return self(f, *args, **kw)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/lib/python3.10/site-packages/tenacity/__init__.py", line 379, in __call__
do = self.iter(retry_state=retry_state)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/lib/python3.10/site-packages/tenacity/__init__.py", line 314, in iter
return fut.result()
File "/opt/homebrew/Cellar/python@3.10/3.10.13/Frameworks/Python.framework/Versions/3.10/lib/python3.10/concurrent/futures/_base.py", line 451, in result
return self.__get_result()
File "/opt/homebrew/Cellar/python@3.10/3.10.13/Frameworks/Python.framework/Versions/3.10/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
raise self._exception
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/lib/python3.10/site-packages/tenacity/__init__.py", line 382, in __call__
result = fn(*args, **kwargs)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/chat_models/openai.py", line 280, in _completion_with_retry
return self.client.create(**kwargs)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/lib/python3.10/site-packages/openai/api_resources/chat_completion.py", line 25, in create
return super().create(*args, **kwargs)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/lib/python3.10/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 151, in create
) = cls.__prepare_create_request(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/lib/python3.10/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 85, in __prepare_create_request
raise error.InvalidRequestError(
openai.error.InvalidRequestError: Must provide an 'engine' or 'deployment_id' parameter to create a <class 'openai.api_resources.chat_completion.ChatCompletion'>
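This `InvalidRequestError` is raised by the pre-1.0 `openai` SDK when `openai.api_type` is set to `"azure"` but no deployment is named: Azure routes chat completions by deployment, not by model. A sketch of the precondition and the usual fixes, where `gpt-4-deployment` is a placeholder for a real Azure deployment name:

```python
def prepare_azure_chat_kwargs(messages, deployment_id=None):
    """Mirror the SDK's precondition: Azure chat completions need a deployment."""
    if not deployment_id:
        raise ValueError("Must provide an 'engine' or 'deployment_id' parameter")
    return {"deployment_id": deployment_id, "messages": messages}

kwargs = prepare_azure_chat_kwargs(
    [{"role": "user", "content": "hello"}],
    deployment_id="gpt-4-deployment",  # placeholder Azure deployment name
)
print(kwargs["deployment_id"])

# In LangChain of this era, the cleaner fix is to construct the model as
# AzureChatOpenAI(deployment_name="gpt-4-deployment", ...) instead of
# ChatOpenAI, so the deployment is forwarded on every completion call.
```

Alternatively, if Azure was configured by accident (stray `OPENAI_API_TYPE=azure` in the environment), unsetting that variable restores the plain `model`-based OpenAI path.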
👉 Follow the bot's progress in real time on LangSmith (failed to generate sharable URL; cannot find this run on LangSmith. RunID: b09f4eb3-3e1d-4e47-909c-f898b86f75e0).
👉 Follow the bot's progress in real time on LangSmith (failed to generate sharable URL; cannot find this run on LangSmith. RunID: 112bfdb3-4fb3-410e-a077-a1b844ab4677).
Error in handle_issue_opened: 'NoneType' object is not subscriptable
Traceback (most recent call last):
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/github_webhook_handlers.py", line 110, in handle_issue_opened
ray.get(post_comment.remote(issue_or_pr=issue, text=str(result['output']), time_delay_s=0))
TypeError: 'NoneType' object is not subscriptable
Error in handle_issue_opened: 1 validation error for PromptTemplate __root__ Invalid prompt schema; check for mismatched or missing input parameters. ', \\, \\, n, , , \\, ", a, c, t, i, o, n, \\, ", ' (type=value_error)
Traceback (most recent call last):
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/github_webhook_handlers.py", line 98, in handle_issue_opened
bot = WorkflowAgent(run_id_in_metadata=langsmith_run_id, image_name=image_name)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/ml4bio_agent.py", line 89, in __init__
self.agent = self.make_agent()
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/ml4bio_agent.py", line 148, in make_agent
executor = self.custom_load_agent_executor(self.llm, tools, verbose=True, callbacks=[self.callback_handler], handle_parsing_errors=True)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/ai_ta_backend/agents/ml4bio_agent.py", line 128, in custom_load_agent_executor
agent = StructuredChatAgent.from_llm_and_tools(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/agents/structured_chat/base.py", line 130, in from_llm_and_tools
prompt = cls.create_prompt(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/agents/structured_chat/base.py", line 109, in create_prompt
HumanMessagePromptTemplate.from_template(human_message_template),
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/prompts/chat.py", line 151, in from_template
prompt = PromptTemplate.from_template(template, template_format=template_format)
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/prompts/prompt.py", line 217, in from_template
return cls(
File "/Users/minum/Documents/NCSA/UIUC-Chatbot/ai-ta-backend/.venv310/src/langchain/libs/langchain/langchain/load/serializable.py", line 97, in __init__
super().__init__(**kwargs)
File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 1 validation error for PromptTemplate
__root__
Invalid prompt schema; check for mismatched or missing input parameters. ', \\, \\, n, , , \\, ", a, c, t, i, o, n, \\, ", ' (type=value_error)
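The character list in this error is literal JSON (`"action"` plus escaped newlines) from the structured-chat human message being parsed as template variables. `PromptTemplate` uses `str.format` ("f-string") semantics by default, so literal braces in the template must be doubled. A sketch with plain `str.format`, which fails and succeeds the same way:

```python
from string import Formatter

bad = 'Respond with {"action": "Final Answer"}'             # brace parsed as a variable
good = 'Respond with {{"action": "Final Answer"}} {input}'  # escaped braces, one real variable

# The unescaped template blows up much like the PromptTemplate above.
failed = False
try:
    bad.format()
except KeyError:
    failed = True

# Only genuine variables survive parsing of the escaped template.
variables = [f for _, f, _, _ in Formatter().parse(good) if f is not None]
print(failed, variables)
print(good.format(input="hi"))
```

So the likely culprit is a customized `human_message_template` (or tool description) in `ml4bio_agent.py` containing raw `{"action": ...}` JSON; doubling its braces, or switching the template to `template_format="jinja2"`, should clear the validation error.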