KastanDay / ML4Bio

LLMs to execute Bioinformatics workflows, esp. RNA-seq
MIT License

Create a full command line executable workflow for RNA-Seq on PBMC Samples. Open a new pull request (on a separate branch) and comment the PR number here when you're done. #10

rohan-uiuc opened this issue 8 months ago

rohan-uiuc commented 8 months ago

Experiment Type: RNA-Seq (sequencing of total cellular RNA)

Workflow Management: Bash/SLURM Scripting and job scheduling

Software Stack: FastQC, MultiQC, STAR, RSEM, samtools, DESeq2

What else to know about the pipeline? I am working with PBMC samples collected from patients undergoing immunotherapy.

Use the data files existing in Report_WholeBrain as input for this workflow.

You should write a series of bash scripts and R scripts that can accomplish this task. Open a PR with those scripts when you're done.
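As a rough outline of what those scripts would contain, here is a Python sketch that assembles the per-sample commands the requested bash/SLURM scripts would run. Everything here is illustrative: the index paths (`star_index`, `rsem_ref`), directory layout, and sample naming are assumptions, and the real scripts would wrap each step in an sbatch job with dependencies.

```python
def build_sample_commands(sample, threads=8):
    """Sketch the per-sample pipeline commands (QC -> align -> quantify).

    Paths and flags are placeholders, not the repository's actual layout.
    """
    r1, r2 = f"{sample}_R1.fastq.gz", f"{sample}_R2.fastq.gz"
    return [
        # Per-sample read QC; MultiQC would aggregate qc/ afterwards.
        f"fastqc {r1} {r2} -o qc/",
        # Align with STAR, emitting both a sorted genomic BAM and a
        # transcriptome BAM for RSEM.
        f"STAR --runThreadN {threads} --genomeDir star_index "
        f"--readFilesIn {r1} {r2} --readFilesCommand zcat "
        f"--quantMode TranscriptomeSAM --outSAMtype BAM SortedByCoordinate "
        f"--outFileNamePrefix align/{sample}_",
        # Index the sorted genomic BAM.
        f"samtools index align/{sample}_Aligned.sortedByCoord.out.bam",
        # Quantify expression from the transcriptome alignments.
        f"rsem-calculate-expression --paired-end --alignments -p {threads} "
        f"align/{sample}_Aligned.toTranscriptome.out.bam rsem_ref rsem/{sample}",
    ]

commands = build_sample_commands("PBMC_sample1")
```

After all samples finish, a single `multiqc .` run would aggregate the QC reports, and DESeq2 would be run in an R script on the RSEM gene-level counts.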

lil-jr-dev[bot] commented 8 months ago

Thanks for opening a new issue! I'll now try to finish this implementation and open a PR for you to review.

You can monitor the LangSmith trace here.

Feel free to comment in this thread to give me additional instructions, or I'll tag you in a comment if I get stuck. If I think I'm successful I'll 'request your review' on the resulting PR. Just watch for emails while I work.

lil-jr-dev[bot] commented 8 months ago

Error in handle_issue_opened: load_agent_executor() got an unexpected keyword argument 'trim_intermediate_steps' Traceback

Traceback (most recent call last):
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/ai_ta_backend/agents/github_webhook_handlers.py", line 90, in handle_issue_opened
    bot = WorkflowAgent(run_id_in_metadata=langsmith_run_id)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/ai_ta_backend/agents/ml4bio_agent.py", line 27, in __init__
    self.agent = self.make_agent()
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/ai_ta_backend/agents/ml4bio_agent.py", line 45, in make_agent
    executor = load_agent_executor(self.llm, tools, verbose=True, trim_intermediate_steps=fancier_trim_intermediate_steps, handle_parsing_errors=True)
TypeError: load_agent_executor() got an unexpected keyword argument 'trim_intermediate_steps'
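The TypeError above means this installed version of `load_agent_executor` does not accept a `trim_intermediate_steps` keyword. The straightforward fix is to upgrade to a version that supports it or drop the argument; as a defensive sketch, kwargs can also be filtered against the callable's actual signature before the call. All names below are illustrative stand-ins, not the real loader:

```python
import inspect

def call_with_supported_kwargs(fn, *args, **kwargs):
    """Call fn, silently dropping keyword arguments it does not declare.

    A defensive sketch only: pinning/upgrading the dependency so the keyword
    is actually honored is usually the better fix.
    """
    params = inspect.signature(fn).parameters
    # If fn accepts **kwargs, pass everything through unchanged.
    if any(p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()):
        return fn(*args, **kwargs)
    supported = {k: v for k, v in kwargs.items() if k in params}
    return fn(*args, **supported)

def load_agent_executor_stub(llm, tools, verbose=False, handle_parsing_errors=False):
    # Stand-in for the real loader, used only to demonstrate the wrapper.
    return {"llm": llm, "tools": tools, "verbose": verbose}

executor = call_with_supported_kwargs(
    load_agent_executor_stub, "llm", [], verbose=True,
    trim_intermediate_steps=None,  # would raise TypeError if passed directly
    handle_parsing_errors=True,
)
```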
lil-jr-dev[bot] commented 8 months ago

👉 Follow the bot's progress in real time on LangSmith.

lil-jr-dev[bot] commented 8 months ago

Thanks for opening a new issue! I'll now try to finish this implementation and open a PR for you to review.

You can monitor the LangSmith trace here.

Feel free to comment in this thread to give me additional instructions, or I'll tag you in a comment if I get stuck. If I think I'm successful I'll 'request your review' on the resulting PR. Just watch for emails while I work.

lil-jr-dev[bot] commented 8 months ago

👉 Follow the bot's progress in real time on LangSmith.

lil-jr-dev[bot] commented 8 months ago

Error in handle_issue_opened: Error while fetching server API version: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory')) Traceback

Traceback (most recent call last):
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/urllib3/connectionpool.py", line 715, in urlopen
    httplib_response = self._make_request(
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/newrelic/hooks/external_urllib3.py", line 28, in _nr_wrapper_make_request_
    return ExternalTraceWrapper(wrapped, 'urllib3', url_for_apm_ui)(*args, **kwargs)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/newrelic/api/external_trace.py", line 110, in literal_wrapper
    return wrapped(*args, **kwargs)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/urllib3/connectionpool.py", line 416, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/urllib3/connection.py", line 244, in request
    super(HTTPConnection, self).request(method, url, body=body, headers=headers)
  File "/usr/local/Cellar/python@3.10/3.10.13/Frameworks/Python.framework/Versions/3.10/lib/python3.10/http/client.py", line 1283, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/usr/local/Cellar/python@3.10/3.10.13/Frameworks/Python.framework/Versions/3.10/lib/python3.10/http/client.py", line 1329, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/newrelic/hooks/external_httplib2.py", line 33, in _nr_wrapper_httplib2_endheaders_wrapper_inner
    return wrapped(*args, **kwargs)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/newrelic/hooks/external_httplib.py", line 29, in httplib_endheaders_wrapper
    return wrapped(*args, **kwargs)
  File "/usr/local/Cellar/python@3.10/3.10.13/Frameworks/Python.framework/Versions/3.10/lib/python3.10/http/client.py", line 1278, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/local/Cellar/python@3.10/3.10.13/Frameworks/Python.framework/Versions/3.10/lib/python3.10/http/client.py", line 1038, in _send_output
    self.send(msg)
  File "/usr/local/Cellar/python@3.10/3.10.13/Frameworks/Python.framework/Versions/3.10/lib/python3.10/http/client.py", line 976, in send
    self.connect()
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/docker/transport/unixconn.py", line 27, in connect
    sock.connect(self.unix_socket)
FileNotFoundError: [Errno 2] No such file or directory

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/requests/adapters.py", line 486, in send
    resp = conn.urlopen(
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/urllib3/connectionpool.py", line 799, in urlopen
    retries = retries.increment(
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/urllib3/util/retry.py", line 550, in increment
    raise six.reraise(type(error), error, _stacktrace)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/urllib3/packages/six.py", line 769, in reraise
    raise value.with_traceback(tb)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/urllib3/connectionpool.py", line 715, in urlopen
    httplib_response = self._make_request(
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/newrelic/hooks/external_urllib3.py", line 28, in _nr_wrapper_make_request_
    return ExternalTraceWrapper(wrapped, 'urllib3', url_for_apm_ui)(*args, **kwargs)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/newrelic/api/external_trace.py", line 110, in literal_wrapper
    return wrapped(*args, **kwargs)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/urllib3/connectionpool.py", line 416, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/urllib3/connection.py", line 244, in request
    super(HTTPConnection, self).request(method, url, body=body, headers=headers)
  File "/usr/local/Cellar/python@3.10/3.10.13/Frameworks/Python.framework/Versions/3.10/lib/python3.10/http/client.py", line 1283, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/usr/local/Cellar/python@3.10/3.10.13/Frameworks/Python.framework/Versions/3.10/lib/python3.10/http/client.py", line 1329, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/newrelic/hooks/external_httplib2.py", line 33, in _nr_wrapper_httplib2_endheaders_wrapper_inner
    return wrapped(*args, **kwargs)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/newrelic/hooks/external_httplib.py", line 29, in httplib_endheaders_wrapper
    return wrapped(*args, **kwargs)
  File "/usr/local/Cellar/python@3.10/3.10.13/Frameworks/Python.framework/Versions/3.10/lib/python3.10/http/client.py", line 1278, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/local/Cellar/python@3.10/3.10.13/Frameworks/Python.framework/Versions/3.10/lib/python3.10/http/client.py", line 1038, in _send_output
    self.send(msg)
  File "/usr/local/Cellar/python@3.10/3.10.13/Frameworks/Python.framework/Versions/3.10/lib/python3.10/http/client.py", line 976, in send
    self.connect()
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/docker/transport/unixconn.py", line 27, in connect
    sock.connect(self.unix_socket)
urllib3.exceptions.ProtocolError: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/docker/api/client.py", line 214, in _retrieve_server_version
    return self.version(api_version=False)["ApiVersion"]
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/docker/api/daemon.py", line 181, in version
    return self._result(self._get(url), json=True)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/docker/utils/decorators.py", line 46, in inner
    return f(self, *args, **kwargs)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/docker/api/client.py", line 237, in _get
    return self.get(url, **self._set_request_timeout(kwargs))
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/requests/sessions.py", line 602, in get
    return self.request("GET", url, **kwargs)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/requests/sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/newrelic/api/external_trace.py", line 75, in dynamic_wrapper
    return wrapped(*args, **kwargs)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/requests/adapters.py", line 501, in send
    raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/ai_ta_backend/agents/github_webhook_handlers.py", line 91, in handle_issue_opened
    result = bot.run(prompt)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/ai_ta_backend/agents/ml4bio_agent.py", line 31, in run
    result = self.agent.with_config({"run_name": "ML4BIO Plan & Execute Agent"}).invoke({"input":f"{input}"}, {"metadata": {"run_id_in_metadata": str(self.run_id_in_metadata)}})
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/schema/runnable/base.py", line 2316, in invoke
    return self.bound.invoke(
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/chains/base.py", line 84, in invoke
    return self(
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/chains/base.py", line 306, in __call__
    raise e
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/chains/base.py", line 300, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain_experimental/plan_and_execute/agent_executor.py", line 56, in _call
    response = self.executor.step(
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain_experimental/plan_and_execute/executors/base.py", line 37, in step
    response = self.chain.run(**inputs, callbacks=callbacks)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/chains/base.py", line 506, in run
    return self(kwargs, callbacks=callbacks, tags=tags, metadata=metadata)[
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/chains/base.py", line 306, in __call__
    raise e
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/chains/base.py", line 300, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/agents/agent.py", line 1141, in _call
    next_step_output = self._take_next_step(
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/agents/agent.py", line 991, in _take_next_step
    observation = tool.run(
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/tools/base.py", line 364, in run
    raise e
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/tools/base.py", line 336, in run
    self._run(*tool_args, run_manager=run_manager, **tool_kwargs)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/tools/base.py", line 515, in _run
    else self.func(*args, **kwargs)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/ai_ta_backend/agents/tools.py", line 77, in execute_code_tool
    return execute_code(code, timeout, filename, work_dir, use_docker, lang)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/autogen/code_utils.py", line 339, in execute_code
    client = docker.from_env()
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/docker/client.py", line 96, in from_env
    return cls(
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/docker/client.py", line 45, in __init__
    self.api = APIClient(*args, **kwargs)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/docker/api/client.py", line 197, in __init__
    self._version = self._retrieve_server_version()
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/docker/api/client.py", line 221, in _retrieve_server_version
    raise DockerException(
docker.errors.DockerException: Error while fetching server API version: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory'))
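The DockerException above is `docker.from_env()` failing to reach the daemon socket — typically Docker Desktop is not running, or the socket lives at a non-default path. A minimal, stdlib-only sketch of a pre-flight check (socket paths here are common defaults, not guaranteed; existence also does not prove the daemon is responsive):

```python
import os

def docker_socket_available(candidates=("/var/run/docker.sock",)):
    """Heuristically check whether a Docker daemon socket exists.

    On macOS with Docker Desktop the socket may instead live under
    ~/.docker/run/docker.sock, so callers can pass extra candidates.
    """
    return any(os.path.exists(p) for p in candidates)

# Hypothetical use: fall back to local execution when Docker is unreachable,
# mirroring the use_docker flag that autogen's execute_code accepts.
use_docker = docker_socket_available()
```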
lil-jr-dev[bot] commented 8 months ago

Thanks for opening a new issue! I'll now try to finish this implementation and open a PR for you to review.

You can monitor the LangSmith trace here.

Feel free to comment in this thread to give me additional instructions, or I'll tag you in a comment if I get stuck. If I think I'm successful I'll 'request your review' on the resulting PR. Just watch for emails while I work.

lil-jr-dev[bot] commented 8 months ago

👉 Follow the bot's progress in real time on LangSmith.

lil-jr-dev[bot] commented 8 months ago

Thanks for opening a new issue! I'll now try to finish this implementation and open a PR for you to review.

You can monitor the LangSmith trace here.

Feel free to comment in this thread to give me additional instructions, or I'll tag you in a comment if I get stuck. If I think I'm successful I'll 'request your review' on the resulting PR. Just watch for emails while I work.

lil-jr-dev[bot] commented 8 months ago

👉 Follow the bot's progress in real time on LangSmith.

lil-jr-dev[bot] commented 8 months ago

Error in handle_issue_opened: This model's maximum context length is 8192 tokens. However, your messages resulted in 17113 tokens. Please reduce the length of the messages. Traceback

Traceback (most recent call last):
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/ai_ta_backend/agents/github_webhook_handlers.py", line 91, in handle_issue_opened
    result = bot.run(prompt)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/ai_ta_backend/agents/ml4bio_agent.py", line 31, in run
    result = self.agent.with_config({"run_name": "ML4BIO Plan & Execute Agent"}).invoke({"input":f"{input}"}, {"metadata": {"run_id_in_metadata": str(self.run_id_in_metadata)}})
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/schema/runnable/base.py", line 2316, in invoke
    return self.bound.invoke(
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/chains/base.py", line 84, in invoke
    return self(
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/chains/base.py", line 306, in __call__
    raise e
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/chains/base.py", line 300, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain_experimental/plan_and_execute/agent_executor.py", line 56, in _call
    response = self.executor.step(
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain_experimental/plan_and_execute/executors/base.py", line 37, in step
    response = self.chain.run(**inputs, callbacks=callbacks)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/chains/base.py", line 506, in run
    return self(kwargs, callbacks=callbacks, tags=tags, metadata=metadata)[
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/chains/base.py", line 306, in __call__
    raise e
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/chains/base.py", line 300, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/agents/agent.py", line 1141, in _call
    next_step_output = self._take_next_step(
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/agents/agent.py", line 928, in _take_next_step
    output = self.agent.plan(
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/agents/agent.py", line 541, in plan
    full_output = self.llm_chain.predict(callbacks=callbacks, **full_inputs)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/chains/llm.py", line 257, in predict
    return self(kwargs, callbacks=callbacks)[self.output_key]
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/chains/base.py", line 306, in __call__
    raise e
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/chains/base.py", line 300, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/chains/llm.py", line 93, in _call
    response = self.generate([inputs], run_manager=run_manager)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/chains/llm.py", line 103, in generate
    return self.llm.generate_prompt(
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/chat_models/base.py", line 469, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/chat_models/base.py", line 359, in generate
    raise e
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/chat_models/base.py", line 349, in generate
    self._generate_with_cache(
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/chat_models/base.py", line 501, in _generate_with_cache
    return self._generate(
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/chat_models/openai.py", line 360, in _generate
    response = self.completion_with_retry(
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/chat_models/openai.py", line 299, in completion_with_retry
    return _completion_with_retry(**kwargs)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/tenacity/__init__.py", line 289, in wrapped_f
    return self(f, *args, **kw)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/tenacity/__init__.py", line 379, in __call__
    do = self.iter(retry_state=retry_state)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/tenacity/__init__.py", line 314, in iter
    return fut.result()
  File "/usr/local/Cellar/python@3.10/3.10.13/Frameworks/Python.framework/Versions/3.10/lib/python3.10/concurrent/futures/_base.py", line 451, in result
    return self.__get_result()
  File "/usr/local/Cellar/python@3.10/3.10.13/Frameworks/Python.framework/Versions/3.10/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
    raise self._exception
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/tenacity/__init__.py", line 382, in __call__
    result = fn(*args, **kwargs)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/langchain/chat_models/openai.py", line 297, in _completion_with_retry
    return self.client.create(**kwargs)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/openai/api_resources/chat_completion.py", line 25, in create
    return super().create(*args, **kwargs)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 155, in create
    response, _, api_key = requestor.request(
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/openai/api_requestor.py", line 299, in request
    resp, got_stream = self._interpret_response(result, stream)
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/openai/api_requestor.py", line 710, in _interpret_response
    self._interpret_response_line(
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/env3.10/lib/python3.10/site-packages/openai/api_requestor.py", line 775, in _interpret_response_line
    raise self.handle_error_response(
openai.error.InvalidRequestError: This model's maximum context length is 8192 tokens. However, your messages resulted in 17113 tokens. Please reduce the length of the messages.
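The InvalidRequestError above is the prompt (17113 tokens of accumulated intermediate steps) overflowing the model's 8192-token context. The idea behind the `trim_intermediate_steps` hook is to keep only the most recent (action, observation) pairs under a budget; a sketch of that idea, approximating tokens by characters (real code would use a tokenizer such as tiktoken):

```python
def trim_intermediate_steps(steps, max_chars=6000):
    """Keep the most recent (action, observation) pairs under a rough budget.

    Walks backwards from the newest step, always keeping at least one pair.
    """
    kept, total = [], 0
    for action, observation in reversed(steps):
        size = len(str(action)) + len(str(observation))
        if total + size > max_chars and kept:
            break
        kept.append((action, observation))
        total += size
    # Restore chronological order.
    return list(reversed(kept))
```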
lil-jr-dev[bot] commented 8 months ago

Thanks for opening a new issue! I'll now try to finish this implementation and open a PR for you to review.

You can monitor the LangSmith trace here.

Feel free to comment in this thread to give me additional instructions, or I'll tag you in a comment if I get stuck. If I think I'm successful I'll 'request your review' on the resulting PR. Just watch for emails while I work.

lil-jr-dev[bot] commented 8 months ago

👉 Follow the bot's progress in real time on LangSmith.

lil-jr-dev[bot] commented 8 months ago

Thanks for opening a new issue! I'll now try to finish this implementation and open a PR for you to review.

You can monitor the LangSmith trace here.

Feel free to comment in this thread to give me additional instructions, or I'll tag you in a comment if I get stuck. If I think I'm successful I'll 'request your review' on the resulting PR. Just watch for emails while I work.

lil-jr-dev[bot] commented 8 months ago

👉 Follow the bot's progress in real time on LangSmith.

lil-jr-dev[bot] commented 8 months ago

Thanks for opening a new issue! I'll now try to finish this implementation and open a PR for you to review.

You can monitor the LangSmith trace here.

Feel free to comment in this thread to give me additional instructions, or I'll tag you in a comment if I get stuck. If I think I'm successful I'll 'request your review' on the resulting PR. Just watch for emails while I work.

lil-jr-dev[bot] commented 8 months ago

👉 Follow the bot's progress in real time on LangSmith.

lil-jr-dev[bot] commented 8 months ago

The new pull request is number 9.

lil-jr-dev[bot] commented 8 months ago

Error in handle_issue_opened: 'NoneType' object is not subscriptable Traceback

Traceback (most recent call last):
  File "/Users/rohanmarwaha/IdeaProjects/ai-ta-backend/ai_ta_backend/agents/github_webhook_handlers.py", line 102, in handle_issue_opened
    ray.get(post_comment.remote(issue_or_pr=issue, text=str(result['output']), time_delay_s=0))
TypeError: 'NoneType' object is not subscriptable
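This last TypeError is a follow-on failure: `bot.run(prompt)` returned None after an upstream error, and the webhook handler then did `result['output']` unguarded. A sketch of a guard (function name is illustrative, not the repository's actual code):

```python
def extract_output(result):
    """Safely pull the agent's 'output' field before posting a comment.

    Returns a fallback message when the agent run produced no usable result.
    """
    if not isinstance(result, dict):
        return "Agent run produced no result (see logs for the underlying error)."
    return str(result.get("output", ""))
```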
lil-jr-dev[bot] commented 8 months ago

Thanks for opening a new issue! I'll now try to finish this implementation and open a PR for you to review.

You can monitor the LangSmith trace here.

Feel free to comment in this thread to give me additional instructions, or I'll tag you in a comment if I get stuck. If I think I'm successful I'll 'request your review' on the resulting PR. Just watch for emails while I work.

lil-jr-dev[bot] commented 8 months ago

👉 Follow the bot's progress in real time on LangSmith.