KastanDay / ML4Bio

LLMs to execute Bioinformatics workflows, esp. RNA-seq
MIT License

Create a full command-line executable workflow for RNA-Seq on PBMC samples. Open a new pull request (on a separate branch) and comment the PR number here when you're done. (#33)

Open · KastanDay opened this issue 10 months ago

KastanDay commented 10 months ago

Experiment Type: RNA-Seq (sequencing of total cellular RNA)

Workflow Management: Bash/SLURM scripting and job scheduling

Software Stack: FastQC, MultiQC, STAR, RSEM, samtools, DESeq2

What else to know about the pipeline? I am working with PBMC samples collected from patients who are undergoing immunotherapy.

Use the data files in Report_WholeBrain as input for this workflow.

You should write a series of bash scripts and R scripts that can accomplish this task. Open a PR with those scripts when you're done.
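The requested pipeline (FastQC → STAR → RSEM, summarized with MultiQC, with DESeq2 run afterwards in R) can be sketched as a single bash driver. This is a hedged sketch only: the sample names, index directories ("star_index/", "rsem_ref/ref"), and output directories are placeholders, not paths taken from the repository, and by default the script only prints each command (dry run) rather than executing the tools.

```shell
#!/usr/bin/env bash
# Hedged sketch of the per-sample pipeline described above.
# Placeholders (assumptions, not from the repo): sampleA/sampleB, star_index/,
# rsem_ref/ref, and the qc/ align/ quant/ multiqc/ output directories.
set -euo pipefail

INPUT_DIR="${1:-Report_WholeBrain}"   # the issue names this directory as input
DRY_RUN="${DRY_RUN:-1}"               # default: print commands instead of running

run() {
  # Echo the command in dry-run mode; otherwise execute it.
  if [ "$DRY_RUN" = "1" ]; then
    echo "+ $*"
  else
    "$@"
  fi
}

for sample in sampleA sampleB; do     # placeholder sample names
  run fastqc "${INPUT_DIR}/${sample}.fastq.gz" -o qc/
  run STAR --runMode alignReads \
    --genomeDir star_index/ \
    --readFilesIn "${INPUT_DIR}/${sample}.fastq.gz" \
    --readFilesCommand zcat \
    --quantMode TranscriptomeSAM \
    --outSAMtype BAM SortedByCoordinate \
    --outFileNamePrefix "align/${sample}_"
  run rsem-calculate-expression --bam --no-bam-output \
    "align/${sample}_Aligned.toTranscriptome.out.bam" \
    rsem_ref/ref "quant/${sample}"
done
run multiqc qc/ align/ quant/ -o multiqc/
```

Under SLURM, each loop body would typically become its own sbatch job script so samples run in parallel; the differential-expression step would then read the RSEM gene-level counts into DESeq2 in R.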

lil-jr-dev[bot] commented 10 months ago

Thanks for opening a new issue! I'll now try to finish this implementation and open a PR for you to review.

You can monitor the LangSmith trace here.

Feel free to comment in this thread to give me additional instructions, or I'll tag you in a comment if I get stuck. If I think I'm successful I'll 'request your review' on the resulting PR. Just watch for emails while I work.

lil-jr-dev[bot] commented 10 months ago

👉 Follow the bot's progress in real time on LangSmith.

lil-jr-dev[bot] commented 10 months ago

Error in handle_issue_opened: 1 validation error for ShellInput: 'commands' field required (type=value_error.missing)

Traceback (most recent call last):
  File "/app/github_webhook_handlers.py", line 103, in handle_issue_opened
    result = bot.run(prompt)
  File "/app/ml4bio_agent.py", line 34, in run
    result = self.agent.with_config({"run_name": "ML4BIO Plan & Execute Agent"}).invoke({"input":f"{input}"}, {"metadata": {"run_id_in_metadata": str(self.run_id_in_metadata)}})
  File "/usr/local/lib/python3.10/dist-packages/langchain/schema/runnable/base.py", line 2316, in invoke
    return self.bound.invoke(
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 84, in invoke
    return self(
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 306, in __call__
    raise e
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 300, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/usr/local/lib/python3.10/dist-packages/langchain_experimental/plan_and_execute/agent_executor.py", line 56, in _call
    response = self.executor.step(
  File "/usr/local/lib/python3.10/dist-packages/langchain_experimental/plan_and_execute/executors/base.py", line 37, in step
    response = self.chain.run(**inputs, callbacks=callbacks)
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 506, in run
    return self(kwargs, callbacks=callbacks, tags=tags, metadata=metadata)[
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 306, in __call__
    raise e
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 300, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/usr/local/lib/python3.10/dist-packages/langchain/agents/agent.py", line 1141, in _call
    next_step_output = self._take_next_step(
  File "/usr/local/lib/python3.10/dist-packages/langchain/agents/agent.py", line 991, in _take_next_step
    observation = tool.run(
  File "/usr/local/lib/python3.10/dist-packages/langchain/tools/base.py", line 310, in run
    parsed_input = self._parse_input(tool_input)
  File "/usr/local/lib/python3.10/dist-packages/langchain/tools/base.py", line 245, in _parse_input
    result = input_args.parse_obj(tool_input)
  File "/usr/local/lib/python3.10/dist-packages/pydantic/main.py", line 526, in parse_obj
    return cls(**obj)
  File "/usr/local/lib/python3.10/dist-packages/pydantic/main.py", line 341, in __init__
    raise validation_error
pydantic.error_wrappers.ValidationError: 1 validation error for ShellInput
commands
  field required (type=value_error.missing)
lil-jr-dev[bot] commented 10 months ago

Error in handle_issue_opened: This model's maximum context length is 8192 tokens. However, your messages resulted in 8202 tokens. Please reduce the length of the messages.

Traceback (most recent call last):
  File "/app/github_webhook_handlers.py", line 103, in handle_issue_opened
    result = bot.run(prompt)
  File "/app/ml4bio_agent.py", line 34, in run
    result = self.agent.with_config({"run_name": "ML4BIO Plan & Execute Agent"}).invoke({"input":f"{input}"}, {"metadata": {"run_id_in_metadata": str(self.run_id_in_metadata)}})
  File "/usr/local/lib/python3.10/dist-packages/langchain/schema/runnable/base.py", line 2316, in invoke
    return self.bound.invoke(
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 84, in invoke
    return self(
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 306, in __call__
    raise e
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 300, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/usr/local/lib/python3.10/dist-packages/langchain_experimental/plan_and_execute/agent_executor.py", line 56, in _call
    response = self.executor.step(
  File "/usr/local/lib/python3.10/dist-packages/langchain_experimental/plan_and_execute/executors/base.py", line 37, in step
    response = self.chain.run(**inputs, callbacks=callbacks)
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 506, in run
    return self(kwargs, callbacks=callbacks, tags=tags, metadata=metadata)[
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 306, in __call__
    raise e
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 300, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/usr/local/lib/python3.10/dist-packages/langchain/agents/agent.py", line 1141, in _call
    next_step_output = self._take_next_step(
  File "/usr/local/lib/python3.10/dist-packages/langchain/agents/agent.py", line 928, in _take_next_step
    output = self.agent.plan(
  File "/usr/local/lib/python3.10/dist-packages/langchain/agents/agent.py", line 541, in plan
    full_output = self.llm_chain.predict(callbacks=callbacks, **full_inputs)
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/llm.py", line 257, in predict
    return self(kwargs, callbacks=callbacks)[self.output_key]
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 306, in __call__
    raise e
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 300, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/llm.py", line 93, in _call
    response = self.generate([inputs], run_manager=run_manager)
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/llm.py", line 103, in generate
    return self.llm.generate_prompt(
  File "/usr/local/lib/python3.10/dist-packages/langchain/chat_models/base.py", line 469, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/langchain/chat_models/base.py", line 359, in generate
    raise e
  File "/usr/local/lib/python3.10/dist-packages/langchain/chat_models/base.py", line 349, in generate
    self._generate_with_cache(
  File "/usr/local/lib/python3.10/dist-packages/langchain/chat_models/base.py", line 501, in _generate_with_cache
    return self._generate(
  File "/usr/local/lib/python3.10/dist-packages/langchain/chat_models/openai.py", line 403, in _generate
    response = self.completion_with_retry(
  File "/usr/local/lib/python3.10/dist-packages/langchain/chat_models/openai.py", line 282, in completion_with_retry
    return _completion_with_retry(**kwargs)
  File "/usr/local/lib/python3.10/dist-packages/tenacity/__init__.py", line 289, in wrapped_f
    return self(f, *args, **kw)
  File "/usr/local/lib/python3.10/dist-packages/tenacity/__init__.py", line 379, in __call__
    do = self.iter(retry_state=retry_state)
  File "/usr/local/lib/python3.10/dist-packages/tenacity/__init__.py", line 314, in iter
    return fut.result()
  File "/usr/lib/python3.10/concurrent/futures/_base.py", line 451, in result
    return self.__get_result()
  File "/usr/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
    raise self._exception
  File "/usr/local/lib/python3.10/dist-packages/tenacity/__init__.py", line 382, in __call__
    result = fn(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/langchain/chat_models/openai.py", line 280, in _completion_with_retry
    return self.client.create(**kwargs)
  File "/usr/local/lib/python3.10/dist-packages/openai/api_resources/chat_completion.py", line 25, in create
    return super().create(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/openai/api_resources/abstract/engine_api_resource.py", line 155, in create
    response, _, api_key = requestor.request(
  File "/usr/local/lib/python3.10/dist-packages/openai/api_requestor.py", line 299, in request
    resp, got_stream = self._interpret_response(result, stream)
  File "/usr/local/lib/python3.10/dist-packages/openai/api_requestor.py", line 710, in _interpret_response
    self._interpret_response_line(
  File "/usr/local/lib/python3.10/dist-packages/openai/api_requestor.py", line 775, in _interpret_response_line
    raise self.handle_error_response(
openai.error.InvalidRequestError: This model's maximum context length is 8192 tokens. However, your messages resulted in 8202 tokens. Please reduce the length of the messages.
lil-jr-dev[bot] commented 10 months ago

Error in handle_issue_opened: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/urllib3/connectionpool.py", line 715, in urlopen
    httplib_response = self._make_request(
  File "/usr/local/lib/python3.10/dist-packages/urllib3/connectionpool.py", line 467, in _make_request
    six.raise_from(e, None)
  File "<string>", line 3, in raise_from
  File "/usr/local/lib/python3.10/dist-packages/urllib3/connectionpool.py", line 462, in _make_request
    httplib_response = conn.getresponse()
  File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
    response.begin()
  File "/usr/lib/python3.10/http/client.py", line 318, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
    raise RemoteDisconnected("Remote end closed connection without"
http.client.RemoteDisconnected: Remote end closed connection without response

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/requests/adapters.py", line 486, in send
    resp = conn.urlopen(
  File "/usr/local/lib/python3.10/dist-packages/urllib3/connectionpool.py", line 799, in urlopen
    retries = retries.increment(
  File "/usr/local/lib/python3.10/dist-packages/urllib3/util/retry.py", line 550, in increment
    raise six.reraise(type(error), error, _stacktrace)
  File "/usr/local/lib/python3.10/dist-packages/urllib3/packages/six.py", line 769, in reraise
    raise value.with_traceback(tb)
  File "/usr/local/lib/python3.10/dist-packages/urllib3/connectionpool.py", line 715, in urlopen
    httplib_response = self._make_request(
  File "/usr/local/lib/python3.10/dist-packages/urllib3/connectionpool.py", line 467, in _make_request
    six.raise_from(e, None)
  File "<string>", line 3, in raise_from
  File "/usr/local/lib/python3.10/dist-packages/urllib3/connectionpool.py", line 462, in _make_request
    httplib_response = conn.getresponse()
  File "/usr/lib/python3.10/http/client.py", line 1375, in getresponse
    response.begin()
  File "/usr/lib/python3.10/http/client.py", line 318, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python3.10/http/client.py", line 287, in _read_status
    raise RemoteDisconnected("Remote end closed connection without"
urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/github_webhook_handlers.py", line 103, in handle_issue_opened
    result = bot.run(prompt)
  File "/app/ml4bio_agent.py", line 34, in run
    result = self.agent.with_config({"run_name": "ML4BIO Plan & Execute Agent"}).invoke({"input":f"{input}"}, {"metadata": {"run_id_in_metadata": str(self.run_id_in_metadata)}})
  File "/usr/local/lib/python3.10/dist-packages/langchain/schema/runnable/base.py", line 2316, in invoke
    return self.bound.invoke(
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 84, in invoke
    return self(
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 306, in __call__
    raise e
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 300, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/usr/local/lib/python3.10/dist-packages/langchain_experimental/plan_and_execute/agent_executor.py", line 56, in _call
    response = self.executor.step(
  File "/usr/local/lib/python3.10/dist-packages/langchain_experimental/plan_and_execute/executors/base.py", line 37, in step
    response = self.chain.run(**inputs, callbacks=callbacks)
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 506, in run
    return self(kwargs, callbacks=callbacks, tags=tags, metadata=metadata)[
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 306, in __call__
    raise e
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 300, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/usr/local/lib/python3.10/dist-packages/langchain/agents/agent.py", line 1141, in _call
    next_step_output = self._take_next_step(
  File "/usr/local/lib/python3.10/dist-packages/langchain/agents/agent.py", line 991, in _take_next_step
    observation = tool.run(
  File "/usr/local/lib/python3.10/dist-packages/langchain/tools/base.py", line 364, in run
    raise e
  File "/usr/local/lib/python3.10/dist-packages/langchain/tools/base.py", line 336, in run
    self._run(*tool_args, run_manager=run_manager, **tool_kwargs)
  File "/usr/local/lib/python3.10/dist-packages/langchain/tools/github/tool.py", line 32, in _run
    return self.api_wrapper.run(self.mode, instructions)
  File "/usr/local/lib/python3.10/dist-packages/langchain/utilities/github.py", line 758, in run
    return self.get_files_from_directory(query)
  File "/usr/local/lib/python3.10/dist-packages/langchain/utilities/github.py", line 309, in get_files_from_directory
    contents = self.github_repo_instance.get_contents(
  File "/usr/local/lib/python3.10/dist-packages/github/Repository.py", line 2151, in get_contents
    headers, data = self._requester.requestJsonAndCheck(
  File "/usr/local/lib/python3.10/dist-packages/github/Requester.py", line 494, in requestJsonAndCheck
    return self.__check(*self.requestJson(verb, url, parameters, headers, input, self.__customConnection(url)))
  File "/usr/local/lib/python3.10/dist-packages/github/Requester.py", line 629, in requestJson
    return self.__requestEncode(cnx, verb, url, parameters, headers, input, encode)
  File "/usr/local/lib/python3.10/dist-packages/github/Requester.py", line 726, in __requestEncode
    status, responseHeaders, output = self.__requestRaw(cnx, verb, url, requestHeaders, encoded_input)
  File "/usr/local/lib/python3.10/dist-packages/github/Requester.py", line 760, in __requestRaw
    response = cnx.getresponse()
  File "/usr/local/lib/python3.10/dist-packages/github/Requester.py", line 174, in getresponse
    r = verb(
  File "/usr/local/lib/python3.10/dist-packages/requests/sessions.py", line 602, in get
    return self.request("GET", url, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/requests/sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
  File "/usr/local/lib/python3.10/dist-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/requests/adapters.py", line 501, in send
    raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))
lil-jr-dev[bot] commented 10 months ago

Error in handle_issue_opened: module 'openai' has no attribute 'error'

Traceback (most recent call last):
  File "/app/github_webhook_handlers.py", line 103, in handle_issue_opened
    result = bot.run(prompt)
  File "/app/ml4bio_agent.py", line 34, in run
    result = self.agent.with_config({"run_name": "ML4BIO Plan & Execute Agent"}).invoke({"input":f"{input}"}, {"metadata": {"run_id_in_metadata": str(self.run_id_in_metadata)}})
  File "/usr/local/lib/python3.10/dist-packages/langchain/schema/runnable/base.py", line 2316, in invoke
    return self.bound.invoke(
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 84, in invoke
    return self(
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 306, in __call__
    raise e
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 300, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/usr/local/lib/python3.10/dist-packages/langchain_experimental/plan_and_execute/agent_executor.py", line 43, in _call
    plan = self.planner.plan(
  File "/usr/local/lib/python3.10/dist-packages/langchain_experimental/plan_and_execute/planners/base.py", line 37, in plan
    llm_response = self.llm_chain.run(**inputs, stop=self.stop, callbacks=callbacks)
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 506, in run
    return self(kwargs, callbacks=callbacks, tags=tags, metadata=metadata)[
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 306, in __call__
    raise e
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 300, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/llm.py", line 93, in _call
    response = self.generate([inputs], run_manager=run_manager)
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/llm.py", line 103, in generate
    return self.llm.generate_prompt(
  File "/usr/local/lib/python3.10/dist-packages/langchain/chat_models/base.py", line 469, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/langchain/chat_models/base.py", line 359, in generate
    raise e
  File "/usr/local/lib/python3.10/dist-packages/langchain/chat_models/base.py", line 349, in generate
    self._generate_with_cache(
  File "/usr/local/lib/python3.10/dist-packages/langchain/chat_models/base.py", line 501, in _generate_with_cache
    return self._generate(
  File "/usr/local/lib/python3.10/dist-packages/langchain/chat_models/openai.py", line 403, in _generate
    response = self.completion_with_retry(
  File "/usr/local/lib/python3.10/dist-packages/langchain/chat_models/openai.py", line 276, in completion_with_retry
    retry_decorator = _create_retry_decorator(self, run_manager=run_manager)
  File "/usr/local/lib/python3.10/dist-packages/langchain/chat_models/openai.py", line 56, in _create_retry_decorator
    openai.error.Timeout,
AttributeError: module 'openai' has no attribute 'error'
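The repeated "module 'openai' has no attribute 'error'" failure is consistent with an openai-python 1.x installation being used by a langchain build that still imports the legacy openai.error module, which was removed in openai-python 1.0.0. A plausible remedy, assuming the agent does not need the 1.x client, is pinning the SDK below 1.0 (or upgrading langchain instead); the sketch below only prints the suggested command rather than executing it:

```shell
# Assumption: langchain's retry decorator imports the legacy openai.error
# module, removed in openai-python 1.0.0. Pinning below 1.0 (or upgrading
# langchain) should resolve the AttributeError above.
REQUIRED_SPEC='openai<1.0'
echo "pip install \"${REQUIRED_SPEC}\""   # printed, not executed, in this sketch
# prints: pip install "openai<1.0"
```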
lil-jr-dev[bot] commented 10 months ago

Error in handle_issue_opened: GitHubAction._run() got an unexpected keyword argument 'directory_path'

Traceback (most recent call last):
  File "/app/github_webhook_handlers.py", line 103, in handle_issue_opened
    result = bot.run(prompt)
  File "/app/ml4bio_agent.py", line 34, in run
    result = self.agent.with_config({"run_name": "ML4BIO Plan & Execute Agent"}).invoke({"input":f"{input}"}, {"metadata": {"run_id_in_metadata": str(self.run_id_in_metadata)}})
  File "/usr/local/lib/python3.10/dist-packages/langchain/schema/runnable/base.py", line 2316, in invoke
    return self.bound.invoke(
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 84, in invoke
    return self(
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 306, in __call__
    raise e
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 300, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/usr/local/lib/python3.10/dist-packages/langchain_experimental/plan_and_execute/agent_executor.py", line 56, in _call
    response = self.executor.step(
  File "/usr/local/lib/python3.10/dist-packages/langchain_experimental/plan_and_execute/executors/base.py", line 37, in step
    response = self.chain.run(**inputs, callbacks=callbacks)
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 506, in run
    return self(kwargs, callbacks=callbacks, tags=tags, metadata=metadata)[
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 306, in __call__
    raise e
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 300, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/usr/local/lib/python3.10/dist-packages/langchain/agents/agent.py", line 1141, in _call
    next_step_output = self._take_next_step(
  File "/usr/local/lib/python3.10/dist-packages/langchain/agents/agent.py", line 991, in _take_next_step
    observation = tool.run(
  File "/usr/local/lib/python3.10/dist-packages/langchain/tools/base.py", line 364, in run
    raise e
  File "/usr/local/lib/python3.10/dist-packages/langchain/tools/base.py", line 336, in run
    self._run(*tool_args, run_manager=run_manager, **tool_kwargs)
TypeError: GitHubAction._run() got an unexpected keyword argument 'directory_path'
lil-jr-dev[bot] commented 10 months ago

Error in handle_issue_opened: module 'openai' has no attribute 'error'

Traceback (most recent call last):
  File "/app/github_webhook_handlers.py", line 103, in handle_issue_opened
    result = bot.run(prompt)
  File "/app/ml4bio_agent.py", line 34, in run
    result = self.agent.with_config({"run_name": "ML4BIO Plan & Execute Agent"}).invoke({"input":f"{input}"}, {"metadata": {"run_id_in_metadata": str(self.run_id_in_metadata)}})
  File "/usr/local/lib/python3.10/dist-packages/langchain/schema/runnable/base.py", line 2316, in invoke
    return self.bound.invoke(
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 84, in invoke
    return self(
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 306, in __call__
    raise e
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 300, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/usr/local/lib/python3.10/dist-packages/langchain_experimental/plan_and_execute/agent_executor.py", line 43, in _call
    plan = self.planner.plan(
  File "/usr/local/lib/python3.10/dist-packages/langchain_experimental/plan_and_execute/planners/base.py", line 37, in plan
    llm_response = self.llm_chain.run(**inputs, stop=self.stop, callbacks=callbacks)
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 506, in run
    return self(kwargs, callbacks=callbacks, tags=tags, metadata=metadata)[
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 306, in __call__
    raise e
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 300, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/llm.py", line 93, in _call
    response = self.generate([inputs], run_manager=run_manager)
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/llm.py", line 103, in generate
    return self.llm.generate_prompt(
  File "/usr/local/lib/python3.10/dist-packages/langchain/chat_models/base.py", line 469, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/langchain/chat_models/base.py", line 359, in generate
    raise e
  File "/usr/local/lib/python3.10/dist-packages/langchain/chat_models/base.py", line 349, in generate
    self._generate_with_cache(
  File "/usr/local/lib/python3.10/dist-packages/langchain/chat_models/base.py", line 501, in _generate_with_cache
    return self._generate(
  File "/usr/local/lib/python3.10/dist-packages/langchain/chat_models/openai.py", line 403, in _generate
    response = self.completion_with_retry(
  File "/usr/local/lib/python3.10/dist-packages/langchain/chat_models/openai.py", line 276, in completion_with_retry
    retry_decorator = _create_retry_decorator(self, run_manager=run_manager)
  File "/usr/local/lib/python3.10/dist-packages/langchain/chat_models/openai.py", line 56, in _create_retry_decorator
    openai.error.Timeout,
AttributeError: module 'openai' has no attribute 'error'
lil-jr-dev[bot] commented 10 months ago

Error in handle_issue_opened: module 'openai' has no attribute 'error'

Traceback (most recent call last):
  File "/app/github_webhook_handlers.py", line 106, in handle_issue_opened
    result = bot.run(prompt)
  File "/app/ml4bio_agent.py", line 91, in run
    result = self.agent.with_config({"run_name": "ML4BIO Plan & Execute Agent"}).invoke({"input": f"{input}"}, {
  File "/usr/local/lib/python3.10/dist-packages/langchain/schema/runnable/base.py", line 2316, in invoke
    return self.bound.invoke(
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 84, in invoke
    return self(
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 306, in __call__
    raise e
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 300, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/usr/local/lib/python3.10/dist-packages/langchain_experimental/plan_and_execute/agent_executor.py", line 43, in _call
    plan = self.planner.plan(
  File "/usr/local/lib/python3.10/dist-packages/langchain_experimental/plan_and_execute/planners/base.py", line 37, in plan
    llm_response = self.llm_chain.run(**inputs, stop=self.stop, callbacks=callbacks)
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 506, in run
    return self(kwargs, callbacks=callbacks, tags=tags, metadata=metadata)[
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 306, in __call__
    raise e
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/base.py", line 300, in __call__
    self._call(inputs, run_manager=run_manager)
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/llm.py", line 93, in _call
    response = self.generate([inputs], run_manager=run_manager)
  File "/usr/local/lib/python3.10/dist-packages/langchain/chains/llm.py", line 103, in generate
    return self.llm.generate_prompt(
  File "/usr/local/lib/python3.10/dist-packages/langchain/chat_models/base.py", line 469, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/langchain/chat_models/base.py", line 359, in generate
    raise e
  File "/usr/local/lib/python3.10/dist-packages/langchain/chat_models/base.py", line 349, in generate
    self._generate_with_cache(
  File "/usr/local/lib/python3.10/dist-packages/langchain/chat_models/base.py", line 501, in _generate_with_cache
    return self._generate(
  File "/usr/local/lib/python3.10/dist-packages/langchain/chat_models/openai.py", line 403, in _generate
    response = self.completion_with_retry(
  File "/usr/local/lib/python3.10/dist-packages/langchain/chat_models/openai.py", line 276, in completion_with_retry
    retry_decorator = _create_retry_decorator(self, run_manager=run_manager)
  File "/usr/local/lib/python3.10/dist-packages/langchain/chat_models/openai.py", line 56, in _create_retry_decorator
    openai.error.Timeout,
AttributeError: module 'openai' has no attribute 'error'
lil-jr-dev[bot] commented 10 months ago

Error in handle_issue_opened: 'WorkflowAgent' object has no attribute 'langsmith_run_id'

Traceback (most recent call last):
  File "/Users/kastanday/code/ncsa/ai-ta/ai-ta-backend/ai_ta_backend/agents/github_webhook_handlers.py", line 121, in handle_issue_opened
    bot = WorkflowAgent(langsmith_run_id=langsmith_run_id)
  File "/Users/kastanday/code/ncsa/ai-ta/ai-ta-backend/ai_ta_backend/agents/ml4bio_agent.py", line 30, in __init__
    self.agent = self.make_agent()
  File "/Users/kastanday/code/ncsa/ai-ta/ai-ta-backend/ai_ta_backend/agents/ml4bio_agent.py", line 41, in make_agent
    tools = get_tools(langsmith_run_id=self.langsmith_run_id)
AttributeError: 'WorkflowAgent' object has no attribute 'langsmith_run_id'
lil-jr-dev[bot] commented 8 months ago

👉 Follow the bot's progress in real time on LangSmith. (Link generation failed: cannot find this run on LangSmith; RunID: 442c0192-95d3-4891-9159-74e37627af28.)