Codium-ai / pr-agent

🚀CodiumAI PR-Agent: An AI-Powered 🤖 Tool for Automated Pull Request Analysis, Feedback, Suggestions and More! 💻🔍
Apache License 2.0

[GITLAB] Installation Guide for GitLab please? #171

Closed — pushpanktugnawat closed this issue 12 months ago

pushpanktugnawat commented 1 year ago

Context:

I am trying to follow the installation guide for GitLab, but I am running into some odd errors. With Docker I get "GitHub token is required when using user deployment.", and when I run my own installation on my MacBook (M1) I get a "pr-agent not found" error. I followed the installation guide, but with no success. A document with the required information would be highly appreciated.

okotek commented 1 year ago

In configuration.toml, set [config]->git_provider to "gitlab". In .secrets.toml, set [gitlab] -> personal_access_token to your token.
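For reference, a minimal sketch of those two settings (the file paths assume the default pr_agent/settings/ layout; the token value is a placeholder):

# pr_agent/settings/configuration.toml
[config]
git_provider = "gitlab"

# pr_agent/settings/.secrets.toml
[gitlab]
personal_access_token = "glpat-xxxxxxxxxxxxxxxxxxxx"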

If it still doesn't work, do you mind sharing your configuration files (without keys)?

pushpanktugnawat commented 1 year ago

@okotek Thanks for your response. Let me share my setup, since it still fails.

I am trying all the different ways mentioned in the installation guide.

  1. When I run it as a docker command:
    
    docker run --rm -it -e OPENAI.KEY={token} -e GITLAB.PERSONAL_ACCESS_TOKEN={token} codiumai/pr-agent --git_provider gitlab --pr_url {url} describe

WARNING: The requested image's platform (linux/amd64) does not match the detected host platform (linux/arm64/v8) and no specific platform was requested
usage: Usage: cli.py --pr-url []. For example:

Supported commands:
review / review_pr - Add a review that includes a summary of the PR and specific suggestions for improvement.
ask / ask_question [question] - Ask a question about the PR.
describe / describe_pr - Modify the PR title and description based on the PR's contents.
improve / improve_code - Suggest improvements to the code in the PR as pull request comments ready to commit.
reflect - Ask the PR author questions about the PR.
update_changelog - Update the changelog based on the PR's contents.

To edit any configuration parameter from 'configuration.toml', just add -config_path=. For example: '- cli.py --pr-url=... review --pr_reviewer.extra_instructions="focus on the file: ..."'
cli.py: error: argument command: invalid choice: 'gitlab' (choose from 'answer', 'review', 'review_pr', 'reflect', 'reflect_and_review', 'describe', 'describe_pr', 'improve', 'improve_code', 'ask', 'ask_question', 'update_changelog')

Or when I run it in the below way:

docker run --rm -it -e OPENAI.KEY={token} -e GITLAB.PERSONAL_ACCESS_TOKEN={token} config.git_provider="gitlab" codiumai/pr-agent --pr_url {url} describe

Problem faced: docker: invalid reference format. See 'docker run --help'.


2. When I install it into my GitLab project in the way mentioned in the guide, it still doesn't work: I don't see anything from Codium in my MRs.

path: .gitlab/workflows/pr_agent.yml or .github/workflows/pr_agent.yml

on:
  pull_request:
  issue_comment:
jobs:
  pr_agent_job:
    runs-on: ubuntu-latest
    name: Run pr agent on every pull request, respond to user comments
    steps:

Your early response would be appreciated; please guide me if I am not following it the right way.

okotek commented 1 year ago

Hi @pushpanktugnawat

For the first method, change your command to:

docker run --rm -it -e OPENAI.KEY={token} -e GITLAB.PERSONAL_ACCESS_TOKEN={token} -e CONFIG.GIT_PROVIDER=gitlab codiumai/pr-agent --pr_url {url} describe

The second way you mentioned is a GitHub workflow; it can't be used as-is in GitLab. Supporting GitLab CI/CD is worthwhile and not too big an effort, but it is not implemented yet. It is possible to install a webhook for GitLab repositories and use gitlab_webhook.py to serve it, as long as you are able to host such webhooks, for example with a cloud function or on a web server.
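As a rough sketch of the webhook route (the module path and port are assumptions; check gitlab_webhook.py for the actual entrypoint and the shared-secret setting in .secrets.toml):

# from the repository root, start the GitLab webhook server (module path and port are assumptions)
python pr_agent/servers/gitlab_webhook.py
# then point a project webhook (GitLab Settings -> Webhooks), for merge request and comment events,
# at http://<your-host>:<port>/webhook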

Regarding your third try, set the PYTHONPATH environment variable to the repository root.

pushpanktugnawat commented 1 year ago

Thanks @okotek, the first one works now; I had missed the -e for the config variable.
Second way: I will look into how this webhook could be used and try to work on it. Do we already have a guide for it, for quick action on my side?
Third way: I set the PYTHONPATH as it is already part of the installation guide, but the above error still happens for me.

export PYTHONPATH=[$PYTHONPATH:]<PATH to pr_agent folder>
python pr_agent/cli.py --pr_url <pr_url> review
python pr_agent/cli.py --pr_url <pr_url> ask <your question>
python pr_agent/cli.py --pr_url <pr_url> describe
python pr_agent/cli.py --pr_url <pr_url> improve

Error snippet for the same, as mentioned:

echo $PYTHONPATH
[:]{rootpath}/pr-agent
python3 pr_agent/cli.py --pr_url {url} review
Traceback (most recent call last):
  File "{rootpath}/pr-agent/pr_agent/cli.py", line 6, in <module>
    from pr_agent.agent.pr_agent import PRAgent, commands
ModuleNotFoundError: No module named 'pr_agent'
pushpanktugnawat commented 1 year ago

@okotek @marshally One more thing: I have used the first approach in my GitLab pipeline, configured as a docker image, and run it there. I am facing some weird problems from pr_agent during my pipeline job, as described below:

docker run --rm -e OPENAI.KEY=${OPEN_API_KEY} -e GITLAB.PERSONAL_ACCESS_TOKEN=${GITLAB_PAT} -e CONFIG.GIT_PROVIDER=gitlab codiumai/pr-agent --pr_url ${MR_URL} review

Status: Downloaded newer image for codiumai/pr-agent:latest
INFO:root:Reviewing PR...
INFO:root:Getting PR diff...
INFO:root:Getting AI prediction...
--- Logging error ---
Traceback (most recent call last):
  File "/app/pr_agent/algo/ai_handler.py", line 60, in chat_completion
    response = await openai.ChatCompletion.acreate(
  File "/usr/local/lib/python3.10/site-packages/openai/api_resources/chat_completion.py", line 45, in acreate
    return await super().acreate(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 217, in acreate
    response, _, api_key = await requestor.arequest(
  File "/usr/local/lib/python3.10/site-packages/openai/api_requestor.py", line 372, in arequest
    result = await self.arequest_raw(
  File "/usr/local/lib/python3.10/site-packages/openai/api_requestor.py", line 668, in arequest_raw
    result = await session.request(**request_kwargs)
  File "/usr/local/lib/python3.10/site-packages/aiohttp/client.py", line 536, in _request
    conn = await self._connector.connect(
  File "/usr/local/lib/python3.10/site-packages/aiohttp/connector.py", line 540, in connect
    proto = await self._create_connection(req, traces, timeout)
  File "/usr/local/lib/python3.10/site-packages/aiohttp/connector.py", line 901, in _create_connection
    _, proto = await self._create_direct_connection(req, traces, timeout)
  File "/usr/local/lib/python3.10/site-packages/aiohttp/connector.py", line 1155, in _create_direct_connection
    hosts = await asyncio.shield(host_resolved)
  File "/usr/local/lib/python3.10/site-packages/aiohttp/connector.py", line 874, in _resolve_host
    addrs = await self._resolver.resolve(host, port, family=self._family)
  File "/usr/local/lib/python3.10/site-packages/aiohttp/resolver.py", line 33, in resolve
    infos = await self._loop.getaddrinfo(
  File "/usr/local/lib/python3.10/asyncio/base_events.py", line 863, in getaddrinfo
    return await self.run_in_executor(
  File "/usr/local/lib/python3.10/asyncio/base_events.py", line 821, in run_in_executor
    executor.submit(func, *args), loop=self)
  File "/usr/local/lib/python3.10/concurrent/futures/thread.py", line 176, in submit
    self._adjust_thread_count()
  File "/usr/local/lib/python3.10/concurrent/futures/thread.py", line 199, in _adjust_thread_count
    t.start()
  File "/usr/local/lib/python3.10/threading.py", line 935, in start
    _start_new_thread(self._bootstrap, ())
RuntimeError: can't start new thread
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/logging/__init__.py", line [110](https://gitlab.com/container-xchange/team-inventory/xchange-public-apis/-/jobs/4807381004#L110)0, in emit
    msg = self.format(record)
  File "/usr/local/lib/python3.10/logging/__init__.py", line 943, in format
    return fmt.format(record)
  File "/usr/local/lib/python3.10/logging/__init__.py", line 678, in format
    record.message = record.getMessage()
  File "/usr/local/lib/python3.10/logging/__init__.py", line 368, in getMessage
    msg = msg % self.args
TypeError: not all arguments converted during string formatting
Call stack:
  File "/app/pr_agent/cli.py", line 45, in <module>
    run()
  File "/app/pr_agent/cli.py", line 39, in run
    result = asyncio.run(PRAgent().handle_request(args.pr_url, command + " " + " ".join(args.rest)))
  File "/usr/local/lib/python3.10/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.10/asyncio/base_events.py", line 636, in run_until_complete
    self.run_forever()
  File "/usr/local/lib/python3.10/asyncio/base_events.py", line 603, in run_forever
    self._run_once()
  File "/usr/local/lib/python3.10/asyncio/base_events.py", line 1909, in _run_once
    handle._run()
  File "/usr/local/lib/python3.10/asyncio/events.py", line 80, in _run
    self._context.run(self._callback, *self._args)
  File "/app/pr_agent/agent/pr_agent.py", line 69, in handle_request
    await command2class[action](pr_url, args=args).run()
  File "/app/pr_agent/tools/pr_reviewer.py", line 97, in run
    await retry_with_fallback_models(self._prepare_prediction)
  File "/app/pr_agent/algo/pr_processing.py", line 222, in retry_with_fallback_models
    return await f(model)
  File "/app/pr_agent/tools/pr_reviewer.py", line [124](https://gitlab.com/container-xchange/team-inventory/xchange-public-apis/-/jobs/4807381004#L124), in _prepare_prediction
    self.prediction = await self._get_prediction(model)
  File "/app/pr_agent/tools/pr_reviewer.py", line [147](https://gitlab.com/container-xchange/team-inventory/xchange-public-apis/-/jobs/4807381004#L147), in _get_prediction
    response, finish_reason = await self.ai_handler.chat_completion(
  File "/app/pr_agent/algo/ai_handler.py", line 76, in chat_completion
    logging.error("Unknown error during OpenAI inference: ", e)
Message: 'Unknown error during OpenAI inference: '
Arguments: (RuntimeError("can't start new thread"),)
WARNING:root:Failed to generate prediction with gpt-4: <empty message>
INFO:root:Getting PR diff...

But when I run this command in my terminal, it returns 200 and does the job for me.

docker run --rm -it -e OPENAI.KEY={token} -e GITLAB.PERSONAL_ACCESS_TOKEN={token} -e CONFIG.GIT_PROVIDER=gitlab codiumai/pr-agent --pr_url {url} review
WARNING: The requested image's platform (linux/amd64) does not match the detected host platform (linux/arm64/v8) and no specific platform was requested
INFO:root:Reviewing PR...
INFO:root:Getting PR diff...
INFO:root:Getting AI prediction...
INFO:openai:message='OpenAI API response' path=https://api.openai.com/v1/chat/completions processing_ms=23825 request_id=8bff7a24bffe5390b3c829be7942f1ad response_code=200
INFO:root:Preparing PR review...
INFO:root:Pushing PR review...
INFO:root:Pushing inline code comments...

Can you help me with it?

okotek commented 1 year ago

Regarding the third way, set PYTHONPATH to point to the parent of the pr_agent folder, and import pr_agent will work.
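For example, a minimal sketch assuming the repository was cloned to $HOME/pr-agent (so the pr_agent package is at $HOME/pr-agent/pr_agent):

# add the parent of the pr_agent package (the repository root) to PYTHONPATH
export PYTHONPATH=$PYTHONPATH:$HOME/pr-agent
cd $HOME/pr-agent
python3 pr_agent/cli.py --pr_url <pr_url> review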

I'm curious about the behavior you see using gitlab runner. Can it be reproduced in a public repo?

pushpanktugnawat commented 1 year ago

@okotek Ideally yes, you should be able to reproduce it using a public repo. I am using it in my organization, so I can't share a sample repo, but here is what my pipeline configuration looks like.

bot-review:
  stage: test
  needs:
    - job: build-jar
      artifacts: true
  variables:
    MR_URL: ${CI_MERGE_REQUEST_PROJECT_URL}/-/merge_requests/${CI_MERGE_REQUEST_IID}
  image: docker:latest
  services:
    - docker:19-dind
  script:
    - docker run --rm -e OPENAI.KEY=${OPEN_API_KEY} -e GITLAB.PERSONAL_ACCESS_TOKEN=${GITLAB_PAT} -e CONFIG.GIT_PROVIDER=gitlab codiumai/pr-agent --pr_url ${MR_URL} describe
  rules:
    - if: $CI_COMMIT_BRANCH != $CI_DEFAULT_BRANCH

And resetting PYTHONPATH as mentioned above didn't help me. I am sorry if I am asking a very basic question; I am a Java developer and have never worked with Python in my life.

(screenshot attached)
okotek commented 1 year ago

@pushpanktugnawat set the PYTHONPATH to the parent of that folder (only one pr_agent). I'll try to reproduce on Gitlab and let you know what I think 🙏

pushpanktugnawat commented 1 year ago

Thanks, @okotek. Please let me know your findings; in the meantime I will also try to update it locally as per your suggestion.

okotek commented 1 year ago

Thanks. It will take a few days on my end.

pushpanktugnawat commented 1 year ago

Thanks, @okotek, do you have any updates on it? I just wanted to make sure I am not waiting too long on this.

okotek commented 1 year ago

@pushpanktugnawat I was able to reproduce, still looking for a solution

okotek commented 1 year ago

@pushpanktugnawat After some research, I found that there is a compatibility problem with the docker services you are using. Please change your pipeline to:

image: docker:stable
services:

and report back whether this solves your problem. I was able to run successfully after these changes.
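For reference, a sketch of the adjusted job based on the pipeline you posted above (the dind service tag is an assumption; use whatever your runner supports):

bot-review:
  stage: test
  image: docker:stable
  services:
    - docker:stable-dind
  variables:
    MR_URL: ${CI_MERGE_REQUEST_PROJECT_URL}/-/merge_requests/${CI_MERGE_REQUEST_IID}
  script:
    - docker run --rm -e OPENAI.KEY=${OPEN_API_KEY} -e GITLAB.PERSONAL_ACCESS_TOKEN=${GITLAB_PAT} -e CONFIG.GIT_PROVIDER=gitlab codiumai/pr-agent --pr_url ${MR_URL} describe
  rules:
    - if: $CI_COMMIT_BRANCH != $CI_DEFAULT_BRANCH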

pushpanktugnawat commented 1 year ago

Thanks @okotek, I will verify and come back to you as early as possible.

pushpanktugnawat commented 1 year ago
okotek commented 1 year ago

You can configure these options; check the configuration file.

pushpanktugnawat commented 1 year ago

@okotek Thanks a lot for looking into it and helping me configure it. It does work now as you mentioned above, but sometimes I see the below exception causing a failure in my pipeline. What could be the reason behind it? Is this something I am misconfiguring, or is it a tool problem?

docker run --rm -it -e OPENAI.KEY={} -e GITLAB.PERSONAL_ACCESS_TOKEN={} -e config.git_provider="gitlab" -e PR_REVIEWER.INLINE_CODE_COMMENTS=true codiumai/pr-agent --pr_url {} review

error: 500: 500 Internal Server Error
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/gitlab/exceptions.py", line 336, in wrapped_f
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/gitlab/mixins.py", line 300, in create
    server_data = self.gitlab.http_post(path, post_data=data, files=files, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/gitlab/client.py", line 1021, in http_post
    result = self.http_request(
  File "/usr/local/lib/python3.10/site-packages/gitlab/client.py", line 794, in http_request
    raise gitlab.exceptions.GitlabHttpError(
gitlab.exceptions.GitlabHttpError: 500: 500 Internal Server Error

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/app/pr_agent/git_providers/gitlab_provider.py", line 174, in publish_code_suggestions
    self.send_inline_comment(body, edit_type, found, relevant_file, relevant_line_in_file, source_line_no,
  File "/app/pr_agent/git_providers/gitlab_provider.py", line 143, in send_inline_comment
    self.mr.discussions.create({'body': body,
  File "/usr/local/lib/python3.10/site-packages/gitlab/exceptions.py", line 338, in wrapped_f
    raise error(e.error_message, e.response_code, e.response_body) from e
gitlab.exceptions.GitlabCreateError: 500: 500 Internal Server Error
error: 400: 400 Bad request - Note {:line_code=>["can't be blank", "must be a valid line code"]}
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/gitlab/exceptions.py", line 336, in wrapped_f
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/gitlab/mixins.py", line 300, in create
    server_data = self.gitlab.http_post(path, post_data=data, files=files, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/gitlab/client.py", line 1021, in http_post
    result = self.http_request(
  File "/usr/local/lib/python3.10/site-packages/gitlab/client.py", line 794, in http_request
    raise gitlab.exceptions.GitlabHttpError(
gitlab.exceptions.GitlabHttpError: 400: 400 Bad request - Note {:line_code=>["can't be blank", "must be a valid line code"]}

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/app/pr_agent/git_providers/gitlab_provider.py", line 174, in publish_code_suggestions
    self.send_inline_comment(body, edit_type, found, relevant_file, relevant_line_in_file, source_line_no,
  File "/app/pr_agent/git_providers/gitlab_provider.py", line 143, in send_inline_comment
    self.mr.discussions.create({'body': body,
  File "/usr/local/lib/python3.10/site-packages/gitlab/exceptions.py", line 338, in wrapped_f
    raise error(e.error_message, e.response_code, e.response_body) from e
gitlab.exceptions.GitlabCreateError: 400: 400 Bad request - Note {:line_code=>["can't be blank", "must be a valid line code"]}
okotek commented 1 year ago

It looks like a bug. Let's try to find it. Is it possible to add -e CONFIG.VERBOSITY_LEVEL=2 and try to reproduce? I would like to see the generated output and see what went wrong.
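For example, your earlier command with the flag added:

docker run --rm -it -e CONFIG.VERBOSITY_LEVEL=2 -e OPENAI.KEY={} -e GITLAB.PERSONAL_ACCESS_TOKEN={} -e config.git_provider="gitlab" -e PR_REVIEWER.INLINE_CODE_COMMENTS=true codiumai/pr-agent --pr_url {} review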

pushpanktugnawat commented 1 year ago

@okotek Here are the verbose logs. Just an FYI: I receive this error sometimes with the /review and sometimes with the /improve endpoint.

ERROR:root:Could not publish code suggestion:
suggestion: {'body': '**Suggestion:** sometext', 'relevant_lines_start': 24, 'relevant_lines_end': 25}
error: 400: 400 Bad request - Note {:line_code=>["can't be blank", "must be a valid line code"]}
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/gitlab/exceptions.py", line 336, in wrapped_f
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/gitlab/mixins.py", line 300, in create
    server_data = self.gitlab.http_post(path, post_data=data, files=files, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/gitlab/client.py", line 1021, in http_post
    result = self.http_request(
  File "/usr/local/lib/python3.10/site-packages/gitlab/client.py", line 794, in http_request
    raise gitlab.exceptions.GitlabHttpError(
gitlab.exceptions.GitlabHttpError: 400: 400 Bad request - Note {:line_code=>["can't be blank", "must be a valid line code"]}

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/app/pr_agent/git_providers/gitlab_provider.py", line 174, in publish_code_suggestions
    self.send_inline_comment(body, edit_type, found, relevant_file, relevant_line_in_file, source_line_no,
  File "/app/pr_agent/git_providers/gitlab_provider.py", line 143, in send_inline_comment
    self.mr.discussions.create({'body': body,
  File "/usr/local/lib/python3.10/site-packages/gitlab/exceptions.py", line 338, in wrapped_f
    raise error(e.error_message, e.response_code, e.response_body) from e
gitlab.exceptions.GitlabCreateError: 400: 400 Bad request - Note {:line_code=>["can't be blank", "must be a valid line code"]}
ERROR:root:Could not publish code suggestion:
suggestion: {'body': '**Suggestion:** sometest```', 'relevant_file': 'filepath', 'relevant_lines_start': 32, 'relevant_lines_end': 34}
error: 500: 500 Internal Server Error
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/gitlab/exceptions.py", line 336, in wrapped_f
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/gitlab/mixins.py", line 300, in create
    server_data = self.gitlab.http_post(path, post_data=data, files=files, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/gitlab/client.py", line 1021, in http_post
    result = self.http_request(
  File "/usr/local/lib/python3.10/site-packages/gitlab/client.py", line 794, in http_request
    raise gitlab.exceptions.GitlabHttpError(
gitlab.exceptions.GitlabHttpError: 500: 500 Internal Server Error

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/app/pr_agent/git_providers/gitlab_provider.py", line 174, in publish_code_suggestions
    self.send_inline_comment(body, edit_type, found, relevant_file, relevant_line_in_file, source_line_no,
  File "/app/pr_agent/git_providers/gitlab_provider.py", line 143, in send_inline_comment
    self.mr.discussions.create({'body': body,
  File "/usr/local/lib/python3.10/site-packages/gitlab/exceptions.py", line 338, in wrapped_f
    raise error(e.error_message, e.response_code, e.response_body) from e
gitlab.exceptions.GitlabCreateError: 500: 500 Internal Server Error
ERROR:root:Could not publish code suggestion:
suggestion: {'body': "**Suggestion:** sometext```", 'relevant_file': 'filepath, 'relevant_lines_start': 64, 'relevant_lines_end': 65}
error: 400: 400 Bad request - Note {:line_code=>["can't be blank", "must be a valid line code"]}
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/gitlab/exceptions.py", line 336, in wrapped_f
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/gitlab/mixins.py", line 300, in create
    server_data = self.gitlab.http_post(path, post_data=data, files=files, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/gitlab/client.py", line 1021, in http_post
    result = self.http_request(
  File "/usr/local/lib/python3.10/site-packages/gitlab/client.py", line 794, in http_request
    raise gitlab.exceptions.GitlabHttpError(
gitlab.exceptions.GitlabHttpError: 400: 400 Bad request - Note {:line_code=>["can't be blank", "must be a valid line code"]}

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/app/pr_agent/git_providers/gitlab_provider.py", line 174, in publish_code_suggestions
    self.send_inline_comment(body, edit_type, found, relevant_file, relevant_line_in_file, source_line_no,
  File "/app/pr_agent/git_providers/gitlab_provider.py", line 143, in send_inline_comment
    self.mr.discussions.create({'body': body,
  File "/usr/local/lib/python3.10/site-packages/gitlab/exceptions.py", line 338, in wrapped_f
    raise error(e.error_message, e.response_code, e.response_body) from e
gitlab.exceptions.GitlabCreateError: 400: 400 Bad request - Note {:line_code=>["can't be blank", "must be a valid line code"]}
ERROR:root:Could not publish code suggestion:
suggestion: {'body': '**Suggestion:** some text', 'relevant_file': 'filname', 'relevant_lines_start': 16, 'relevant_lines_end': 22}
error: 500: 500 Internal Server Error
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/gitlab/exceptions.py", line 336, in wrapped_f
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/gitlab/mixins.py", line 300, in create
    server_data = self.gitlab.http_post(path, post_data=data, files=files, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/gitlab/client.py", line 1021, in http_post
    result = self.http_request(
  File "/usr/local/lib/python3.10/site-packages/gitlab/client.py", line 794, in http_request
    raise gitlab.exceptions.GitlabHttpError(
gitlab.exceptions.GitlabHttpError: 500: 500 Internal Server Error

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/app/pr_agent/git_providers/gitlab_provider.py", line 174, in publish_code_suggestions
    self.send_inline_comment(body, edit_type, found, relevant_file, relevant_line_in_file, source_line_no,
  File "/app/pr_agent/git_providers/gitlab_provider.py", line 143, in send_inline_comment
    self.mr.discussions.create({'body': body,
  File "/usr/local/lib/python3.10/site-packages/gitlab/exceptions.py", line 338, in wrapped_f
    raise error(e.error_message, e.response_code, e.response_body) from e
gitlab.exceptions.GitlabCreateError: 500: 500 Internal Server Error
pushpanktugnawat commented 1 year ago

@okotek I am also confused, as most of the time the tool doesn't generate a single line of PR-review comments, and I am not sure whether we are following best-in-class practices or whether something is wrong with the tool.

(screenshot attached)
okotek commented 1 year ago

I'm unable to reproduce it. It would really help to create a minimal failing example on some public GitLab repo.

pushpanktugnawat commented 1 year ago

@okotek Which one do you mean? The errors mentioned above, or the failure to generate review comments?

I am sure you can see this error for /improve without a problem; it also happens for me with the docker command over the terminal.

okotek commented 1 year ago

Couldn't reproduce either one. I ran on this MR: https://gitlab.com/codiumai/pr-agent/-/merge_requests/2 and review comments are generated, unless I misunderstand what you mean.

pushpanktugnawat commented 1 year ago

Let me build a public project, and then maybe I can better explain what I mean.

idavidov commented 1 year ago

Ignore my previous comment, I was able to reproduce it on a public repo. Use my repo, with a very simple file and code: I have a useless comment and a typo in a print; it detects both but is not able to publish them. I will try to find the root cause, but I am also posting a command with which anyone can reproduce it:

docker run --rm -it -e CONFIG.VERBOSITY_LEVEL=2 -e OPENAI.KEY={yourkey} -e GITLAB.PERSONAL_ACCESS_TOKEN=glpat-bqi2PajE4Vivzjx5mSns -e CONFIG.GIT_PROVIDER=gitlab codiumai/pr-agent --pr_url https://gitlab.com/idavidov1/pr_testing/-/merge_requests/1 improve

pushpanktugnawat commented 1 year ago

Thanks a lot, @idavidov, otherwise I was planning to create a public repo to do the same. You saved me :D Please do let me know if you need any help. @okotek fyi.

idavidov commented 1 year ago

@pushpanktugnawat In your case, does it produce the expected comments/reviews and you get errors, or do you just get errors and nothing happens in GitLab? I want to make sure I reproduced it accurately.

pushpanktugnawat commented 1 year ago

@idavidov In my case, for the /improve endpoint I received the error mentioned above. And for the /review endpoint I never receive any feedback; it always says the MR looks good, so I am curious as well: is this a problem, or are we writing best-in-class code!!!!

I also mentioned that above with a screenshot.

I think you are able to reproduce it in the same way.

idavidov commented 1 year ago

Looks like I found the root cause and am working on a fix. It most likely also causes your issue, but I need to test it after my fix to be on the safe side.

idavidov commented 1 year ago

#219 fixed what I managed to reproduce; in some circumstances we did not attach comments to the correct diff.

@pushpanktugnawat I would also like you to test this, as I am not 100% sure I fully reproduced your issue, or whether I found another problem with the same cause, so it may or may not fix your issue. Please test and let me know.

okotek commented 1 year ago

Looks like the fix is valid, thanks! I merged it

pushpanktugnawat commented 1 year ago

I am checking it now @idavidov @okotek. Sorry for not commenting back sooner.

pushpanktugnawat commented 1 year ago

Guys @idavidov @okotek I think the /improve endpoint still fails with the below error.

I am just executing the below command.

docker run --rm -e OPENAI.KEY=${OPEN_API_KEY} -e GITLAB.PERSONAL_ACCESS_TOKEN=${GITLAB_PAT} -e CONFIG.GIT_PROVIDER=gitlab -e PR_REVIEWER.INLINE_CODE_COMMENTS=TRUE codiumai/pr-agent --pr_url ${MR_URL} improve
ERROR:root:Could not publish code suggestion:
suggestion: {'body': '**Suggestion:** The order of fields in a class should follow the Java convention, which is `public`, `protected`, `package` (no access modifier), and `private`. But the `private final static int API_TOKEN_INDEX = 0;` is after `private final MetricsService metricsService;`.\n```suggestion\n@RequiredArgsConstructor\npublic class ApiEventsMetricsConfig {\n\n    private final MetricsService metricsService;\n    private final static int API_TOKEN_INDEX = 0;\n\n    @Around("@annotation(metricsConfigAnnotation)")\n    public Object addMetric(ProceedingJoinPoint pjp, MetricsConfigAnnotation metricsConfigAnnotation) throws Throwable {\n```', 'relevant_file': 'filename.java', 'relevant_lines_start': 14, 'relevant_lines_end': 21}
error: 400: 400 Bad request - Note {:line_code=>["can't be blank", "must be a valid line code"]}
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/gitlab/exceptions.py", line 336, in wrapped_f
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/gitlab/mixins.py", line 300, in create
    server_data = self.gitlab.http_post(path, post_data=data, files=files, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/gitlab/client.py", line 1021, in http_post
    result = self.http_request(
  File "/usr/local/lib/python3.10/site-packages/gitlab/client.py", line 794, in http_request
    raise gitlab.exceptions.GitlabHttpError(
gitlab.exceptions.GitlabHttpError: 400: 400 Bad request - Note {:line_code=>["can't be blank", "must be a valid line code"]}

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/app/pr_agent/git_providers/gitlab_provider.py", line 226, in publish_code_suggestions
    self.send_inline_comment(body, edit_type, found, relevant_file, relevant_line_in_file, source_line_no,
  File "/app/pr_agent/git_providers/gitlab_provider.py", line 178, in send_inline_comment
    self.mr.discussions.create({'body': body,
  File "/usr/local/lib/python3.10/site-packages/gitlab/exceptions.py", line 338, in wrapped_f
    raise error(e.error_message, e.response_code, e.response_body) from e
gitlab.exceptions.GitlabCreateError: 400: 400 Bad request - Note {:line_code=>["can't be blank", "must be a valid line code"]}
ERROR:root:Could not publish code suggestion:
suggestion: {'body': "**Suggestion:** The import statements should be ordered alphabetically to make the code easier to read. It's a common convention in Java and many IDEs have a feature to do this automatically.\n```suggestion\npackage .domain.model;\n\nimport com.fasterxml.jackson.annotation.JsonIgnore;\nimport jakarta.persistence.Column;\nimport jakarta.persistence.EntityListeners;\nimport jakarta.persistence.MappedSuperclass;\nimport jakarta.validation.constraints.NotNull;\nimport lombok.Getter;\nimport lombok.Setter;\nimport org.springframework.data.annotation.CreatedBy;\nimport java.time.LocalDateTime;\n```", 'relevant_file': 'filename.java', 'relevant_lines_start': 1, 'relevant_lines_end': 11}
error: 400: 400 Bad request - Note {:line_code=>["can't be blank", "must be a valid line code"]}
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/gitlab/exceptions.py", line 336, in wrapped_f
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/gitlab/mixins.py", line 300, in create
    server_data = self.gitlab.http_post(path, post_data=data, files=files, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/gitlab/client.py", line 1021, in http_post
    result = self.http_request(
  File "/usr/local/lib/python3.10/site-packages/gitlab/client.py", line 794, in http_request
    raise gitlab.exceptions.GitlabHttpError(
gitlab.exceptions.GitlabHttpError: 400: 400 Bad request - Note {:line_code=>["can't be blank", "must be a valid line code"]}

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/app/pr_agent/git_providers/gitlab_provider.py", line 226, in publish_code_suggestions
    self.send_inline_comment(body, edit_type, found, relevant_file, relevant_line_in_file, source_line_no,
  File "/app/pr_agent/git_providers/gitlab_provider.py", line 178, in send_inline_comment
    self.mr.discussions.create({'body': body,
  File "/usr/local/lib/python3.10/site-packages/gitlab/exceptions.py", line 338, in wrapped_f
    raise error(e.error_message, e.response_code, e.response_body) from e
gitlab.exceptions.GitlabCreateError: 400: 400 Bad request - Note {:line_code=>["can't be blank", "must be a valid line code"]}
pushpanktugnawat commented 1 year ago

Adding a public repo: if it's really needed, I'll need some time to create a public repo with messy code and send you the link, but currently it's not available.

idavidov commented 1 year ago

@pushpanktugnawat I suspected your issue wasn't fully reproduced. Could you please add -e CONFIG.VERBOSITY_LEVEL=2 to the docker command and send me the verbose log?

pushpanktugnawat commented 1 year ago

Surely, @idavidov. I am sorry for being late, but I feel the /improve endpoint itself really needs improvement, as it fails again and again, mostly with the error I mentioned above. But today I see a new one.

Response (should be a valid JSON, and nothing else):
```json
--- Logging error ---
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/litellm/timeout.py", line 44, in wrapper
    result = future.result(timeout=local_timeout_duration)
  File "/usr/local/lib/python3.10/concurrent/futures/_base.py", line 460, in result
    raise TimeoutError()
concurrent.futures._base.TimeoutError

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/pr_agent/algo/ai_handler.py", line 85, in chat_completion
    response = await acompletion(
  File "/usr/local/lib/python3.10/site-packages/litellm/main.py", line 34, in acompletion
    return await loop.run_in_executor(None, func)
  File "/usr/local/lib/python3.10/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/usr/local/lib/python3.10/site-packages/litellm/utils.py", line 152, in wrapper
    raise e
  File "/usr/local/lib/python3.10/site-packages/litellm/utils.py", line 141, in wrapper
    result = original_function(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/litellm/timeout.py", line 47, in wrapper
    raise exception_to_raise(f"A timeout error occurred. The function call took longer than {local_timeout_duration} second(s).")
openai.error.Timeout: A timeout error occurred. The function call took longer than 180 second(s).

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/logging/__init__.py", line 1100, in emit
    msg = self.format(record)
  File "/usr/local/lib/python3.10/logging/__init__.py", line 943, in format
    return fmt.format(record)
  File "/usr/local/lib/python3.10/logging/__init__.py", line 678, in format
    record.message = record.getMessage()
  File "/usr/local/lib/python3.10/logging/__init__.py", line 368, in getMessage
    msg = msg % self.args
TypeError: not all arguments converted during string formatting
Call stack:
  File "/app/pr_agent/cli.py", line 45, in <module>
    run()
  File "/app/pr_agent/cli.py", line 39, in run
    result = asyncio.run(PRAgent().handle_request(args.pr_url, command + " " + " ".join(args.rest)))
  File "/usr/local/lib/python3.10/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.10/asyncio/base_events.py", line 636, in run_until_complete
    self.run_forever()
  File "/usr/local/lib/python3.10/asyncio/base_events.py", line 603, in run_forever
    self._run_once()
  File "/usr/local/lib/python3.10/asyncio/base_events.py", line 1909, in _run_once
    handle._run()
  File "/usr/local/lib/python3.10/asyncio/events.py", line 80, in _run
    self._context.run(self._callback, *self._args)
  File "/app/pr_agent/agent/pr_agent.py", line 79, in handle_request
    await command2class[action](pr_url, args=args).run()
  File "/app/pr_agent/tools/pr_code_suggestions.py", line 50, in run
    await retry_with_fallback_models(self._prepare_prediction)
  File "/app/pr_agent/algo/pr_processing.py", line 217, in retry_with_fallback_models
    return await f(model)
  File "/app/pr_agent/tools/pr_code_suggestions.py", line 68, in _prepare_prediction
    self.prediction = await self._get_prediction(model)
  File "/app/pr_agent/tools/pr_code_suggestions.py", line 79, in _get_prediction
    response, finish_reason = await self.ai_handler.chat_completion(model=model, temperature=0.2,
  File "/app/pr_agent/algo/ai_handler.py", line 97, in chat_completion
    logging.error("Error during OpenAI inference: ", e)
Message: 'Error during OpenAI inference: '
Arguments: (Timeout(message='A timeout error occurred. The function call took longer than 180 second(s).', http_status=None, request_id=None),)
WARNING:root:Failed to generate prediction with gpt-3.5-turbo-16k: Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/litellm/timeout.py", line 44, in wrapper
    result = future.result(timeout=local_timeout_duration)
  File "/usr/local/lib/python3.10/concurrent/futures/_base.py", line 460, in result
    raise TimeoutError()
concurrent.futures._base.TimeoutError

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/pr_agent/algo/pr_processing.py", line 217, in retry_with_fallback_models
    return await f(model)
  File "/app/pr_agent/tools/pr_code_suggestions.py", line 68, in _prepare_prediction
    self.prediction = await self._get_prediction(model)
  File "/app/pr_agent/tools/pr_code_suggestions.py", line 79, in _get_prediction
    response, finish_reason = await self.ai_handler.chat_completion(model=model, temperature=0.2,
  File "/app/pr_agent/algo/ai_handler.py", line 85, in chat_completion
    response = await acompletion(
  File "/usr/local/lib/python3.10/site-packages/litellm/main.py", line 34, in acompletion
    return await loop.run_in_executor(None, func)
  File "/usr/local/lib/python3.10/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/usr/local/lib/python3.10/site-packages/litellm/utils.py", line 152, in wrapper
    raise e
  File "/usr/local/lib/python3.10/site-packages/litellm/utils.py", line 141, in wrapper
    result = original_function(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/litellm/timeout.py", line 47, in wrapper
    raise exception_to_raise(f"A timeout error occurred. The function call took longer than {local_timeout_duration} second(s).")
openai.error.Timeout: A timeout error occurred. The function call took longer than 180 second(s).

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/litellm/timeout.py", line 44, in wrapper
    result = future.result(timeout=local_timeout_duration)
  File "/usr/local/lib/python3.10/concurrent/futures/_base.py", line 460, in result
    raise TimeoutError()
concurrent.futures._base.TimeoutError

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/app/pr_agent/cli.py", line 45, in <module>
    run()
  File "/app/pr_agent/cli.py", line 39, in run
    result = asyncio.run(PRAgent().handle_request(args.pr_url, command + " " + " ".join(args.rest)))
  File "/usr/local/lib/python3.10/asyncio/runners.py", line 44, in run
    return loop.run_until_complete(main)
  File "/usr/local/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete
    return future.result()
  File "/app/pr_agent/agent/pr_agent.py", line 79, in handle_request
    await command2class[action](pr_url, args=args).run()
  File "/app/pr_agent/tools/pr_code_suggestions.py", line 50, in run
    await retry_with_fallback_models(self._prepare_prediction)
  File "/app/pr_agent/algo/pr_processing.py", line 217, in retry_with_fallback_models
    return await f(model)
  File "/app/pr_agent/tools/pr_code_suggestions.py", line 68, in _prepare_prediction
    self.prediction = await self._get_prediction(model)
  File "/app/pr_agent/tools/pr_code_suggestions.py", line 79, in _get_prediction
    response, finish_reason = await self.ai_handler.chat_completion(model=model, temperature=0.2,
  File "/app/pr_agent/algo/ai_handler.py", line 85, in chat_completion
    response = await acompletion(
  File "/usr/local/lib/python3.10/site-packages/litellm/main.py", line 34, in acompletion
    return await loop.run_in_executor(None, func)
  File "/usr/local/lib/python3.10/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/usr/local/lib/python3.10/site-packages/litellm/utils.py", line 152, in wrapper
    raise e
  File "/usr/local/lib/python3.10/site-packages/litellm/utils.py", line 141, in wrapper
    result = original_function(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/litellm/timeout.py", line 47, in wrapper
    raise exception_to_raise(f"A timeout error occurred. The function call took longer than {local_timeout_duration} second(s).")
openai.error.Timeout: A timeout error occurred. The function call took longer than 180 second(s).
idavidov commented 1 year ago

This one is a new problem; I'll look into it as well. But if you are able to also reproduce the original issue with debug verbosity, that would be great.

pushpanktugnawat commented 1 year ago

Yeah, I am reproducing it and will share the logs with you soon.

GadiZimerman commented 1 year ago

@pushpanktugnawat https://www.codium.ai/blog/hosted-pr-agent-for-gitlab-teams-enterprise/

myvo commented 12 months ago

I run with the docker command below with:

mrT23 commented 12 months ago

@myvo I just tried the docker command. It works.

Triple-check that you are using a valid GitLab PERSONAL_ACCESS_TOKEN. They sometimes expire, and you need to renew them: https://docs.gitlab.com/ee/user/profile/personal_access_tokens.html
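If you want to sanity-check the token itself outside of pr-agent, a quick check against the GitLab API (swap in your own host for a self-hosted instance):

curl --header "PRIVATE-TOKEN: <your_token>" "https://gitlab.com/api/v4/user"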

docker run --rm -it -e OPENAI.KEY=... -e CONFIG.GIT_PROVIDER=gitlab -e GITLAB.PERSONAL_ACCESS_TOKEN=... codiumai/pr-agent --pr_url=https://gitlab.com/codiumai/pr-agent/-/merge_requests/2 review

WARNING: The requested image's platform (linux/amd64) does not match the detected host platform (linux/arm64/v8) and no specific platform was requested
: MADV_DONTNEED does not work (memset will be used instead)
: (This is the expected behaviour if you are running under QEMU)
INFO:root:Reviewing PR: https://gitlab.com/codiumai/pr-agent/-/merge_requests/2 ...
INFO:root:Getting PR diff...
INFO:root:Getting AI prediction...
PR Analysis:
  Main theme: |-
    This PR primarily focuses on reverting some changes in the GithubProvider class and updating the README file.
  PR summary: |-
    The PR reverts changes in the GithubProvider class, specifically in the 'publish_code_suggestions' method. The method has been updated to handle exceptions better and include more information in the post parameters. Additionally, the README file has been updated with new image URLs and minor text changes.
  Type of PR: |-
    Bug fix
  Relevant tests added: |-
    No
  Estimated effort to review [1-5]: |-
    2
    The PR is not very large and the changes are straightforward. However, the changes in the 'publish_code_suggestions' method need to be carefully reviewed to ensure that they do not introduce any bugs.
PR Feedback:
  General suggestions: |-
    The PR seems to be well-structured and the changes are clearly explained. However, it would be beneficial to add tests to verify the changes in the 'publish_code_suggestions' method. Also, it would be good to ensure that the updated URLs in the README file are working as expected.
  Code feedback:
    - relevant file: |-
        pr_agent/git_providers/github_provider.py
      suggestion: |-
        Consider using a more specific exception instead of the generic Exception. This can help in identifying and handling specific errors that may occur. [important]
      relevant line: |-
        + except Exception as e:
    - relevant file: |-
        pr_agent/git_providers/github_provider.py
      suggestion: |-
        It would be good to add a comment explaining why the 'commit_id' is being added to the post parameters. This can help in understanding the code better. [medium]
      relevant line: |-
        + "commit_id": self.last_commit_id._identity,
  Security concerns: |-
    No
    The changes in this PR do not seem to introduce any security concerns. However, it is always a good practice to review the code thoroughly for any potential security issues.
stop
rliskunov commented 11 months ago

@mrT23 Hello! I have the same GitlabAuthenticationError with a similar environment (self-hosted GitLab and a private project).

I've verified the token with cURL: curl --header "PRIVATE-TOKEN: $GITLAB_PAT" "https://gitlab.example.com/api/v4/user", and it worked. How can I solve the 401 Unauthorized problem?

mrT23 commented 11 months ago

Hi @rliskunov

Regarding the 401 issue, I am not sure. I tested GitLab on a private project in my repo, and it works.

What I suggest is that you find some open-source repo and try there. If it doesn't work, open a new issue and share the repo URL and your logs. I will also try there and see what the difference is.

kristofenyi commented 6 months ago

Hi, did anyone resolve the 401 Unauthorized problem? I have also confirmed the token with

curl --header "PRIVATE-TOKEN: $GITLAB_PAT" "https://gitlab.example.com/api/v4/user"

yet I am getting

... {"text": "Failed to validate secret\n", ....
"POST /webhook HTTP/1.1" 401 Unauthorized

I have the secrets matching in the config.

Abhishek-569 commented 2 months ago

Is there a docker command for models other than OpenAI, for example Groq, on GitLab?

paolomainardi commented 1 month ago

If you are facing trouble with private GitLab instances, remember to add this variable from a CI job:

gitlab__url=$CI_SERVER_PROTOCOL://$CI_SERVER_FQDN

or just pass it from the CLI with your GitLab instance host.
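For example, with the docker invocation used earlier in this thread (assuming the dot-notation environment override also applies to the [gitlab] url setting):

docker run --rm -e CONFIG.GIT_PROVIDER=gitlab -e GITLAB.URL=https://gitlab.example.com -e GITLAB.PERSONAL_ACCESS_TOKEN={token} -e OPENAI.KEY={token} codiumai/pr-agent --pr_url {url} review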

paolomainardi commented 1 month ago

Still a WIP, but it may be useful for someone; implemented as a GitLab component:

spec:
  inputs:
    job-prefix:
      description: "Define a prefix for the job name"
      default: codium
    stage:
      description: "Define the stage for the job"
      default: .pre
    pr-agent-version:
      description: "Define the version of the PR Agent to use"
      default: latest
    auto_review:
      description: "Enable/Disable auto review"
      type: boolean
      default: true
    auto_describe:
      description: "Enable/Disable auto describe"
      type: boolean
      default: true
    gitlab_pat:
      description: "GitLab Personal Access Token"
    auto_improve:
      description: "Enable/Disable auto improve"
      type: boolean
      default: true
    # Supported models: https://github.com/Codium-ai/pr-agent/blob/main/pr_agent/algo/__init__.py
    # At the moment we just support Openai and Anthropic.
    ai_model:
      description: "Model to use for the PR Agent, at the moment we support openai and anthropic."
      default: openai
    ai_key:
      description: "API Key for the AI Provider"
---
"$[[ inputs.job-prefix ]].mr-agent":
  stage: "$[[ inputs.stage ]]"
  image:
    name: codiumai/pr-agent:$[[ inputs.pr-agent-version ]]
    entrypoint: [""]
  script:
    - |
      cd /app
      echo "Running PR Agent action step"
      export MR_URL="$CI_MERGE_REQUEST_PROJECT_URL/merge_requests/$CI_MERGE_REQUEST_IID"
      echo "MR_URL=$MR_URL"
      export config__git_provider="gitlab"
      export gitlab__PERSONAL_ACCESS_TOKEN=$[[ inputs.gitlab_pat ]]
      export gitlab__url=$CI_SERVER_PROTOCOL://$CI_SERVER_FQDN
      export ai_model=$[[ inputs.ai_model ]]

      # Handle the AI provider key.
      if [ "$ai_model" == "openai" ]; then
        export openai__key=$[[ inputs.ai_key ]]
      elif [ "$ai_model" == "anthropic" ]; then
        echo "Using Anthropic model"
        export config__model="anthropic/claude-3-5-sonnet-20240620"
        export config__model_turbo="anthropic/claude-3-5-sonnet-20240620"
        export anthropic__key=$[[ inputs.ai_key ]]
      fi

      if $[[ inputs.auto_describe ]]; then
        python -m pr_agent.cli --pr_url="$MR_URL" describe
      fi
      if $[[ inputs.auto_review ]]; then
        python -m pr_agent.cli --pr_url="$MR_URL" review
      fi
      if $[[ inputs.auto_improve ]]; then
        python -m pr_agent.cli --pr_url="$MR_URL" improve
      fi
  allow_failure: true
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"