r-sugi / nextjs-tdd-template

https://nextjs-tdd-templatestorybook-rsugis-projects.vercel.app

CI: add ChatGPT code review #150

Closed · r-sugi closed this 3 months ago

r-sugi commented 3 months ago

doc: https://github.com/Codium-ai/pr-agent?tab=readme-ov-file
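
For reference, the GitHub Actions wiring for pr-agent follows the pattern in the linked README. The sketch below is a minimal version of that workflow; the action ref (`Codium-ai/pr-agent@main`), the trigger events, and the `OPENAI_KEY` secret name are taken from the README as I recall it and should be checked against the current docs.

```yaml
# .github/workflows/pr-agent.yml -- minimal sketch, not the exact workflow used in this repo
name: pr-agent

on:
  pull_request:
    types: [opened, reopened, ready_for_review]
  issue_comment:

jobs:
  pr_agent:
    # Skip events triggered by bots (e.g. pr-agent's own comments)
    if: ${{ github.event.sender.type != 'Bot' }}
    runs-on: ubuntu-latest
    permissions:
      issues: write
      pull-requests: write
      contents: write
    steps:
      - name: Run pr-agent on the pull request
        uses: Codium-ai/pr-agent@main
        env:
          OPENAI_KEY: ${{ secrets.OPENAI_KEY }}       # OpenAI API key stored as a repository secret
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}   # lets pr-agent post review comments on the PR
```

With the `OPENAI_KEY` secret set on the repository, each opened PR should trigger the pr-agent job and receive automated review comments.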

r-sugi commented 3 months ago

https://github.com/r-sugi/nextjs-tdd-template/actions/runs/10500782475/job/29090061857

{"text": "Error during OpenAI inference: \n", "record": {"elapsed": {"repr": "0:00:14.156999", "seconds": 14.156999}, "exception": null, "extra": {}, "file": {"name": "litellm_ai_handler.py", "path": "/app/pr_agent/algo/ai_handlers/litellm_ai_handler.py"}, "function": "chat_completion", "level": {"icon": "⚠️", "name": "WARNING", "no": 30}, "line": 154, "message": "Error during OpenAI inference: ", "module": "litellm_ai_handler", "name": "pr_agent.algo.ai_handlers.litellm_ai_handler", "process": {"id": 7, "name": "MainProcess"}, "thread": {"id": 139791585168256, "name": "MainThread"}, "time": {"repr": "2024-08-22 02:43:34.884252+00:00", "timestamp": 1724294614.884252}}}
{"text": "Failed to generate prediction with gpt-4o-2024-05-13: Traceback (most recent call last):\n  File \"/usr/local/lib/python3.10/site-packages/litellm/main.py\", line 401, in acompletion\n    response = await init_response\n  File \"/usr/local/lib/python3.10/site-packages/litellm/llms/openai.py\", line 1124, in acompletion\n    raise e\n  File \"/usr/local/lib/python3.10/site-packages/litellm/llms/openai.py\", line 1079, in acompletion\n    headers, response = await self.make_openai_chat_completion_request(\n  File \"/usr/local/lib/python3.10/site-packages/litellm/llms/openai.py\", line 781, in make_openai_chat_completion_request\n    raise e\n  File \"/usr/local/lib/python3.10/site-packages/litellm/llms/openai.py\", line 772, in make_openai_chat_completion_request\n    await openai_aclient.chat.completions.with_raw_response.create(\n  File \"/usr/local/lib/python3.10/site-packages/openai/_legacy_response.py\", line 367, in wrapped\n    return cast(LegacyAPIResponse[R], await func(*args, **kwargs))\n  File \"/usr/local/lib/python3.10/site-packages/openai/resources/chat/completions.py\", line 1339, in create\n    return await self._post(\n  File \"/usr/local/lib/python3.10/site-packages/openai/_base_client.py\", line 1815, in post\n    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)\n  File \"/usr/local/lib/python3.10/site-packages/openai/_base_client.py\", line 1509, in request\n    return await self._request(\n  File \"/usr/local/lib/python3.10/site-packages/openai/_base_client.py\", line 1610, in _request\n    raise self._make_status_error_from_response(err.response) from None\nopenai.NotFoundError: Error code: 404 - {'error': {'message': 'The model `gpt-4o-2024-05-13` does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n  File \"/usr/local/lib/python3.10/site-packages/tenacity/_asyncio.py\", line 50, in __call__\n    result = await fn(*args, **kwargs)\n  File \"/app/pr_agent/algo/ai_handlers/litellm_ai_handler.py\", line 152, in chat_completion\n    response = await acompletion(**kwargs)\n  File \"/usr/local/lib/python3.10/site-packages/litellm/utils.py\", line 1579, in wrapper_async\n    raise e\n  File \"/usr/local/lib/python3.10/site-packages/litellm/utils.py\", line 1399, in wrapper_async\n    result = await original_function(*args, **kwargs)\n  File \"/usr/local/lib/python3.10/site-packages/litellm/main.py\", line 424, in acompletion\n    raise exception_type(\n  File \"/usr/local/lib/python3.10/site-packages/litellm/utils.py\", line 8301, in exception_type\n    raise e\n  File \"/usr/local/lib/python3.10/site-packages/litellm/utils.py\", line 6558, in exception_type\n    raise NotFoundError(\nlitellm.exceptions.NotFoundError: litellm.NotFoundError: OpenAIException - The model `gpt-4o-2024-05-13` does not exist or you do not have access to it.\n\nThe above exception was the direct cause of the following exception:\n\nTraceback (most recent call last):\n  File \"/app/pr_agent/algo/pr_processing.py\", line 329, in retry_with_fallback_models\n    return await f(model)\n  File \"/app/pr_agent/tools/pr_code_suggestions.py\", line 513, in _prepare_prediction_extended\n    prediction_list = await asyncio.gather(\n  File \"/app/pr_agent/tools/pr_code_suggestions.py\", line 315, in _get_prediction\n    response, finish_reason = await self.ai_handler.chat_completion(\n  File 
\"/usr/local/lib/python3.10/site-packages/tenacity/_asyncio.py\", line 88, in async_wrapped\n    return await fn(*args, **kwargs)\n  File \"/usr/local/lib/python3.10/site-packages/tenacity/_asyncio.py\", line 47, in __call__\n    do = self.iter(retry_state=retry_state)\n  File \"/usr/local/lib/python3.10/site-packages/tenacity/__init__.py\", line 326, in iter\n    raise retry_exc from fut.exception()\ntenacity.RetryError: RetryError[<Future at 0x7f23b82ac2e0 state=finished raised NotFoundError>]\n\n", "record": {"elapsed": {"repr": "0:00:14.157732", "seconds": 14.157732}, "exception": null, "extra": {}, "file": {"name": "pr_processing.py", "path": "/app/pr_agent/algo/pr_processing.py"}, "function": "retry_with_fallback_models", "level": {"icon": "⚠️", "name": "WARNING", "no": 30}, "line": 331, "message": "Failed to generate prediction with gpt-4o-2024-05-13: Traceback (most recent call last):\n  File \"/usr/local/lib/python3.10/site-packages/litellm/main.py\", line 401, in acompletion\n    response = await init_response\n  File \"/usr/local/lib/python3.10/site-packages/litellm/llms/openai.py\", line 1124, in acompletion\n    raise e\n  File \"/usr/local/lib/python3.10/site-packages/litellm/llms/openai.py\", line 1079, in acompletion\n    headers, response = await self.make_openai_chat_completion_request(\n  File \"/usr/local/lib/python3.10/site-packages/litellm/llms/openai.py\", line 781, in make_openai_chat_completion_request\n    raise e\n  File \"/usr/local/lib/python3.10/site-packages/litellm/llms/openai.py\", line 772, in make_openai_chat_completion_request\n    await openai_aclient.chat.completions.with_raw_response.create(\n  File \"/usr/local/lib/python3.10/site-packages/openai/_legacy_response.py\", line 367, in wrapped\n    return cast(LegacyAPIResponse[R], await func(*args, **kwargs))\n  File \"/usr/local/lib/python3.10/site-packages/openai/resources/chat/completions.py\", line 1339, in create\n    return await self._post(\n  File \"/usr/local/lib/python3.10/site-packages/openai/_base_client.py\", line 1815, in post\n    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)\n  File \"/usr/local/lib/python3.10/site-packages/openai/_base_client.py\", line 1509, in request\n    return await self._request(\n  File \"/usr/local/lib/python3.10/site-packages/openai/_base_client.py\", line 1610, in _request\n    raise self._make_status_error_from_response(err.response) from None\nopenai.NotFoundError: Error code: 404 - {'error': {'message': 'The model `gpt-4o-2024-05-13` does not exist or you do not have access to it.', 'type': 'invalid_request_error', 'param': None, 'code': 'model_not_found'}}\n\nDuring handling of the above exception, another exception occurred:\n\nTraceback (most recent call last):\n  File \"/usr/local/lib/python3.10/site-packages/tenacity/_asyncio.py\", line 50, in __call__\n    result = await fn(*args, **kwargs)\n  File \"/app/pr_agent/algo/ai_handlers/litellm_ai_handler.py\", line 152, in chat_completion\n    response = await acompletion(**kwargs)\n  File \"/usr/local/lib/python3.10/site-packages/litellm/utils.py\", line 1579, in wrapper_async\n    raise e\n  File \"/usr/local/lib/python3.10/site-packages/litellm/utils.py\", line 1399, in wrapper_async\n    result = await original_function(*args, **kwargs)\n  File \"/usr/local/lib/python3.10/site-packages/litellm/main.py\", line 424, in acompletion\n    raise exception_type(\n  File \"/usr/local/lib/python3.10/site-packages/litellm/utils.py\", line 8301, in exception_type\n    raise e\n  
File \"/usr/local/lib/python3.10/site-packages/litellm/utils.py\", line 6558, in exception_type\n    raise NotFoundError(\nlitellm.exceptions.NotFoundError: litellm.NotFoundError: OpenAIException - The model `gpt-4o-2024-05-13` does not exist or you do not have access to it.\n\nThe above exception was the direct cause of the following exception:\n\nTraceback (most recent call last):\n  File \"/app/pr_agent/algo/pr_processing.py\", line 329, in retry_with_fallback_models\n    return await f(model)\n  File \"/app/pr_agent/tools/pr_code_suggestions.py\", line 513, in _prepare_prediction_extended\n    prediction_list = await asyncio.gather(\n  File \"/app/pr_agent/tools/pr_code_suggestions.py\", line 315, in _get_prediction\n    response, finish_reason = await self.ai_handler.chat_completion(\n  File \"/usr/local/lib/python3.10/site-packages/tenacity/_asyncio.py\", line 88, in async_wrapped\n    return await fn(*args, **kwargs)\n  File \"/usr/local/lib/python3.10/site-packages/tenacity/_asyncio.py\", line 47, in __call__\n    do = self.iter(retry_state=retry_state)\n  File \"/usr/local/lib/python3.10/site-packages/tenacity/__init__.py\", line 326, in iter\n    raise retry_exc from fut.exception()\ntenacity.RetryError: RetryError[<Future at 0x7f23b82ac2e0 state=finished raised NotFoundError>]\n", "module": "pr_processing", "name": "pr_agent.algo.pr_processing", "process": {"id": 7, "name": "MainProcess"}, "thread": {"id": 139791585168256, "name": "MainThread"}, "time": {"repr": "2024-08-22 02:43:34.884985+00:00", "timestamp": 1724294614.884985}}}
{"text": "Failed to generate code suggestions for PR, error: RetryError[<Future at 0x7f23b82ac2e0 state=finished raised NotFoundError>]\n", "record": {"elapsed": {"repr": "0:00:14.157967", "seconds": 14.157967}, "exception": null, "extra": {}, "file": {"name": "pr_code_suggestions.py", "path": "/app/pr_agent/tools/pr_code_suggestions.py"}, "function": "run", "level": {"icon": "❌", "name": "ERROR", "no": 40}, "line": 170, "message": "Failed to generate code suggestions for PR, error: RetryError[<Future at 0x7f23b82ac2e0 state=finished raised NotFoundError>]", "module": "pr_code_suggestions", "name": "pr_agent.tools.pr_code_suggestions", "process": {"id": 7, "name": "MainProcess"}, "thread": {"id": 139791585168256, "name": "MainThread"}, "time": {"repr": "2024-08-22 02:43:34.885220+00:00", "timestamp": 1724294614.88522}}}
r-sugi commented 3 months ago

It worked after I added billing credit: https://github.com/r-sugi/nextjs-tdd-template/pull/153

After adding $10 in credit, one run's worth of usage was consumed. (Screenshot 2024-08-22 13 10 06)