mealie-recipes / mealie

Mealie is a self-hosted recipe manager and meal planner with a REST API backend and a reactive frontend built in Vue, designed to give the whole family a pleasant user experience. Easily add recipes to your database by providing a URL, and Mealie will automatically import the relevant data, or add a family recipe with the UI editor.
https://docs.mealie.io
GNU Affero General Public License v3.0

[BUG] - OpenAI ingredient parser fails in some cases #3792

Closed: simgunz closed this issue 3 days ago

simgunz commented 4 days ago

First Check

What is the issue you are experiencing?

For some recipes the OpenAI ingredient parser fails. No error is reported and no ingredients are displayed.

The error should at least be caught and an error message shown to the user.

For the provided recipe, the first ingredient seems to be the one causing the issue: 1 Confezione di Filetti di Merluzzo Surgelati (4 da 100 grammi ciascuno), i.e. 1 package of frozen cod fillets (4 fillets of 100 grams each).

Steps to Reproduce

  1. Edit a recipe (https://www.cucinare.it/ricetta/filetti-di-merluzzo-surgelati-alla-pizzaiola)
  2. Parse ingredients
  3. Select OpenAI as the ingredient parser
  4. Click "Parse all"

GPT model: gpt-3.5-turbo
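For anyone who wants to reproduce this outside the UI, here is a hedged sketch that sends the failing ingredient straight to the ingredient parser endpoint. The route path, payload shape, and token handling are assumptions inferred from mealie/routes/parser/ingredient_parser.py in the traceback below, not confirmed API documentation.

import asyncio

import httpx

MEALIE_URL = "http://localhost:9000"   # assumption: local instance
API_TOKEN = "<your-api-token>"         # assumption: a Mealie API token

INGREDIENTS = [
    "1 Confezione di Filetti di Merluzzo Surgelati (4 da 100 grammi ciascuno)",
]


async def main() -> None:
    async with httpx.AsyncClient(
        base_url=MEALIE_URL,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=120,
    ) as client:
        resp = await client.post(
            "/api/parser/ingredients",                               # assumed route
            json={"parser": "openai", "ingredients": INGREDIENTS},   # assumed payload
        )
        print(resp.status_code)
        print(resp.json())


asyncio.run(main())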

Please provide relevant logs

[...] The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/app/mealie/services/openai/openai.py", line 123, in get_response
    response = await self._get_raw_response(prompt, message, temperature, force_json_response)
  File "/app/mealie/services/openai/openai.py", line 104, in _get_raw_response
    return await client.chat.completions.create(
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 1214, in create
    return await self._post(
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/openai/_base_client.py", line 1790, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/openai/_base_client.py", line 1493, in request
    return await self._request(
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/openai/_base_client.py", line 1531, in _request
    return await self._retry_request(
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/openai/_base_client.py", line 1615, in _retry_request
    return await self._request(
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/openai/_base_client.py", line 1531, in _request
    return await self._retry_request(
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/openai/_base_client.py", line 1615, in _retry_request
    return await self._request(
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/openai/_base_client.py", line 1541, in _request
    raise APITimeoutError(request=request) from err
openai.APITimeoutError: Request timed out.

ERROR 2024-06-25T08:58:03 - Exception in ASGI application
Traceback (most recent call last):
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/uvicorn/protocols/http/httptools_impl.py", line 399, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/uvicorn/middleware/proxy_headers.py", line 70, in __call__
    return await self.app(scope, receive, send)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/fastapi/applications.py", line 1054, in __call__
    await super().__call__(scope, receive, send)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/applications.py", line 123, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 186, in __call__
    raise exc
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/middleware/errors.py", line 164, in __call__
    await self.app(scope, receive, _send)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/middleware/gzip.py", line 24, in __call__
    await responder(scope, receive, send)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/middleware/gzip.py", line 44, in __call__
    await self.app(scope, receive, self.send_with_gzip)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
    await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/routing.py", line 756, in __call__
    await self.middleware_stack(scope, receive, send)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/routing.py", line 776, in app
    await route.handle(scope, receive, send)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/routing.py", line 297, in handle
    await self.app(scope, receive, send)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/routing.py", line 77, in app
    await wrap_app_handling_exceptions(app, request)(scope, receive, send)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
    raise exc
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
    await app(scope, receive, sender)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/starlette/routing.py", line 72, in app
    response = await func(request)
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/fastapi/routing.py", line 278, in app
    raw_response = await run_endpoint_function(
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
    return await dependant.call(**values)
  File "/app/mealie/routes/parser/ingredient_parser.py", line 16, in parse_ingredients
    return await parser.parse(ingredients.ingredients)
  File "/app/mealie/services/parser_services/openai/parser.py", line 99, in parse
    response = await self._parse(ingredients)
  File "/app/mealie/services/parser_services/openai/parser.py", line 84, in _parse
    responses = [
  File "/app/mealie/services/parser_services/openai/parser.py", line 85, in <listcomp>
    OpenAIIngredients.model_validate_json(response_json) for response_json in responses_json if responses_json
  File "/opt/pysetup/.venv/lib/python3.10/site-packages/pydantic/main.py", line 580, in model_validate_json
    return cls.__pydantic_validator__.validate_json(json_data, strict=strict, context=context)
pydantic_core._pydantic_core.ValidationError: 1 validation error for OpenAIIngredients
  JSON input should be string, bytes or bytearray [type=json_type, input_value=None, input_type=NoneType]
    For further information visit https://errors.pydantic.dev/2.7/v/json_type

INFO 2024-06-25T08:58:29 - [127.0.0.1:34876] 200 OK "GET /api/app/about HTTP/1.1"
INFO 2024-06-25T08:59:00 - [127.0.0.1:55114] 200 OK "GET /api/app/about HTTP/1.1"
INFO 2024-06-25T08:59:32 - [127.0.0.1:48592] 200 OK "GET /api/app/about HTTP/1.1"

Mealie Version

v1.9.0 d96c36333b9cb9461c5dee96ae28b60d912b38fd

Deployment

Docker (Linux)

Additional Deployment Details

No response

michael-genson commented 4 days ago

Do you have any units in your unit store? Is it related to this?

https://github.com/mealie-recipes/mealie/pull/3769

michael-genson commented 4 days ago

I'm unable to reproduce this on nightly using 3.5-turbo; I got a response back. I will say that in the past I've had issues with 3.5, particularly with following directions, so you might consider bumping up to 4o instead.

Happy to keep this open to implement an alert that parsing failed, though
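
For the alert, here is a minimal FastAPI-style sketch of what surfacing the failure could look like at the endpoint level; the route, request model, and stub parser are simplified stand-ins rather than Mealie's actual implementation.

from fastapi import FastAPI, HTTPException
from openai import APITimeoutError
from pydantic import BaseModel

app = FastAPI()


class IngredientsRequest(BaseModel):
    ingredients: list[str]


class StubOpenAIParser:
    """Stand-in parser that simulates the timeout seen in the logs."""

    async def parse(self, ingredients: list[str]) -> list[dict]:
        raise APITimeoutError(request=None)  # the real parser would call OpenAI here


@app.post("/parser/ingredients")
async def parse_ingredients(req: IngredientsRequest):
    parser = StubOpenAIParser()
    try:
        return await parser.parse(req.ingredients)
    except APITimeoutError as e:
        # A 504 with a message lets the frontend show an alert instead of an empty ingredient list.
        raise HTTPException(status_code=504, detail="OpenAI request timed out") from e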

simgunz commented 3 days ago

The same happens with 4o.

I have tried these steps on a fresh Docker image:

Other things I have tried in the problematic instance:

INFO     2024-06-25T19:21:09 - Retrying request to /chat/completions in 0.900241 seconds
INFO     2024-06-25T19:21:09 - Retrying request to /chat/completions in 0.926891 seconds
INFO     2024-06-25T19:21:10 - HTTP Request: POST https://api.openai.com/v1/chat/completions "HTTP/1.1 200 OK"
INFO     2024-06-25T19:21:10 - HTTP Request: POST https://api.openai.com/v1/chat/completions "HTTP/1.1 200 OK"
INFO     2024-06-25T19:21:15 - Retrying request to /chat/completions in 1.943179 seconds
INFO     2024-06-25T19:21:15 - Retrying request to /chat/completions in 1.874338 seconds
INFO     2024-06-25T19:21:22 - HTTP Request: POST https://api.openai.com/v1/chat/completions "HTTP/1.1 200 OK"
INFO     2024-06-25T19:21:23 - HTTP Request: POST https://api.openai.com/v1/chat/completions "HTTP/1.1 200 OK"

michael-genson commented 3 days ago

What value is OPENAI_WORKERS set to? Also, do you have a usage limit or token limit that might be preventing requests from going through?

simgunz commented 1 day ago

I have the default value for OPENAI_WORKERS=2.
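
As a side note, here is a hedged sketch of the pattern a worker limit like OPENAI_WORKERS=2 usually implies, i.e. capping how many requests are in flight at once; this illustrates the general mechanism, not Mealie's actual code.

import asyncio

OPENAI_WORKERS = 2  # the default value mentioned above

semaphore = asyncio.Semaphore(OPENAI_WORKERS)


async def parse_one(ingredient: str) -> str:
    async with semaphore:  # at most OPENAI_WORKERS requests in flight at a time
        await asyncio.sleep(1)  # stand-in for the OpenAI chat completion call
        return f"parsed: {ingredient}"


async def parse_all(ingredients: list[str]) -> list[str]:
    return list(await asyncio.gather(*(parse_one(i) for i in ingredients)))


print(asyncio.run(parse_all(["frozen cod fillets", "tomato sauce", "oregano"])))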

Doing some further debugging, the error seems to be somehow related to the speed of the hardware. The problem happens on a Raspberry Pi 4; if I copy the Docker data/ folder to a Docker instance on my local notebook, everything works. Is there some request timeout that can be tuned?

michael-genson commented 1 day ago

Huh, weird that it's hardware dependent. But yeah, I can definitely make the timeout configurable.
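
A hedged sketch of what that could look like when the OpenAI client is constructed; the OPENAI_REQUEST_TIMEOUT variable name is hypothetical and not an existing Mealie setting, while timeout and max_retries are real options of the openai-python AsyncOpenAI constructor.

import os

from openai import AsyncOpenAI

# A larger default could help on slow hardware such as a Raspberry Pi 4.
request_timeout = float(os.environ.get("OPENAI_REQUEST_TIMEOUT", "60"))  # hypothetical env var

client = AsyncOpenAI(
    # api_key is read from the standard OPENAI_API_KEY environment variable
    timeout=request_timeout,  # seconds per HTTP request
    max_retries=2,            # produces the "Retrying request" INFO lines seen above
)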