Closed: saket424 closed this issue 4 months ago

I am having a weird error. Any ideas why? Both appear to be valid JPEG files, renderable in the browser.
I am using the microsoft/Florence-2-large model with OpenedAI-vision.
When I use this 1200px file, I get a 500 error from the node:
[{"role":"user","content":[{"type":"text","text":""},{"type":"image_url","image_url":{"url":"https://upload.wikimedia.org/wikipedia/commons/thumb/0/0e/ChrisLitherlandBourbonSt.jpg/1200px-ChrisLitherlandBourbonSt.jpg"}}]}]
7/1/2024, 4:10:54 PM node: chat completion msg : string[25] "500 status code (no body)"
And when I use the 1280px file, it works:
[{"role":"user","content":[{"type":"text","text":""},{"type":"image_url","image_url":{"url":"https://upload.wikimedia.org/wikipedia/commons/thumb/0/0e/ChrisLitherlandBourbonSt.jpg/1280px-ChrisLitherlandBourbonSt.jpg"}}]}]
"In this image we can see buildings, street poles, street lights, name boards, motor vehicles on the road, persons standing on the floor and sky."
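A quick way to compare what the two URLs actually return, independent of any service host, is to fetch both and ask Pillow to decode the bytes. Below is a generic sketch using requests and Pillow; the User-Agent header is an assumption worth testing, since Wikimedia may serve an error body instead of the image to clients without a descriptive one:

```python
# Fetch both thumbnail sizes and check whether Pillow can decode what
# actually comes back. If a server returns an HTML error page instead of
# JPEG bytes, Image.open() raises UnidentifiedImageError.
import io

import requests
from PIL import Image

BASE = "https://upload.wikimedia.org/wikipedia/commons/thumb/0/0e/ChrisLitherlandBourbonSt.jpg"
# Assumption: a descriptive User-Agent, per Wikimedia's user-agent policy.
HEADERS = {"User-Agent": "image-debug-script/0.1 (contact: you@example.com)"}

for size in ("1200px", "1280px"):
    url = f"{BASE}/{size}-ChrisLitherlandBourbonSt.jpg"
    r = requests.get(url, headers=HEADERS, timeout=30)
    print(size, r.status_code, r.headers.get("content-type"), len(r.content), "bytes")
    try:
        img = Image.open(io.BytesIO(r.content))
        print("  decoded:", img.format, img.size)
    except Exception as exc:
        print("  Pillow could not decode:", exc)
```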
@saket424 - Are you experiencing this issue with other Wikimedia images? Also, which OpenAI API Service Host provider are you using?
I am using this OpenAI API-compatible service host:
https://github.com/matatonic/openedai-vision
And yes, this issue seems to happen with other JPEG images as well as other Wikimedia images.
OK. Are there any obvious image-spec similarities between the images that succeed and the ones that fail? Go ahead and drop examples of each set here.
Also, you might want to try with another Service Host--OpenAI's official API is ideal for verification--and share your findings here.
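For example, something along these lines takes Node-RED out of the loop entirely. It is a sketch using the openai Python client; the base_url, api_key, and model values are placeholders to be replaced with whatever your service host actually exposes:

```python
# Send the same image_url payload straight to a chat-completions endpoint,
# bypassing Node-RED, to see which layer produces the 500.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:5006/v1",  # placeholder: your service host's URL
    api_key="sk-anything",                # placeholder: local hosts often ignore the key
)

resp = client.chat.completions.create(
    model="gpt-4o",  # placeholder: use the model name your host serves
    messages=[{
        "role": "user",
        "content": [
            {"type": "text", "text": "Describe this image."},
            {"type": "image_url", "image_url": {
                "url": "https://upload.wikimedia.org/wikipedia/commons/thumb/0/0e/ChrisLitherlandBourbonSt.jpg/1200px-ChrisLitherlandBourbonSt.jpg"
            }},
        ],
    }],
)
print(resp.choices[0].message.content)
```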
@allanbunch When I try the official OpenAI API as you suggest, all files work, which absolves the node-red-openai-api node and points instead to a problem with the OpenedAI service host.
Here is the error I am seeing in the service host:
server-1 | INFO: 192.168.64.111:37098 - "POST /v1/chat/completions HTTP/1.1" 500 Internal Server Error
server-1 | ERROR: Exception in ASGI application
server-1 | + Exception Group Traceback (most recent call last):
server-1 | | File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 87, in collapse_excgroups
server-1 | | yield
server-1 | | File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 190, in __call__
server-1 | | async with anyio.create_task_group() as task_group:
server-1 | | File "/usr/local/lib/python3.11/site-packages/anyio/_backends/_asyncio.py", line 680, in __aexit__
server-1 | | raise BaseExceptionGroup(
server-1 | | ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
server-1 | +-+---------------- 1 ----------------
server-1 | | Traceback (most recent call last):
server-1 | | File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 399, in run_asgi
server-1 | | result = await app( # type: ignore[func-returns-value]
server-1 | | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
server-1 | | File "/usr/local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 70, in __call__
server-1 | | return await self.app(scope, receive, send)
server-1 | | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
server-1 | | File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
server-1 | | await super().__call__(scope, receive, send)
server-1 | | File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
server-1 | | await self.middleware_stack(scope, receive, send)
server-1 | | File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
server-1 | | raise exc
server-1 | | File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
server-1 | | await self.app(scope, receive, _send)
server-1 | | File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 189, in __call__
server-1 | | with collapse_excgroups():
server-1 | | File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
server-1 | | self.gen.throw(typ, value, traceback)
server-1 | | File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 93, in collapse_excgroups
server-1 | | raise exc
server-1 | | File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 191, in __call__
server-1 | | response = await self.dispatch_func(request, call_next)
server-1 | | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
server-1 | | File "/app/openedai.py", line 126, in log_requests
server-1 | | response = await call_next(request)
server-1 | | ^^^^^^^^^^^^^^^^^^^^^^^^
server-1 | | File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 165, in call_next
server-1 | | raise app_exc
server-1 | | File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 151, in coro
server-1 | | await self.app(scope, receive_or_disconnect, send_no_error)
server-1 | | File "/usr/local/lib/python3.11/site-packages/starlette/middleware/cors.py", line 85, in __call__
server-1 | | await self.app(scope, receive, send)
server-1 | | File "/usr/local/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
server-1 | | await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
server-1 | | File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
server-1 | | raise exc
server-1 | | File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
server-1 | | await app(scope, receive, sender)
server-1 | | File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 756, in __call__
server-1 | | await self.middleware_stack(scope, receive, send)
server-1 | | File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 776, in app
server-1 | | await route.handle(scope, receive, send)
server-1 | | File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 297, in handle
server-1 | | await self.app(scope, receive, send)
server-1 | | File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 77, in app
server-1 | | await wrap_app_handling_exceptions(app, request)(scope, receive, send)
server-1 | | File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
server-1 | | raise exc
server-1 | | File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
server-1 | | await app(scope, receive, sender)
server-1 | | File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 72, in app
server-1 | | response = await func(request)
server-1 | | ^^^^^^^^^^^^^^^^^^^
server-1 | | File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 278, in app
server-1 | | raw_response = await run_endpoint_function(
server-1 | | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
server-1 | | File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
server-1 | | return await dependant.call(**values)
server-1 | | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
server-1 | | File "/app/vision.py", line 87, in vision_chat_completions
server-1 | | text = await vision_qna.chat_with_images(request)
server-1 | | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
server-1 | | File "/app/vision_qna.py", line 116, in chat_with_images
server-1 | | return ''.join([r async for r in self.stream_chat_with_images(request)])
server-1 | | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
server-1 | | File "/app/vision_qna.py", line 116, in <listcomp>
server-1 | | return ''.join([r async for r in self.stream_chat_with_images(request)])
server-1 | | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
server-1 | | File "/app/backend/llavanext.py", line 32, in stream_chat_with_images
server-1 | | images, prompt = await prompt_from_messages(request.messages, self.format)
server-1 | | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
server-1 | | File "/app/vision_qna.py", line 704, in prompt_from_messages
server-1 | | return await known_formats[format](messages)
server-1 | | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
server-1 | | File "/app/vision_qna.py", line 332, in llama2_prompt_from_messages
server-1 | | images.extend([ await url_to_image(c.image_url.url) ])
server-1 | | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
server-1 | | File "/app/vision_qna.py", line 180, in url_to_image
server-1 | | return Image.open(io.BytesIO(img_data)).convert("RGB")
server-1 | | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
server-1 | | File "/usr/local/lib/python3.11/site-packages/PIL/Image.py", line 3498, in open
server-1 | | raise UnidentifiedImageError(msg)
server-1 | | PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7a6e17b97920>
server-1 | +------------------------------------
server-1 |
server-1 | During handling of the above exception, another exception occurred:
server-1 |
server-1 | Traceback (most recent call last):
server-1 | File "/usr/local/lib/python3.11/site-packages/uvicorn/protocols/http/httptools_impl.py", line 399, in run_asgi
server-1 | result = await app( # type: ignore[func-returns-value]
server-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
server-1 | File "/usr/local/lib/python3.11/site-packages/uvicorn/middleware/proxy_headers.py", line 70, in __call__
server-1 | return await self.app(scope, receive, send)
server-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
server-1 | File "/usr/local/lib/python3.11/site-packages/fastapi/applications.py", line 1054, in __call__
server-1 | await super().__call__(scope, receive, send)
server-1 | File "/usr/local/lib/python3.11/site-packages/starlette/applications.py", line 123, in __call__
server-1 | await self.middleware_stack(scope, receive, send)
server-1 | File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 186, in __call__
server-1 | raise exc
server-1 | File "/usr/local/lib/python3.11/site-packages/starlette/middleware/errors.py", line 164, in __call__
server-1 | await self.app(scope, receive, _send)
server-1 | File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 189, in __call__
server-1 | with collapse_excgroups():
server-1 | File "/usr/local/lib/python3.11/contextlib.py", line 158, in __exit__
server-1 | self.gen.throw(typ, value, traceback)
server-1 | File "/usr/local/lib/python3.11/site-packages/starlette/_utils.py", line 93, in collapse_excgroups
server-1 | raise exc
server-1 | File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 191, in __call__
server-1 | response = await self.dispatch_func(request, call_next)
server-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
server-1 | File "/app/openedai.py", line 126, in log_requests
server-1 | response = await call_next(request)
server-1 | ^^^^^^^^^^^^^^^^^^^^^^^^
server-1 | File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 165, in call_next
server-1 | raise app_exc
server-1 | File "/usr/local/lib/python3.11/site-packages/starlette/middleware/base.py", line 151, in coro
server-1 | await self.app(scope, receive_or_disconnect, send_no_error)
server-1 | File "/usr/local/lib/python3.11/site-packages/starlette/middleware/cors.py", line 85, in __call__
server-1 | await self.app(scope, receive, send)
server-1 | File "/usr/local/lib/python3.11/site-packages/starlette/middleware/exceptions.py", line 65, in __call__
server-1 | await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
server-1 | File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
server-1 | raise exc
server-1 | File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
server-1 | await app(scope, receive, sender)
server-1 | File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 756, in __call__
server-1 | await self.middleware_stack(scope, receive, send)
server-1 | File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 776, in app
server-1 | await route.handle(scope, receive, send)
server-1 | File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 297, in handle
server-1 | await self.app(scope, receive, send)
server-1 | File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 77, in app
server-1 | await wrap_app_handling_exceptions(app, request)(scope, receive, send)
server-1 | File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 64, in wrapped_app
server-1 | raise exc
server-1 | File "/usr/local/lib/python3.11/site-packages/starlette/_exception_handler.py", line 53, in wrapped_app
server-1 | await app(scope, receive, sender)
server-1 | File "/usr/local/lib/python3.11/site-packages/starlette/routing.py", line 72, in app
server-1 | response = await func(request)
server-1 | ^^^^^^^^^^^^^^^^^^^
server-1 | File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 278, in app
server-1 | raw_response = await run_endpoint_function(
server-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
server-1 | File "/usr/local/lib/python3.11/site-packages/fastapi/routing.py", line 191, in run_endpoint_function
server-1 | return await dependant.call(**values)
server-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
server-1 | File "/app/vision.py", line 87, in vision_chat_completions
server-1 | text = await vision_qna.chat_with_images(request)
server-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
server-1 | File "/app/vision_qna.py", line 116, in chat_with_images
server-1 | return ''.join([r async for r in self.stream_chat_with_images(request)])
server-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
server-1 | File "/app/vision_qna.py", line 116, in <listcomp>
server-1 | return ''.join([r async for r in self.stream_chat_with_images(request)])
server-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
server-1 | File "/app/backend/llavanext.py", line 32, in stream_chat_with_images
server-1 | images, prompt = await prompt_from_messages(request.messages, self.format)
server-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
server-1 | File "/app/vision_qna.py", line 704, in prompt_from_messages
server-1 | return await known_formats[format](messages)
server-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
server-1 | File "/app/vision_qna.py", line 332, in llama2_prompt_from_messages
server-1 | images.extend([ await url_to_image(c.image_url.url) ])
server-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
server-1 | File "/app/vision_qna.py", line 180, in url_to_image
server-1 | return Image.open(io.BytesIO(img_data)).convert("RGB")
server-1 | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
server-1 | File "/usr/local/lib/python3.11/site-packages/PIL/Image.py", line 3498, in open
server-1 | raise UnidentifiedImageError(msg)
server-1 | PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7a6e17b97920>
server-1 | INFO: 192.168.64.111:37110 - "POST /v1/chat/completions HTTP/1.1" 500 Internal Server Error
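For the record, the PIL.UnidentifiedImageError at the bottom of that trace means Pillow was handed bytes it could not recognize as an image, which usually means the host's URL fetch brought back something other than the JPEG (an HTML error page, a rate-limit response, or a truncated body). That would also explain why the same URL renders fine in a browser. A hardened fetch along the following lines would surface the mismatch; this is a hypothetical sketch, not openedai-vision's actual url_to_image:

```python
# Hypothetical hardened url_to_image() (illustrative only, not the code in
# /app/vision_qna.py): fail with a clear message when the response is not
# image bytes, instead of letting Pillow raise UnidentifiedImageError later.
import io

import requests
from PIL import Image

def url_to_image(url: str) -> Image.Image:
    r = requests.get(
        url,
        headers={"User-Agent": "vision-host-debug/0.1"},  # assumption: some CDNs reject empty UAs
        timeout=30,
    )
    r.raise_for_status()
    ctype = r.headers.get("content-type", "")
    if not ctype.startswith("image/"):
        raise ValueError(f"expected image bytes from {url}, got {ctype!r}: {r.content[:120]!r}")
    return Image.open(io.BytesIO(r.content)).convert("RGB")
```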
@allanbunch Thanks for this project. I am going to close the issue. Late this evening I successfully validated this Node-RED node with several locally hosted OpenAI-compatible multimodal backends, including llama-cpp-python, LM Studio, Ollama, and OpenedAI-vision, with smallish vision models such as Florence-2 and moondream2, among others. I look forward to continued use of this node-red-openai-api node.
@saket424 Thanks for troubleshooting the issue and sharing your findings. I'm sure your effort will benefit the community. Good luck with your AI projects, and thanks for your support.