`messageHistory` is designed to be agnostic of the image encoding; the encoding is handled in `openAIChat`, `azureChat`, and `ollamaChat`.

As for the image encoding, Azure expects messages like this:
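The JSON example appears to be missing here. As a sketch, the shape of such a message in the OpenAI/Azure chat completions format looks like this (the text prompt is illustrative):

```json
{
  "role": "user",
  "content": [
    { "type": "text", "text": "What is in this image?" },
    { "type": "image_url", "image_url": { "url": "<image URL>" } }
  ]
}
```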
where `<image URL>` can be a "normal" URL like `"https://somewhere.tld/filepath.png"` or an inline URL like `"data:image/jpeg;base64,{base64_image}"`.

Experiments suggest that multiple images can go either into the same `"content"` array or into multiple `"user"` messages. With this change, we are sending them in one `"user"` message.

The image encoding for OpenAI is the same as for Azure.
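For the inline form, the image bytes are embedded directly in the URL. A minimal sketch of building such a data URL (shown in Python for illustration; the helper name and default MIME type are assumptions, not part of this change):

```python
import base64

def to_data_url(image_bytes: bytes, mime: str = "image/jpeg") -> str:
    """Build an inline data URL of the form data:<mime>;base64,<payload>."""
    payload = base64.b64encode(image_bytes).decode("ascii")
    return f"data:{mime};base64,{payload}"
```

The resulting string can be used anywhere a "normal" image URL is accepted in the message payload.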
Ollama wants to get images in a different format:
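The example seems to have been cut off here. As a sketch of Ollama's chat format: images are passed as an array of bare base64 strings (no `data:` prefix) in an `"images"` field alongside the text content:

```json
{
  "role": "user",
  "content": "What is in this image?",
  "images": ["{base64_image}"]
}
```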