I see. I think the issue is that the API base is not flowing through to LiteLLM.
Can you double-check the documentation on configuring LLMs and make sure you are following all the steps for Azure: https://r2r-docs.sciphi.ai/documentation/configuration/llm
If that does not resolve it for you, we can look into replicating on our end.
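In the meantime, a quick sanity check along these lines can confirm whether the Azure settings reach LiteLLM at all. This is only a sketch; the deployment names, endpoint, and API version below are placeholders, not values from your setup:

```python
# Minimal direct LiteLLM check for Azure OpenAI (sketch; all values are placeholders).
import os
import litellm

# LiteLLM reads these environment variables for "azure/" models.
os.environ["AZURE_API_KEY"] = "<your-azure-api-key>"
os.environ["AZURE_API_BASE"] = "https://<your-resource>.openai.azure.com"
os.environ["AZURE_API_VERSION"] = "2023-05-15"

# Completion against an Azure deployment (note the "azure/" prefix on the model name).
resp = litellm.completion(
    model="azure/<your-chat-deployment-name>",
    messages=[{"role": "user", "content": "ping"}],
)
print(resp.choices[0].message.content)

# Embedding against an Azure deployment.
emb = litellm.embedding(
    model="azure/<your-embedding-deployment-name>",
    input=["ping"],
)
print(emb.data[0])
```

If these calls work but the same values fail inside R2R, that would point to the api_base not being passed through rather than an Azure-side problem.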
Thank you for your response. I have tried that in a few ways and none of them worked. I would appreciate it if you could replicate it as well. Thanks again.
r2r cannot work with Azure Embedding

I am trying to use Azure OpenAI models for both completion and embedding in a full deployment using Docker.
To Reproduce

Here is my file:
Here is my command:
Expected behavior

The Docker deployment works perfectly and I can upload a .pdf file. However, the workflow fails. When I inspect the Docker logs, this is what I get:
I should mention that the exact same configuration works perfectly with LiteLLM directly, for both embedding and completion.
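For reference, the direct LiteLLM calls I tested look roughly like this (a sketch with placeholder names and endpoint, not my exact script), with the Azure settings passed explicitly:

```python
# Sketch of the direct LiteLLM calls that succeed with the same Azure values.
# The deployment names, endpoint, and API version are placeholders.
import litellm

AZURE_API_BASE = "https://<my-resource>.openai.azure.com"
AZURE_API_KEY = "<my-azure-api-key>"
AZURE_API_VERSION = "2023-05-15"

# Embedding call with the Azure settings passed explicitly.
emb = litellm.embedding(
    model="azure/<my-embedding-deployment>",
    input=["hello world"],
    api_base=AZURE_API_BASE,
    api_key=AZURE_API_KEY,
    api_version=AZURE_API_VERSION,
)
print(emb.data[0])

# Completion call with the same settings.
chat = litellm.completion(
    model="azure/<my-chat-deployment>",
    messages=[{"role": "user", "content": "hello"}],
    api_base=AZURE_API_BASE,
    api_key=AZURE_API_KEY,
    api_version=AZURE_API_VERSION,
)
print(chat.choices[0].message.content)
```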
I am on a MacBook Pro.
I would appreciate your help. Thanks.