Pratekh opened this issue 5 months ago
@Pratekh : can you provide more details on the config (e.g., the content of config.yml)? This is probably a configuration issue.
%%writefile config/config.yml
type: main
engine: azure
model: gpt-35-turbo-16k
parameters:
  azure_endpoint: https://abc.openai.azure.com/
  api_version: 2023-07-01-preview
  deployment_name: abc
  api_key: 00000000000000000000000
How can I fix this issue?
Any update on this issue? We are also using Azure OpenAI and getting the same error while running the nemoguardrails server.
@rohitk-cognizant : to debug this, can you first get a working snippet where you initialize the LLM separately, in a "pure LangChain way"?
model = AzureOpenAI(...)
print(model.invoke("test prompt"))
And then try to pass that model directly to the LLMRails instance:
rails = LLMRails(config=config, llm=model)
response = rails.generate(messages=[{
"role": "user",
"content": "What is the capital of France?"
}])
print(response["content"])
If this works, then I can point you to where exactly you can check how the AzureOpenAI
engine is initialized and we can check the difference in parameters.
response = rails.generate(messages=[{
    "role": "user",
    "content": "What is the capital of France?"
}])
print(response["content"])
WARNING: nemoguardrails.actions.action_dispatcher:Error while execution generate_user_intent: Error code: 404 - {'error': {'code': '404', 'message': 'Resource not found'}}
I'm sorry, an internal error has occurred.
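A 404 "Resource not found" from Azure OpenAI typically means the request URL does not resolve to an existing deployment: the service builds the URL from the endpoint, the deployment name, and the API version, so a mismatch in any of these three config fields produces exactly this error. As a minimal sketch of why those fields matter (the `build_azure_url` helper and the values below are illustrative placeholders, not part of NeMo Guardrails or LangChain):

```python
# Sketch: how an Azure OpenAI chat-completions request URL is assembled
# from the fields in config.yml. If deployment_name or api_version do not
# exist on the Azure resource, the service answers 404 "Resource not found".
def build_azure_url(azure_endpoint: str, deployment_name: str, api_version: str) -> str:
    # Normalize the endpoint so a trailing slash does not double up.
    base = azure_endpoint.rstrip("/")
    return (
        f"{base}/openai/deployments/{deployment_name}"
        f"/chat/completions?api-version={api_version}"
    )

url = build_azure_url(
    "https://abc.openai.azure.com/",  # placeholder endpoint from the config above
    "abc",                            # placeholder deployment name
    "2023-07-01-preview",
)
print(url)
# → https://abc.openai.azure.com/openai/deployments/abc/chat/completions?api-version=2023-07-01-preview
```

Comparing this composed URL against the deployment listed in the Azure portal is a quick way to spot which of the three fields is wrong.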