### Describe the bug

An error occurs while following the "Build a custom chat app in Python using the prompt flow SDK" tutorial to build a Python chat app.
### Related command

```shell
az login
python chat.py
```

The script calls `load_dotenv()` and uses `from promptflow.core import Prompty, AzureOpenAIModelConfiguration`.
### Errors

```text
(.venv) (base) user@host MS_ai % python3 chat.py
[2024-05-31 13:43:46 +0900][promptflow.core._prompty_utils][ERROR] - Exception occurs: NotFoundError: Error code: 404 - {'error': {'code': '404', 'message': 'Resource not found'}}
Traceback (most recent call last):
  File "/Users/user/dev/MS_ai/.venv/lib/python3.12/site-packages/promptflow/core/_prompty_utils.py", line 1003, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/user/dev/MS_ai/.venv/lib/python3.12/site-packages/promptflow/core/_flow.py", line 451, in __call__
    response = send_request_to_llm(api_client, self._model.api, params)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/user/dev/MS_ai/.venv/lib/python3.12/site-packages/promptflow/core/_prompty_utils.py", line 197, in send_request_to_llm
    result = client.chat.completions.create(**parameters)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/user/dev/MS_ai/.venv/lib/python3.12/site-packages/openai/_utils/_utils.py", line 277, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/user/dev/MS_ai/.venv/lib/python3.12/site-packages/openai/resources/chat/completions.py", line 590, in create
    return self._post(
           ^^^^^^^^^^^
  File "/Users/user/dev/MS_ai/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1240, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/user/dev/MS_ai/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 921, in request
    return self._request(
           ^^^^^^^^^^^^^^
  File "/Users/user/dev/MS_ai/.venv/lib/python3.12/site-packages/openai/_base_client.py", line 1020, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: Error code: 404 - {'error': {'code': '404', 'message': 'Resource not found'}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/Users/user/dev/MS_ai/chat.py", line 19, in <module>
    result = prompty(
             ^^^^^^^^
  File "/Users/user/dev/MS_ai/.venv/lib/python3.12/site-packages/promptflow/tracing/_trace.py", line 513, in wrapped
    output = func(*args, **kwargs)
             ^^^^^^^^^^^^^^^^^^^^^
  File "/Users/user/dev/MS_ai/.venv/lib/python3.12/site-packages/promptflow/core/_prompty_utils.py", line 1031, in wrapper
    raise WrappedOpenAIError(e)
promptflow.core._errors.WrappedOpenAIError: OpenAI API hits NotFoundError: Error code: 404 - {'error': {'code': '404', 'message': 'Resource not found'}}
```
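A 404 `Resource not found` from Azure OpenAI usually means the request URL is wrong, most often because `azure_endpoint` or the deployment name does not match what is actually deployed. As a quick sanity check, the URL the client ends up calling can be reconstructed and compared against the values shown in the Azure portal. The helper below is a hypothetical sketch (not part of the tutorial or the script above), and the example values are placeholders:

```python
# Hypothetical helper: rebuild the URL the Azure OpenAI client requests,
# so a 404 can be checked against the portal values before running chat.py.
def chat_completions_url(azure_endpoint: str, deployment: str, api_version: str) -> str:
    """Azure OpenAI chat-completions URL for a given deployment."""
    base = azure_endpoint.rstrip("/")
    return (
        f"{base}/openai/deployments/{deployment}"
        f"/chat/completions?api-version={api_version}"
    )

# Example values only; substitute the ones from your .env file.
print(chat_completions_url(
    "https://my-resource.openai.azure.com",
    "gpt-35-turbo",
    "2024-02-01",
))
# → https://my-resource.openai.azure.com/openai/deployments/gpt-35-turbo/chat/completions?api-version=2024-02-01
```

If the printed deployment segment does not match a deployment that exists under that resource, the service returns exactly this 404.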
### Issue script & Debug output

chat.py:

```python
import os
from dotenv import load_dotenv
load_dotenv()

from promptflow.core import Prompty, AzureOpenAIModelConfiguration

# enable promptflow tracing
from promptflow.tracing import start_trace
start_trace()

model_config = AzureOpenAIModelConfiguration(
    azure_deployment=os.getenv("AZURE_OPENAI_DEPLOYMENT_NAME"),
    api_version=os.getenv("AZURE_OPENAI_API_VERSION"),
    azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
    azure_key=os.getenv("AZURE_OPENAI_KEY")
)

prompty = Prompty.load("chat.prompty", model={'configuration': model_config})
result = prompty(
    chat_history=[
        {"role": "user", "content": "Does Azure OpenAI support customer managed keys?"},
        {"role": "assistant", "content": "Yes, customer managed keys are supported by Azure OpenAI."}
    ],
    chat_input="Do other Azure AI services support this too?")
print(result)

# Save chat history
chat_history = result['chat_history']
prompty.save_chat_history("chat.prompty", chat_history)

# Print the result
print(result['response'])
```
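One thing worth double-checking before the request is ever sent: if any of the four environment variables is missing from `.env`, `os.getenv` silently passes `None` into the configuration, and the failure only surfaces later as an unhelpful service error. A minimal fail-fast sketch (the helper and the `REQUIRED` tuple are illustrative, not part of the tutorial):

```python
# Illustrative sanity check: report required settings that are absent or
# empty before building the model configuration.
REQUIRED = (
    "AZURE_OPENAI_DEPLOYMENT_NAME",
    "AZURE_OPENAI_API_VERSION",
    "AZURE_OPENAI_ENDPOINT",
    "AZURE_OPENAI_KEY",
)

def missing_settings(env: dict) -> list:
    """Return the names of required settings that are missing or empty."""
    return [name for name in REQUIRED if not env.get(name)]

# Example: only the endpoint is set, so the other three names are reported.
print(missing_settings({"AZURE_OPENAI_ENDPOINT": "https://my-resource.openai.azure.com"}))
```

In `chat.py` this could run right after `load_dotenv()`, e.g. `missing_settings(os.environ)`, and raise before `Prompty.load` is ever reached.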
### Expected behavior
Yes, other Azure AI services also support various capabilities and features. Some of the Azure AI services include Azure Cognitive Services, Azure Machine Learning, Azure Bot Service, and Azure Databricks. Each of these services offers different AI capabilities and can be used for various use cases. If you have a specific service or capability in mind, feel free to ask for more details.
### Environment Summary

```text
azure-cli                         2.61.0

core                              2.61.0
telemetry                          1.1.0

Dependencies:
msal                              1.28.0
azure-mgmt-resource               23.1.1

Python location '/opt/homebrew/Cellar/azure-cli/2.61.0/libexec/bin/python'
Extensions directory '/Users/user/.azure/cliextensions'

Python (Darwin) 3.11.9 (main, Apr 2 2024, 08:25:04) [Clang 15.0.0 (clang-1500.3.9.4)]

Legal docs and information: aka.ms/AzureCliLegal
```
### Additional context

Azure resources are deployed. The backend Azure AI service can be tested successfully in the Chat playground.