microsoft / promptflow

Build high-quality LLM apps - from prototyping, testing to production deployment and monitoring.
https://microsoft.github.io/promptflow/
MIT License

stdout maxBuffer length exceeded [VSCode Extension] #1357

Closed anderl80 closed 9 months ago

anderl80 commented 9 months ago

Describe the bug I created a very simple flow that categorizes emails as spam or not spam. The flow runs fine in test mode; however, when I run a batch with a JSONL input file, the batch run is not shown. When I click refresh, an error pops up: stdout maxBuffer length exceeded.

How To Reproduce the bug See above

Screenshots (attached images omitted)

Environment Information

Additional context

devcontainer.json

{
    "name": "Python 3.11",
    "build": {
        "dockerfile": "Dockerfile",
        "context": "..",
        "args": { "VARIANT": "3.11" }
    },
    "features": {
        "ghcr.io/devcontainers/features/azure-cli:1": {},
        "ghcr.io/devcontainers/features/git:1": {}
    },
    "customizations": {
        "vscode": {
            "extensions": [
                "prompt-flow.prompt-flow",
                "GitHub.copilot-chat",
                "GitHub.copilot",
                "ms-azuretools.vscode-docker"
            ]
        }
    }
}

Dockerfile

FROM python:3.11-slim

# Make Python 3.11 the default
RUN update-alternatives --install /usr/bin/python python /usr/local/bin/python3.11 1

# Install pip for Python 3.11
RUN python -m ensurepip
RUN python -m pip install --upgrade pip

# Copy requirements.txt and install the packages
COPY requirements.txt .
RUN pip install -r requirements.txt

# Prioritize Python 3.11.6 in the PATH
ENV PATH="/path/to/python3.11.6/bin:${PATH}"

# Install Zsh and Oh My Zsh
RUN apt-get update && apt-get install -y zsh curl git
RUN apt-get install -y fonts-powerline
RUN sh -c "$(curl -fsSL https://raw.githubusercontent.com/ohmyzsh/ohmyzsh/master/tools/install.sh)" "" --unattended

# Install autocomplete and syntax-highlighting plugins
RUN git clone https://github.com/zsh-users/zsh-autosuggestions ${ZSH_CUSTOM:-~/.oh-my-zsh/custom}/plugins/zsh-autosuggestions
RUN git clone https://github.com/zsh-users/zsh-syntax-highlighting.git ${ZSH_CUSTOM:-~/.oh-my-zsh/custom}/plugins/zsh-syntax-highlighting
RUN git clone https://github.com/zsh-users/zsh-completions ${ZSH_CUSTOM:-~/.oh-my-zsh/custom}/plugins/zsh-completions

# Copy .zshrc to the root user's home directory
COPY .zshrc /root/

# Set Zsh as the default shell
ENV SHELL /bin/zsh

PF Output

The original output is one long escaped JSON string, truncated at the start; it is shown here unescaped and deduplicated for readability. The batch run reports the identical error for input lines 4154, 4195, 4199, and 4269, and the log ends with "stderr": "". Each failed line carries an entry of this shape:

{
  "line number": 4154,
  "error": {
    "message": "OpenAI API hits InvalidRequestError: The response was filtered due to the prompt triggering Azure OpenAI's content management policy. Please modify your prompt and retry. To learn more about our content filtering policies please read our documentation: https://go.microsoft.com/fwlink/?linkid=2198766 [Error reference: https://platform.openai.com/docs/guides/error-codes/api-errors]",
    "messageFormat": "",
    "messageParameters": {},
    "referenceCode": "ErrorTarget.TOOL/promptflow.tools.aoai",
    "code": "UserError",
    "innerError": {
      "code": "OpenAIError",
      "innerError": {
        "code": "InvalidRequestError",
        "innerError": null
      }
    },
    "debugInfo": {
      "type": "WrappedOpenAIError",
      "message": "(same message as above)",
      "stackTrace": "(see below)",
      "innerException": {
        "type": "InvalidRequestError",
        "message": "The response was filtered due to the prompt triggering Azure OpenAI's content management policy. Please modify your prompt and retry. To learn more about our content filtering policies please read our documentation: https://go.microsoft.com/fwlink/?linkid=2198766",
        "stackTrace": "(see below)",
        "innerException": null
      }
    }
  }
}

Every entry carries the same pair of stack traces:

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/promptflow/tools/common.py", line 153, in wrapper
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.11/site-packages/promptflow/tools/aoai.py", line 145, in chat
    completion = openai.ChatCompletion.create(**{**self._connection_dict, **params})
  File "/usr/local/lib/python3.11/site-packages/promptflow/_core/openai_injector.py", line 91, in wrapper
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.11/site-packages/promptflow/_core/openai_injector.py", line 45, in wrapped_method
    result = f(*args, **kwargs)
  File "/usr/local/lib/python3.11/site-packages/openai/api_resources/chat_completion.py", line 25, in create
    return super().create(*args, **kwargs)
  File "/usr/local/lib/python3.11/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 153, in create
    response, _, api_key = requestor.request(
  File "/usr/local/lib/python3.11/site-packages/openai/api_requestor.py", line 298, in request
    resp, got_stream = self._interpret_response(result, stream)
  File "/usr/local/lib/python3.11/site-packages/openai/api_requestor.py", line 700, in _interpret_response
    self._interpret_response_line(
  File "/usr/local/lib/python3.11/site-packages/openai/api_requestor.py", line 765, in _interpret_response_line
    raise self.handle_error_response(

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.11/site-packages/promptflow/executor/flow_executor.py", line 886, in _exec
    output, nodes_outputs = self._traverse_nodes(inputs, context)
  File "/usr/local/lib/python3.11/site-packages/promptflow/executor/flow_executor.py", line 962, in _traverse_nodes
    nodes_outputs, bypassed_nodes = self._submit_to_scheduler(context, inputs, batch_nodes)
  File "/usr/local/lib/python3.11/site-packages/promptflow/executor/flow_executor.py", line 982, in _submit_to_scheduler
    return FlowNodesScheduler(self._tools_manager, inputs, nodes, self._node_concurrency, context).execute()
  File "/usr/local/lib/python3.11/site-packages/promptflow/executor/_flow_nodes_scheduler.py", line 67, in execute
    raise e
  File "/usr/local/lib/python3.11/site-packages/promptflow/executor/_flow_nodes_scheduler.py", line 56, in execute
    self._dag_manager.complete_nodes(self._collect_outputs(completed_futures))
  File "/usr/local/lib/python3.11/site-packages/promptflow/executor/_flow_nodes_scheduler.py", line 87, in _collect_outputs
    each_node_result = each_future.result()
  File "/usr/local/lib/python3.11/concurrent/futures/_base.py", line 449, in result
    return self.__get_result()
  File "/usr/local/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
    raise self._exception
  File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
  File "/usr/local/lib/python3.11/site-packages/promptflow/executor/_flow_nodes_scheduler.py", line 116, in _exec_single_node_in_thread
    result = f(**kwargs)
  File "/usr/local/lib/python3.11/site-packages/promptflow/_core/tool.py", line 77, in new_f
    return tool_invoker.invoke_tool(func, *args, **kwargs)
  File "/usr/local/lib/python3.11/site-packages/promptflow/executor/_tool_invoker.py", line 19, in invoke_tool
    return cur_flow.invoke_tool_with_cache(f, argnames, args, kwargs)
  File "/usr/local/lib/python3.11/site-packages/promptflow/_core/flow_execution_context.py", line 126, in invoke_tool_with_cache
    result = self.invoke_tool(f, args, kwargs)
  File "/usr/local/lib/python3.11/site-packages/promptflow/_core/flow_execution_context.py", line 170, in invoke_tool
    raise e
  File "/usr/local/lib/python3.11/site-packages/promptflow/_core/flow_execution_context.py", line 164, in invoke_tool
    return f(*args, **kwargs)
  File "/usr/local/lib/python3.11/site-packages/promptflow/tools/common.py", line 194, in wrapper
    raise WrappedOpenAIError(e)
JYC-99 commented 9 months ago

Fixed with extension version 1.7.0.