Closed: ZhiliangWu closed this issue 2 months ago.
Hi @ZhiliangWu - thanks for reaching out. Can you confirm which tool you are using for accessing the AzureML model catalog endpoint? The "LLM" tool is currently only used for OpenAI and Azure OpenAI connections. To use AzureML model catalog endpoints you can use the Open_Model_LLM Tool (found under "+ More tools").
To set expectations, the Custom Connection type is generic across different tools. For the Open Model LLM tool, the connection needs the following values:
- `endpoint_url`
- `model_family`
- `endpoint_api_key` (as a secret)
Please give it a try and let me know.
The error `The API 'None.None' is not found` is raised because no matching API is found for the custom connection. The message is vague, and we will improve it. As for the LLM node, as Gerard said, we don't support custom connections there for now.
@gjwoods thanks for your quick reply. I also checked the related docs at https://microsoft.github.io/promptflow/reference/tools-reference/open_model_llm_tool.html#open-model-llm. The following works to create the custom connection with `pf connection create -f custom_connection.yaml`:
```yaml
$schema: https://azuremlschemas.azureedge.net/promptflow/latest/CustomConnection.schema.json
name: custom_connection
type: custom
configs:
  endpoint_url: <url>
  model_family: GPT2
secrets:
  endpoint_api_key: <key>
```
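As a quick sanity check before running `pf connection create`, the required key names (taken from the YAML above) can be verified programmatically. This helper is hypothetical and not part of promptflow; it just catches typos in the config keys:

```python
# Hypothetical sanity check for the Open Model LLM custom connection.
# The required key names come from the connection YAML above; this
# helper is NOT part of promptflow.
REQUIRED_CONFIGS = {"endpoint_url", "model_family"}
REQUIRED_SECRETS = {"endpoint_api_key"}

def missing_keys(configs, secrets):
    """Return the set of required keys absent from the connection."""
    return (REQUIRED_CONFIGS - set(configs)) | (REQUIRED_SECRETS - set(secrets))

configs = {"endpoint_url": "<url>", "model_family": "GPT2"}
secrets = {"endpoint_api_key": "<key>"}
print(missing_keys(configs, secrets))  # -> set()
```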
Meanwhile, I ran into issues running it in a node of type `Open_Model_LLM` with the following configuration:
```yaml
- name: Open_Model_LLM
  type: custom_llm
  source:
    type: package_with_prompt
    tool: promptflow.tools.open_model_llm.OpenModelLLM.call
    path: Open_Model_LLM.jinja2
  inputs:
    api: completion
    instruction: ${inputs.instruction}
    input: ${inputs.input}
    endpoint_name: custom_connection  # not sure whether this is correct
```
The error says:

```text
2024-01-17 23:09:20 +0100 20192 execution.flow INFO [Open_Model_LLM in line 0 (index starts from 0)] stdout> Executing Open Model LLM Tool for endpoint: 'custom_connection', deployment: 'None'
2024-01-17 23:09:20 +0100 20192 execution ERROR Node Open_Model_LLM in line 0 failed. Exception: Execution failure in 'Open_Model_LLM': (IndexError) list index out of range.
...
  in parse_endpoint_connection_type
    return (endpoint_connection_details[0].lower(), endpoint_connection_details[1])
IndexError: list index out of range
```
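For what it's worth, the traceback shows `parse_endpoint_connection_type` splitting the endpoint name and indexing element 1, so a bare name with no separator yields a one-element list and the IndexError. A minimal sketch of that failure mode, assuming a `/`-separated `type/name` format (the tool's actual separator may differ):

```python
def parse_endpoint_connection_type(endpoint_name):
    # Mimics the failing line from the traceback: split the endpoint
    # name and index both parts. The "/" separator is an assumption.
    details = endpoint_name.split("/")
    return (details[0].lower(), details[1])

# A two-part value parses fine:
print(parse_endpoint_connection_type("Connection/custom_connection"))
# -> ('connection', 'custom_connection')

# A bare connection name reproduces the reported failure:
try:
    parse_endpoint_connection_type("custom_connection")
except IndexError as err:
    print("IndexError:", err)  # "list index out of range"
```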
Do you know if I need to do any further steps to configure this tool?
@brynn-code Comments on the feedback above would be appreciated!
@gjwoods
Hi, we're sending this friendly reminder because we haven't heard back from you in 30 days. We need more information about this issue to help address it. Please be sure to give us your input. If we don't hear back from you within 7 days of this comment, the issue will be automatically closed. Thank you!
@gjwoods
I would like to keep this issue open if possible. I am also seeing the index-out-of-range error with my Open Model LLM node, which is trying to connect to my Llama 2 pay-as-you-go deployed instance. Does anyone have any insights?
As a follow-up, I also removed the connection from my Azure workspace and recreated my Llama connection locally, and I still get the same error.
**Describe the bug**
I developed using Promptflow locally, and it works well with Azure OpenAI connections. However, the custom connection failed. I created an endpoint for gpt2 using one of the models in the AML model catalog. The endpoint deployed successfully, and I filled the example yaml from the docs with the REST endpoint and Primary key from the Consume page. I added it to Promptflow and the run failed with:
**How To Reproduce the bug**
Steps to reproduce the behavior (occurs every time):
1. Fill `custom_connection.yaml` with the values from the deployed endpoint.
2. Run `pf connection create -f custom_connection.yaml`.
**Expected behavior**
It should work like other connections, such as Azure OpenAI connections. This blocks all users from using the available models from the model catalog with promptflow.
**Running Information** (please complete the following information):
- `pf -v`: 1.3.0
- `python --version`: 3.9.10