Closed · sonic182 closed this 1 week ago
From your notebook:

connector_res = do_request("POST", "/_plugins/_ml/connectors/_create",
    {
        "name": "Local app connector",
        "description": "The connector",
        "version": 1,
        "protocol": "http",
        "actions": [
            {
                "action_type": "predict",
                "method": "POST",
                "url": "http://ai_inference:8080/invocations",
                "headers": {
                    "content-type": "application/json"
                },
                "post_process_function": "connector.post_process.default.embedding",
                "request_body": "{ \"text\": ${parameters.input} }"
            }
        ]
    })
The connector is using http://ai_inference:8080/invocations. @sonic182 Can you confirm this URL is correct and can be invoked inside your OpenSearch docker?
Hi @ylwu-amzn
Yes, the service is running a Python HTTP server with a model on port 8080, in the same docker network.
Snippet of the service in docker:

services:
  ai_inference:
    build: .
    volumes:
      - ./inference_app/:/opt/app/
      - models:/models
It doesn't receive any requests
Can you verify whether you can call the URL http://ai_inference:8080/invocations directly inside your OpenSearch docker with curl?
@sonic182 This is a JDK bug: it doesn't support underscores in host names (https://bugs.openjdk.org/browse/JDK-8221675). If possible, please change your host name to one without an underscore; that should solve this.
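To illustrate the JDK behavior behind this: `java.net.URI` does not accept an underscore in a server-based authority, so it silently falls back to a registry-based authority and `getHost()` returns null, which is what downstream code (e.g. an HTTP client that requires a host) then trips over. A minimal demonstration:

```java
import java.net.URI;

public class UnderscoreHost {
    public static void main(String[] args) {
        // A hostname containing an underscore is not a valid server-based
        // authority for java.net.URI, so the parser falls back to a
        // registry-based authority and getHost() returns null (JDK-8221675).
        URI bad = URI.create("http://ai_inference:8080/invocations");
        System.out.println(bad.getHost());   // null

        // With a dash instead of an underscore, the host parses normally.
        URI good = URI.create("http://ai-inference:8080/invocations");
        System.out.println(good.getHost());  // ai-inference
    }
}
```

This also explains why renaming the docker service fixes the issue: compose service names become DNS names, and a dash makes the name a valid hostname from the JDK's point of view.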
Thanks @zane-neo, now it works. I've just changed the underscore to a dash: "ai-inference" :+1:
Could this be fixed inside OpenSearch, maybe by using a custom DNS resolver?
Well, the error is not in DNS; it's in the URL class.
Maybe the workaround mentioned in the bug could be applied here: https://github.com/opensearch-project/ml-commons/blob/0b9708b831f87ac6eb35b5c3039c61afe9425d26/ml-algorithms/src/main/java/org/opensearch/ml/engine/algorithms/remote/ConnectorUtils.java#L295
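For reference, a minimal sketch of the kind of fallback the bug report discusses (a hypothetical helper, not actual ml-commons code): when `getHost()` comes back null, recover the host from the raw authority string instead of failing.

```java
import java.net.URI;

public class HostFallback {
    // Hypothetical helper: if URI.getHost() is null (e.g. an underscore in
    // the hostname, JDK-8221675), split the raw authority "host:port"
    // manually. A real implementation would also need to handle IPv6
    // literals and userinfo, which this sketch ignores.
    static String hostOf(URI uri) {
        if (uri.getHost() != null) {
            return uri.getHost();
        }
        String authority = uri.getAuthority();  // e.g. "ai_inference:8080"
        if (authority == null) {
            return null;
        }
        int colon = authority.lastIndexOf(':');
        return colon >= 0 ? authority.substring(0, colon) : authority;
    }

    public static void main(String[] args) {
        System.out.println(hostOf(URI.create("http://ai_inference:8080/invocations")));
    }
}
```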
@sonic182 In OpenSearch we're using the AWS SDK client, which accepts a URI as a parameter, so we're not going to implement the workaround.
Closing this issue since this has been resolved.
What is the bug?
I'm trying to use an external model for text embeddings and I'm getting "host must not be null".
How can one reproduce the bug?
Please see this jupyter notebook: https://github.com/sonic182/sample_os_error/blob/main/opensearch_remote_model.ipynb
reference os server (docker-compose): https://github.com/sonic182/sample_os_error/blob/main/docker-compose.yml
What is the expected behavior?
The host should be detected based on the connector's URL.
What is your host/environment?
Do you have any screenshots? A stack trace:
Do you have any additional context?
The "ai_inference" host is another docker container in the same network, for local testing...