elastic / kibana

Your window into the Elastic Stack
https://www.elastic.co/products/kibana

Ingest Pipeline UI cannot use Inference API style Inference Processor #178247

Open seanstory opened 6 months ago

seanstory commented 6 months ago

Kibana version: 8.14.0-SNAPSHOT

Elasticsearch version: 8.14.0-SNAPSHOT

Server OS version: OSX 14.3

Original install method (e.g. download page, yum, from source, etc.): source

Describe the bug: When using the UI to create an ingest pipeline that runs inference, you can only select hosted models (such as those uploaded with eland); you cannot use the Inference Endpoints created by the Inference API.

You can create such a pipeline directly with:

PUT _ingest/pipeline/openai
{
  "processors": [
    {
      "inference": {
        "model_id": "openai",
        "input_output": [
            {
                "input_field": "column2",
                "output_field": "text_embedding"
            }
        ]
      }
    }
  ]
}
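
To verify that such a pipeline works end to end, you can simulate it. A minimal sketch, assuming a hypothetical sample document whose column2 field matches the input_field above:

POST _ingest/pipeline/openai/_simulate
{
  "docs": [
    {
      "_source": {
        "column2": "some text to embed"
      }
    }
  ]
}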

Steps to reproduce:

  1. Create an inference endpoint with the Inference API for a third-party service (see the sketch after this list).
  2. Try to create a pipeline for that inference endpoint through the UI: Search Pipelines -> Add ML Inference Pipeline.
  3. Be sad.
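
For step 1, a minimal sketch of creating such an endpoint, assuming the OpenAI text_embedding service; the api_key and model_id values are placeholders:

PUT _inference/text_embedding/openai
{
  "service": "openai",
  "service_settings": {
    "api_key": "<your-api-key>",
    "model_id": "text-embedding-ada-002"
  }
}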

Expected behavior: The UI should let you create either type of inference processor.

elasticmachine commented 6 months ago

Pinging @elastic/platform-deployment-management (Team:Deployment Management)

sabarasaba commented 5 months ago

This wasn't really a bug with Ingest Pipelines, as we treat the model ID simply as a string and don't do any validation on it beyond that. The issue only occurs when creating the pipeline from the Search Pipelines -> Add ML Inference Pipeline UI.

I've created a new issue for adding support for the input_output option in ingest pipelines.