Describe the bug:
Inference endpoint does not seem to get created during KB setup, even though ELSER is deployed. This affects the workflow of creating a custom KB index entry via PDF upload. The PDF upload workflow asks for an Inference Service value, which should be elser_2 after KB setup.
Kibana/Elasticsearch Stack version:
8.16 BC1
Steps to reproduce:
Go to Stack Management -> AI Assistants -> Security AI Assistant -> Knowledge Base on a brand new Cloud Deployment.
Click on "Setup knowledge base"
Navigate to Home -> Upload file
Follow the "Advanced" steps and then click "Add additional field" -> "Add semantic text field"
Notice that there is an Inference Service dropdown, but it is not populated with any values. It should be populated with the ELSER model.
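One way to confirm whether KB setup actually created the endpoint is to list the cluster's inference endpoints from Kibana Dev Tools. The sketch below is a workaround assumption, not a confirmed fix: the endpoint ID `elser_2` and the default service settings are what KB setup is expected to create, but the exact values may differ on a given deployment.

```
# List all inference endpoints; elser_2 should appear after KB setup
GET _inference/_all

# If elser_2 is missing, creating it manually may unblock the PDF upload
# workflow (assumes default ELSER settings; adjust allocations/threads as needed)
PUT _inference/sparse_embedding/elser_2
{
  "service": "elser",
  "service_settings": {
    "num_allocations": 1,
    "num_threads": 1
  }
}
```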
Screenshots (if relevant):