Currently, Security Solution uses three different GenAI connector types, which integrate directly with OpenAI, Gemini, and Bedrock.
The long-term vision is to migrate from these explicit connector implementations to the generic `.inference` connector, which is backed by the Elasticsearch Inference API. This connector supports multiple LLM integrations (including OpenAI, Gemini, and Bedrock), but because full adoption of and migration to the `.inference` connector is a large effort, the scope was reduced to an MVP integration that uses the pre-configured connector for EIS (Elastic Inference Service).
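For context, this is roughly the Elasticsearch Inference API call that the `.inference` connector delegates to. A minimal sketch, assuming a `completion` endpoint with the hypothetical id `elastic-default-llm`; the URL and API-key handling are placeholders as well:

```ts
// Sketch only: performs a completion against the Elasticsearch Inference API.
// The endpoint id "elastic-default-llm" is a placeholder, not a real id.
const ES_URL = process.env.ES_URL ?? 'https://localhost:9200';

async function complete(prompt: string): Promise<unknown> {
  const res = await fetch(`${ES_URL}/_inference/completion/elastic-default-llm`, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `ApiKey ${process.env.ES_API_KEY}`, // placeholder auth
    },
    // The Inference API takes the text to run through the model as "input".
    body: JSON.stringify({ input: prompt }),
  });
  if (!res.ok) {
    throw new Error(`Inference request failed: ${res.status}`);
  }
  return res.json();
}
```

The point of the connector is that Kibana consumers never issue this call directly; they go through the actions framework instead.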
In 8.18, the plan is to provide the Elastic Default LLM experience, exposed through the Kibana `.inference` connector type.
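As an illustration of what the pre-configured connector could look like to Kibana consumers, here is a hedged TypeScript sketch; the interface, id, and name are assumptions for this document, not the actual Kibana contract:

```ts
// Hypothetical shape of the pre-configured EIS connector as surfaced by the
// actions framework; all field and value names here are assumptions.
interface PreconfiguredConnector {
  id: string;
  actionTypeId: string;
  name: string;
  isPreconfigured: boolean;
}

const elasticDefaultLlm: PreconfiguredConnector = {
  id: 'elastic-default-llm',  // assumed id
  actionTypeId: '.inference', // the generic inference connector type
  name: 'Elastic Default LLM',
  isPreconfigured: true,      // pre-configured connectors are defined in kibana.yml
};
```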
Requirements:
- add the `.inference` pre-configured connector to the list of available connectors and select/use it by default if the connector selection has not been changed (see the selection sketch after this list)
- prohibit the create/edit/delete experience for `.inference` connectors
- support the connector in the existing LangChain chat models (`ChatOpenAI` or `BaseChatModelParams`-based wrappers)
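A minimal sketch of the first two requirements, under assumed names (`AIConnector` and both functions are hypothetical, not Kibana's actual API): pick the pre-configured `.inference` connector unless the user has chosen another one, and treat `.inference` connectors as read-only.

```ts
// All names here are illustrative; Kibana's real types differ.
interface AIConnector {
  id: string;
  actionTypeId: string;
  isPreconfigured?: boolean;
}

// Requirement 1: respect an explicit user choice, otherwise default to the
// pre-configured .inference (EIS) connector.
function getDefaultConnector(
  connectors: AIConnector[],
  userSelectedId?: string
): AIConnector | undefined {
  if (userSelectedId) {
    const selected = connectors.find((c) => c.id === userSelectedId);
    if (selected) {
      return selected;
    }
  }
  return connectors.find(
    (c) => c.actionTypeId === '.inference' && c.isPreconfigured
  );
}

// Requirement 2: never offer create/edit/delete flows for .inference
// connectors; only the pre-configured one should exist.
function isConnectorEditable(connector: AIConnector): boolean {
  return connector.actionTypeId !== '.inference';
}
```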
GenAI functionality to cover: