Open · alimoezzi opened this issue 2 months ago
Hi @alimoezzi, thank you for opening this issue. Can you describe and share how are you running the elasticsearch stack? Are you using the Helm Chart? Do you run the containers directly? Please, give us some more details so we can try to reproduce the issue on our side.
I have tested both the containers and the Helm chart, and neither works.
To reproduce the issue, create a pipeline with an inference processor. Note that this processor is also missing from the cluster info.
I think it is related to ML being disabled by default in both the container and the Helm chart. I tried to enable it, but then the cluster doesn't start. Because of that, I'm fairly sure the Bitnami build is the cause, as the official build works with no problem.
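For context, this is roughly how one would attempt to enable ML in each deployment. This is a sketch, not a verified recipe: the extra-config mount path for the container and the chart's `extraConfig` value are assumptions that should be checked against the Bitnami docs for your version.

```shell
# Container: mount an extra config file that sets xpack.ml.enabled
# (the /opt/bitnami/.../my_elasticsearch.yml merge path is an assumption).
cat > my_elasticsearch.yml <<'EOF'
xpack.ml.enabled: true
EOF
docker run -d --name elasticsearch \
  -v "$PWD/my_elasticsearch.yml:/opt/bitnami/elasticsearch/config/my_elasticsearch.yml:ro" \
  bitnami/elasticsearch:8.15.0

# Helm chart: pass the same setting through a values file,
# assuming the chart exposes an extraConfig value for elasticsearch.yml entries.
cat > values.yaml <<'EOF'
extraConfig:
  xpack.ml.enabled: true
EOF
helm install elasticsearch bitnami/elasticsearch -f values.yaml
```

With either variant, the report above is that the cluster then fails to start, whereas the official image runs fine with the same setting.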
Hi @alimoezzi,
Sorry for the delay in getting back to you. Can you provide us with more information on how to reproduce the issue and how to enable the ML features so we can test it on our side? According to the official docs linked below, you first have to download and install your own ML model into the stack, and then create the inference-type processor using the installed model.
Did you find any error following those steps? I have found the "Inference" processor type under the "Stack Management" > "Ingest pipelines" > "Create pipeline" > "Add a processor" menu
> I have found the "Inference" processor type under the "Stack Management" > "Ingest pipelines" > "Create pipeline" > "Add a processor" menu
Exactly. At this stage, continue by clicking 'Add Processor' and then create the pipeline; you will get the same error.
This feature is not enabled by adding a model. Instead, it is enabled via the feature flag (elasticsearch_conf_set xpack.ml.enabled "true"), which Bitnami has disabled by default without any notice.
@alimoezzi thanks for the detailed information. I have found a related configuration in our initialization logic
According to it, we found some issues when enabling it by default in certain scenarios, but you can customize the setting on your side via an initialization script:
https://github.com/bitnami/containers/tree/main/bitnami/elasticsearch#initializing-a-new-instance
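As a concrete sketch, such an initialization script could call the elasticsearch_conf_set helper mentioned earlier. The script name and the libelasticsearch.sh load path below are assumptions; check the linked README for the exact conventions of your image version.

```shell
#!/bin/bash
# Hypothetical init script, e.g. mounted into the container's
# initialization-scripts directory as enable-ml.sh.
# elasticsearch_conf_set is the helper from Bitnami's libelasticsearch.sh;
# the source path below is an assumption.
. /opt/bitnami/scripts/libelasticsearch.sh

# Flip the flag that Bitnami disables by default.
elasticsearch_conf_set xpack.ml.enabled "true"
```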
The bigger problem is that xpack-ml is broken in the Bitnami build. Enable the option and you will see the instance throw an error on startup.
@alimoezzi thanks for your message. I have created an internal task to further investigate this issue. We will keep you posted.
Name and Version
bitnami/elasticsearch:8.15.0
What architecture are you using?
amd64
What steps will reproduce the bug?
Although the inference processor is available in Kibana, creating an ingest pipeline with an inference processor results in the following error:
This processor works with no problem with docker.elastic.co/elasticsearch/elasticsearch:8.15.0
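The failing step can be reduced to a single request against each image. This is a sketch of the repro; "my-model" is a placeholder for whatever model is installed in the cluster, and the pipeline name is arbitrary.

```shell
# Minimal repro sketch: create an ingest pipeline with an inference processor
# against a locally running cluster. The same request reportedly succeeds on
# docker.elastic.co/elasticsearch/elasticsearch:8.15.0 and fails on
# bitnami/elasticsearch:8.15.0.
curl -X PUT "localhost:9200/_ingest/pipeline/inference-test" \
  -H 'Content-Type: application/json' \
  -d '{
    "processors": [
      { "inference": { "model_id": "my-model" } }
    ]
  }'
```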
What is the expected behavior?
The Bitnami build should include this processor, since the official image contains it in both Kibana and Elasticsearch.
What do you see instead?
Kibana includes the option, but upon saving, this error is raised:
Additional information
Checked 8.15.1 and the issue still persists.