bitnami / containers

Bitnami container images
https://bitnami.com

[bitnami/elasticsearch] No processor type exists with name [inference] #72437

Open alimoezzi opened 2 months ago

alimoezzi commented 2 months ago

Name and Version

bitnami/elasticsearch:8.15.0

What architecture are you using?

amd64

What steps will reproduce the bug?

Although the inference processor is available in Kibana, creating an ingest pipeline with an inference processor results in the following error:

No processor type exists with name [inference]


This processor works without problems with docker.elastic.co/elasticsearch/elasticsearch:8.15.0.

What is the expected behavior?

The Bitnami build should include this processor, as the official image contains it in both Kibana and Elasticsearch.

What do you see instead?

Kibana includes the option, but upon saving the pipeline this error is raised:

No processor type exists with name [inference]

Additional information

Checked 8.15.1; the issue still persists.

andresbono commented 1 month ago

Hi @alimoezzi, thank you for opening this issue. Can you describe and share how you are running the Elasticsearch stack? Are you using the Helm chart? Do you run the containers directly? Please give us some more details so we can try to reproduce the issue on our side.

alimoezzi commented 1 month ago

I have tested both the containers and the Helm chart, and neither works.

alimoezzi commented 1 month ago

To reproduce the issue, you need to create a pipeline with the inference processor. Also, this processor is missing from the cluster info.
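For reference, the reproduction described above can be sketched with two API calls (a minimal sketch, assuming a local cluster on port 9200 with security disabled; the pipeline name and `model_id` are hypothetical placeholders):

```shell
# Inspect the ingest processor types the node supports. In the
# affected Bitnami build, "inference" is missing from this list.
curl -s 'http://localhost:9200/_nodes/ingest?filter_path=nodes.*.ingest.processors'

# Creating a pipeline with an inference processor then fails with:
# "No processor type exists with name [inference]"
curl -s -X PUT 'http://localhost:9200/_ingest/pipeline/my-inference-pipeline' \
  -H 'Content-Type: application/json' \
  -d '{
    "processors": [
      { "inference": { "model_id": "my-model" } }
    ]
  }'
```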

alimoezzi commented 1 month ago

I think it is related to ML being disabled by default in both the container and the Helm chart. I tried to enable it, but then the cluster doesn't start. Given that the official build works without problems, I think the Bitnami build is clearly the cause.

gongomgra commented 1 month ago

Hi @alimoezzi,

Sorry for the delay in getting back to you. Can you provide us with more information on how to reproduce the issue and how to enable the ML features so we can test it on our side? According to the official docs linked below, you have to download and install your own ML model into the stack first and then create the inference-type processor using the installed model.

https://www.elastic.co/guide/en/elasticsearch/reference/current/ingest-pipeline-search-inference.html

Did you find any error following those steps? I have found the "Inference" processor type under the "Stack Management" > "Ingest pipelines" > "Create pipeline" > "Add a processor" menu.


alimoezzi commented 1 month ago

I have found the "Inference" processor type under the "Stack Management" > "Ingest pipelines" > "Create pipeline" > "Add a processor" menu

Exactly. At this stage, continue by clicking "Add processor" and then create the pipeline; you will get the same error.

alimoezzi commented 1 month ago

This feature is not enabled by adding a model, but by enabling the feature flag with elasticsearch_conf_set xpack.ml.enabled "true", as it has been disabled by Bitnami by default without any notice.

gongomgra commented 3 weeks ago

@alimoezzi thanks for the detailed information. I have found a related configuration in our initialization logic:

https://github.com/bitnami/containers/blob/2de4377adaaa5ba0ba682bc1aa62747cea63bc22/bitnami/elasticsearch/7/debian-12/rootfs/opt/bitnami/scripts/libelasticsearch.sh#L753

According to it, we found some issues when enabling it by default in some scenarios, but you can customize the setting on your side via an initialization script:

https://github.com/bitnami/containers/tree/main/bitnami/elasticsearch#initializing-a-new-instance
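An initialization script along those lines could look like this (a sketch, not a verified fix: the script name is arbitrary, and the library path matches the libelasticsearch.sh link above; note the earlier report that the cluster failed to start with ML enabled):

```shell
#!/bin/bash
# enable-ml.sh -- example init script for the Bitnami image, mounted
# into the init directory described in the linked README.

# Load Bitnami's Elasticsearch helper functions, which provide
# the elasticsearch_conf_set helper used below.
. /opt/bitnami/scripts/libelasticsearch.sh

# Turn the ML feature back on in elasticsearch.yml.
elasticsearch_conf_set xpack.ml.enabled "true"
```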

alimoezzi commented 3 weeks ago

The bigger problem is that the Bitnami build ships a broken xpack-ml. Please enable the option and you will see the instance throw errors.

gongomgra commented 3 weeks ago

@alimoezzi thanks for your message. I have created an internal task to further investigate this issue. We will keep you posted.