Closed: @chishui closed this issue 3 weeks ago.
Thanks, @chishui! Is this being released in 2.15?
Yes. Sorry that I forgot to mention the version.
Hi @chishui , what is the ETA for this doc PR? The first release candidate is today. Thanks!
Hi @chishui, could you please provide a PR for this, since it is labeled for 2.15 and we are approaching the release date?
Thanks.
Since there has been an active discussion about deprecating the parameter we introduced (https://github.com/opensearch-project/OpenSearch/issues/14283), the decision on whether to announce the feature is still pending.
Hi @chishui, please make a decision on this by next Monday/Tuesday, as we need to stay on the timeline.
If you have concerns about including this in 2.15.0, I would suggest removing the 2.15.0 label until the discussion is resolved. That way, this would no longer be a blocker for 2.15.0.
Thanks.
Hi @chishui, I removed the 2.15 label. If something changes, we'll be happy to help get the content updated.
The PR is raised; can we add the 2.15 label back? Thanks!
What do you want to do?

Tell us about your request. Provide a summary of the request and all versions that are affected.

We added a batch ingestion feature; users can now use it to accelerate ingestion when the ingest pipeline contains ML processors (`text_embedding` and `sparse_encoding`) that connect to external ML servers. The documentation changes would include a new parameter in the `_bulk` API, a new paragraph in the ingest processor docs explaining the batch-enabled processors, and a new page introducing this feature.

What other resources are available? Provide links to related issues, POCs, steps for testing, etc.
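As a rough sketch of what the documented `_bulk` change might look like: the snippet below builds an NDJSON bulk body and a request URL that opts into batch ingestion. It assumes the new query parameter is named `batch_size` (per the linked deprecation discussion); the pipeline name `nlp-pipeline` and the index/document values are hypothetical, and no real cluster is contacted.

```python
import json

def bulk_request(docs, index, pipeline, batch_size):
    """Return (url, ndjson_body) for a hypothetical batched _bulk call.

    Assumption: the batch ingestion parameter is a query parameter named
    "batch_size"; verify against the released OpenSearch documentation.
    """
    url = f"/_bulk?pipeline={pipeline}&batch_size={batch_size}"
    lines = []
    for i, doc in enumerate(docs, start=1):
        # Each document is an action line followed by a source line.
        lines.append(json.dumps({"index": {"_index": index, "_id": str(i)}}))
        lines.append(json.dumps(doc))
    # _bulk bodies must be newline-delimited and end with a trailing newline.
    body = "\n".join(lines) + "\n"
    return url, body

url, body = bulk_request(
    [{"text": "hello"}, {"text": "world"}], "my-index", "nlp-pipeline", 2
)
print(url)  # /_bulk?pipeline=nlp-pipeline&batch_size=2
```

The intent of the parameter, as described above, is that ML processors such as `text_embedding` can send documents to the external model server in batches instead of one call per document.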