opensearch-project / opensearch-learning-to-rank-base

Fork of https://github.com/o19s/elasticsearch-learning-to-rank to work with OpenSearch
Apache License 2.0

[FEATURE] Support remote inference on LTR plugin #27

Open noCharger opened 6 months ago

noCharger commented 6 months ago

Is your feature request related to a problem?


As a developer, I want support for large ML models (even LLMs) for remote inference.

What solution would you like?


One direction is to integrate with the ml-commons plugin, which already has this capability.
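
For illustration, a minimal sketch of how a caller invokes a remote model that has already been registered through an ml-commons connector, which is the capability the LTR plugin would delegate to. The model id, credentials, and request body shape are placeholders; the predict endpoint follows the remote-models documentation linked below.

```python
# Sketch only: invoking an ml-commons remote model over the REST API.
# The model id and parameters are placeholders, not values from this repo.
import requests

OPENSEARCH = "https://localhost:9200"
MODEL_ID = "<remote-model-id>"  # hypothetical id returned when the remote model is registered


def predict(parameters: dict) -> dict:
    """Call the ml-commons predict endpoint for an already-registered remote model."""
    resp = requests.post(
        f"{OPENSEARCH}/_plugins/_ml/models/{MODEL_ID}/_predict",
        json={"parameters": parameters},  # body shape depends on the connector definition
        auth=("admin", "admin"),          # demo credentials; replace in a real cluster
        verify=False,                     # only for self-signed demo certs
    )
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    print(predict({"inputs": "example ranking features"}))
```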

What alternatives have you considered?


  1. Native support in the LTR plugin for connecting to remote models directly (not just HTTP connections; RPC calls could work too). See the sketch after this list.
  2. A dedicated ML node within the cluster for training and inference.
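
For alternative 1, a purely hypothetical sketch of what a direct remote connection from the LTR plugin could look like, shown here as an HTTP call (an RPC transport would be analogous). The endpoint, payload shape, and response schema are all assumptions, not an existing API.

```python
# Hypothetical sketch of alternative 1: the LTR plugin calling a remote model
# endpoint directly instead of going through ml-commons.
import requests

REMOTE_SCORER_URL = "https://model-host.example.com/v1/score"  # hypothetical endpoint


def score_remotely(feature_vector: list[float]) -> float:
    """Send an LTR feature vector to a remote model and return its relevance score."""
    resp = requests.post(
        REMOTE_SCORER_URL,
        json={"features": feature_vector},  # payload shape is an assumption
        timeout=2.0,                        # inference latency directly affects query latency
    )
    resp.raise_for_status()
    return float(resp.json()["score"])      # response schema is an assumption


if __name__ == "__main__":
    print(score_remotely([0.3, 12.0, 1.0]))
```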

Do you have any additional context?


Ref: https://github.com/opensearch-project/opensearch-learning-to-rank-base/issues/26

Public doc: https://opensearch.org/docs/latest/ml-commons-plugin/remote-models/index/

Trade-off understanding: