Closed demjened closed 9 months ago
This PR adds support for importing a model with a specific `inference_config`. This allows us to use regression models with the `learning_to_rank` inference type that come with externally supplied configuration.
Example:
```python
MLModel.import_model(
    es_client,
    model_id,
    regressor,
    feature_names,
    es_if_exists="replace",
    es_compress_model_definition=compress_model_definition,
    inference_config={
        "learning_to_rank": {
            "feature_extractors": [
                {
                    "query_extractor": {
                        "feature_name": "title_bm25",
                        "query": {"match": {"title": "{{query_string}}"}},
                    }
                },
                {
                    "query_extractor": {
                        "feature_name": "imdb_rating",
                        "query": {
                            "script_score": {
                                "query": {"exists": {"field": "imdbRating"}},
                                "script": {"source": 'return doc["imdbRating"].value;'},
                            }
                        },
                    }
                },
            ]
        }
    },
)
```
The change is backward compatible: if `inference_config` is not passed, it will be set to the default `{<model_type>: {}}`.
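The fallback described above can be sketched as a small helper. This is an illustrative sketch only (the function name `default_inference_config` is hypothetical, not part of the eland API); it shows the shape of the default config keyed by the model type when no `inference_config` is supplied.

```python
def default_inference_config(model_type: str) -> dict:
    # Hypothetical helper illustrating the PR's fallback behavior:
    # when no inference_config is passed, an empty config keyed by
    # the model type is used, e.g. {"regression": {}}.
    return {model_type: {}}

print(default_inference_config("regression"))  # {'regression': {}}
```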