HumanSignal / label-studio

Label Studio is a multi-type data labeling and annotation tool with standardized output format
https://labelstud.io
Apache License 2.0

[Warning] Batch processing not supported switched to one-by-one task retrieval - ML Backend #1755

Open · krnithishkumar opened 2 years ago

krnithishkumar commented 2 years ago

Describe the bug After integrating a custom ML backend with Label Studio, annotating a particular file triggers the warning [ml.models::predict_tasks::183] [WARNING] 'ML backend 'Staged Model' doesn't support batch processing of tasks, switched to one-by-one task retrieval, and the prediction call is made twice for the same file.

I need to know why the backend is called twice; this should be fixed so that only a single call is made to the backend.

This also occurs with the simple text classifier from the provided examples.

Parent ticket: https://github.com/heartexlabs/label-studio-ml-backend/issues/64
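
For reference, here is a minimal sketch of what a batch-aware backend looks like, assuming the `LabelStudioMLBase` interface from the label-studio-ml SDK used in the examples; the class name, control/object tag names, and label value below are placeholders, not taken from this issue:

```python
# Minimal sketch of a batch-aware backend, assuming the label-studio-ml SDK
# interface used in its examples (LabelStudioMLBase with predict(tasks, ...)).
# Tag names ('label', 'text') and the 'Positive' choice are placeholders.
from label_studio_ml.model import LabelStudioMLBase


class StagedModel(LabelStudioMLBase):
    def predict(self, tasks, **kwargs):
        # `tasks` arrives as a list, so a single call can cover the whole
        # batch; one prediction dict is returned per task, in the same order.
        predictions = []
        for task in tasks:
            predictions.append({
                'result': [{
                    'from_name': 'label',   # assumed Choices control tag name
                    'to_name': 'text',      # assumed object tag name
                    'type': 'choices',
                    'value': {'choices': ['Positive']},  # placeholder label
                }],
                'score': 1.0,
            })
        return predictions
```

Even with `predict()` accepting the full task list like this, the warning above is still logged, which is why it looks like the fallback (and the duplicate call) happens on the Label Studio side rather than in the backend code.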

To Reproduce Steps to reproduce the behavior:

  1. Initialize and set up an ML backend for auto-predictions. It is served on a particular host.
  2. In a Label Studio project, open the Machine Learning settings and add a model with the backend's host URL.
  3. Enable the Machine Learning preference "Show predictions to annotators in the Label Stream and Quick View".
  4. Upload a new file and open it for annotation.
  5. Check the logs (a request sketch follows this list); the following warning appears: [2021-11-18 05:22:20,285] [ml.models::predict_tasks::183] [WARNING] 'ML backend 'Staged Model' doesn't support batch processing of tasks, switched to one-by-one task retrieval
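
To rule out the backend itself, the backend can be called directly with several tasks in one request. This is only a hypothetical check: the endpoint path, port, and payload shape below are assumptions based on the SDK's default WSGI app and should be adjusted to match the actual backend.

```python
# Hypothetical direct check of the ML backend, outside Label Studio, to see
# whether a single /predict call with several tasks returns one prediction
# per task. Endpoint path, port, and payload keys are assumptions based on
# the label-studio-ml SDK's default WSGI app.
import requests

ML_BACKEND_URL = 'http://localhost:9090'  # assumed host/port of the backend

payload = {
    'tasks': [
        {'id': 1, 'data': {'text': 'first example'}},
        {'id': 2, 'data': {'text': 'second example'}},
    ]
}

resp = requests.post(f'{ML_BACKEND_URL}/predict', json=payload, timeout=30)
resp.raise_for_status()
print(resp.json())  # expect one prediction entry per task from a single call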

Expected behavior A single call to the backend is expected for the prediction.

Screenshots

  1. At the custom ML backend server (screenshot attached)
  2. Logs from Label Studio (screenshot attached)

Environment (please complete the following information):

makseq commented 2 years ago

https://github.com/heartexlabs/label-studio-ml-backend/issues/64