HumanSignal / label-studio

Label Studio is a multi-type data labeling and annotation tool with standardized output format
https://labelstud.io
Apache License 2.0

sampling method not updating when specified in a custom config.py file #546

Closed ghost closed 3 years ago

ghost commented 3 years ago

Describe the bug I specified the sampling method "prediction-score-min" in my custom config.json, but the "sampling" field keeps the value "sequential" in the final config. What is strange is that my custom config.json also sets "enable_predictions_button": true, and that value is kept in the final config, so the bug seems to be related to the "sampling" field only.

I used the command: label-studio start my_project --init -c config.json -l config.xml --ml-backends http://ml-backend:9090

Custom config.json:

{
  "title": "Label Studio",
  "description": "default",
  "protocol": "http://",
  "host": "0.0.0.0",
  "port": 8080,
  "debug": false,
  "label_config": "config.xml",
  "output_dir": "completions",
  "instruction": "<img src='static/images/ls_logo.png'><br> Type some <b>hypertext</b> for annotators here!<br> <a href='https://labelstud.io/guide/labeling.html'>Read more</a> about the labeling interface.",
  "allow_delete_completions": true,
  "templates_dir": "examples",
  "editor": {
    "debug": false
  },
  "sampling": "prediction-score-min",
  "enable_predictions_button": true,

  "task_page_auto_update_timer": 10000,
  "show_project_links_in_multisession": true,
  "source": {
    "name": "Tasks",
    "type": "tasks-json",
    "path": "tasks.json"
  },
  "target": {
    "name": "Completions",
    "type": "completions-dir",
    "path": "completions"
  }
}

Final config.json:

{
  "title": "Label Studio",
  "description": "default",
  "protocol": "http://",
  "host": "0.0.0.0",
  "port": 8080,
  "debug": false,
  "label_config": "config.xml",
  "output_dir": "completions",
  "instruction": "<img src='static/images/ls_logo.png'><br> Type some <b>hypertext</b> for annotators here!<br> <a href='https://labelstud.io/guide/labeling.html'>Read more</a> about the labeling interface.",
  "allow_delete_completions": true,
  "templates_dir": "examples",
  "editor": {
    "debug": false
  },
  "sampling": "sequential",
  "enable_predictions_button": true,
  "task_page_auto_update_timer": 10000,
  "show_project_links_in_multisession": true,
  "source": {
    "name": "Tasks",
    "type": "tasks-json",
    "path": "tasks.json"
  },
  "target": {
    "name": "Completions",
    "type": "completions-dir",
    "path": "completions"
  },
  "input_path": "tasks.json",
  "ml_backends": [
    {
      "url": "http://ml-backend:9090",
      "name": "my_projecteb99"
    }
  ]
}
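A quick way to see exactly which fields the server overrode is to diff the two JSON files programmatically. This is a minimal sketch, with the file contents abbreviated to the two fields under discussion:

```python
import json

# Abbreviated copies of the two configs, keeping only the relevant fields.
custom = json.loads('{"sampling": "prediction-score-min", "enable_predictions_button": true}')
final = json.loads('{"sampling": "sequential", "enable_predictions_button": true}')

# Keys present in the custom config whose values differ in the final config.
overridden = {k for k in custom if final.get(k) != custom[k]}
print(overridden)  # only "sampling" was overridden
```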

To Reproduce Steps to reproduce the behavior:

  1. create a docker container with the heartexlabs/label-studio image as base
  2. copy the custom config.json and config.xml files into the container
  3. run the command label-studio start my_project --init -c config.json

Expected behavior The "sampling" field should keep the value "prediction-score-min".

Environment:

It's my first issue on any open source project, so sorry in advance if there is not enough info. I love the project, so feel free to ask if you need more information to tackle this problem :)

makseq commented 3 years ago

@AchilleSou Could you check label-studio==0.9.0? We've introduced a new data manager there, and sampling is now equivalent to ordering by the prediction score column in the data manager. Maybe this feature will be more helpful for you than the old way.

ghost commented 3 years ago

I did that; the new system makes it easier to organise the order of the tasks, so thanks!

But it does not seem to take the "score" value from my model prediction results into account...

This is my task after my model ran prediction on it: (screenshot)

But the prediction score value stays at 0 in the task table: (screenshot)

I could not find anything about a prediction template change, so please tell me if I'm doing something wrong.
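For reference, if I recall the import format correctly, Label Studio reads the prediction score from a top-level "score" field on each prediction object in the task. A minimal sketch of such a task; the "text" data key and the label values are made-up placeholders:

```python
import json

# Hypothetical imported task; the data key and label names are placeholders.
task = {
    "data": {"text": "some example input"},
    "predictions": [
        {
            "result": [
                {
                    "from_name": "label",
                    "to_name": "text",
                    "type": "choices",
                    "value": {"choices": ["Positive"]},
                }
            ],
            # The field Label Studio reads for prediction-score ordering.
            "score": 0.87,
        }
    ],
}
print(json.dumps(task, indent=2))
```

If the score sits anywhere else (e.g. inside "result"), the task table would show 0, which matches the symptom above.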

makseq commented 3 years ago

@AchilleSou Sorry, it's my fault; that's a bug. Check this PR branch: https://github.com/heartexlabs/label-studio/pull/551, it should work.

ghost commented 3 years ago

Sorry for the delayed answer, it works like a charm now! I'm closing the issue, thank you for your time :)