We often train multiple models that each search for specific things in an image. When an object is found, it is passed to other models for its corresponding category.
For example: the first model's job is to find the "big" objects, and each detection is then passed to the right second model, specifically trained for that big-object class, to find the "smaller parts". This is two levels of nesting, but we should be able to add more!
I haven't found an annotation format / tool that supports that yet. Also, a change to a "parent" object should (or should not) change the children's positions, following a user-defined strategy. For example, resizing the "left" side of a bounding box by -15px should translate all the children's positions by -15px if the nested-change strategy says to do so.
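To make the propagation idea concrete, here is a minimal sketch of what such a nested annotation with a "translate" change strategy could look like. All names here (`Box`, `resize_left`, etc.) are hypothetical, not from any existing annotation tool:

```python
# Hypothetical nested-annotation sketch; not tied to any existing format/tool.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Box:
    label: str
    x: int  # left edge
    y: int  # top edge
    w: int
    h: int
    children: List["Box"] = field(default_factory=list)

    def translate(self, dx: int, dy: int) -> None:
        """Move this box and, recursively, all nested children."""
        self.x += dx
        self.y += dy
        for child in self.children:
            child.translate(dx, dy)

    def resize_left(self, dx: int, propagate: bool = True) -> None:
        """Move the left edge by dx pixels (negative dx grows the box leftward)."""
        self.x += dx
        self.w -= dx
        if propagate:
            # "translate" strategy: children follow the moved edge
            for child in self.children:
                child.translate(dx, 0)


car = Box("car", x=100, y=50, w=200, h=120, children=[
    Box("wheel", x=110, y=140, w=40, h=30),
])
car.resize_left(-15)  # grow the car box 15px to the left
print(car.x, car.children[0].x)  # → 85 95
```

With `propagate=False`, the same edit would leave the children where they are; a real tool would presumably let the user pick the strategy per edit or per class.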
Here is a basic example of nested annotations: