Open bbernhard opened 6 years ago
this would definitely be useful (e.g. looking for part-annotated images to finish off), and a useful way to take a look at general progress.
regarding details (ear, nose, etc.), perhaps displaying both the area and the individual annotation outlines (but fainter/thinner than usual) would give an indication of the current granularity of annotation
short update:
I am currently working on a few extensions to the query language which make it possible to filter results by annotation coverage and image size. I'm not yet sure about the final syntax, but to give an impression, here are a few examples:
```
annotation.coverage < 50%
annotation.coverage >= 10% & annotation.coverage <= 50%
image.width > 100px
image.height > 100px
```
Of course they can also be combined:
`dog & annotation.coverage > 10% & image.width > 100px`
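For illustration, here is a minimal sketch of how such expressions could be split into clauses. This is purely hypothetical Python; the actual ImageMonkey parser isn't shown in this thread, and all names here are made up. It treats the unit as mandatory, mirroring the query examples above:

```python
import re

# Hypothetical clause shape: <field> <op> <number><unit>, joined by '&'.
# Bare labels like 'dog' are passed through untouched.
CLAUSE = re.compile(
    r"^(?P<field>[a-z.]+)\s*(?P<op><=|>=|<|>|=)\s*(?P<value>\d+)(?P<unit>px|%)$"
)

def parse(query):
    clauses = []
    for part in query.split("&"):
        part = part.strip()
        m = CLAUSE.match(part)
        if m is None:
            if re.search(r"[<>=]", part):
                # looks like a comparison but has no/bad unit
                raise ValueError(f"missing unit (px, %) in clause: {part!r}")
            clauses.append(("label", part))
            continue
        clauses.append((m["field"], m["op"], int(m["value"]), m["unit"]))
    return clauses
```

Running it on the combined example above would yield a label clause plus two comparison clauses that a backend could translate into SQL conditions.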
There are still a few minor things left until I can push it to production, but I did some initial testing on the production system today and it's looking pretty good so far. (I have to admit I underestimated that feature a bit... it was way more complicated than I anticipated.) I hope it can go live in the next few days.
regarding the label graph: I finally found a good graphing library that can deal with thousands of nodes and links. While the old DOM-based implementation takes more than 30 seconds until @dobkeratops's label graph is laid out (with Firefox going nuts for several seconds), the new canvas-based approach takes roughly one second. It's like night and day. There's still a bit left to do on the implementation side, but once it's finished I think it will be pretty cool. :)
That would be awesome! I can definitely imagine this being useful - being able to separate images that focus on something vs. labels of incidental objects.
I suppose if you've got that kind of data lying around you could also use it in search criteria, although by default random order is fine.
I look forward to seeing the new label graph too
@dobkeratops
ok, the changes are now live. It's now possible to query for `annotation.coverage`, `image.width` and `image.height`. Currently, those query options are only available in the browse annotation and browse label view, and not in the browse annotation refinement or the export view.
If you want to give it a try, here are a few sample queries:
Current limitations:

- Units (`px`, `%`) are mandatory; without them (or with a mismatched one like `image.width = 500%`) the query parser won't accept the query.
- Although a standalone `annotation.coverage > 50%` is syntactically correct, you get an error. Workaround: just combine it with a label, e.g. `~tree & annotation.coverage > 50%`. (This only affects the browse annotation view; in the browse label view it should work even without a label.)

Besides the query extensions, the new label graph is now also live. Feel free to check it out and tell me what you think.
edit: in case you are wondering: the label graph view has an option to "highlight all existing nodes". At the moment only a handful of labels get colored, although many more already exist. That's not a bug; I was just too lazy to go through all the labels in the label graph to mark them. ;) My main intention was just to show a little proof of concept of an idea I had... I wasn't sure whether it's useful.
I know, all that is still pretty rough and not very polished, but I hope that at least some of those things are useful and worth pursuing :)
edit #2: I also changed something regarding the autocomplete. An autocomplete suggestion will now only be shown after at least three characters are entered and after a delay of 700ms. I hope this fixes the browser freeze without having a negative impact on the UX. Please let me know in case the problem still persists, @dobkeratops
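The behaviour described (a minimum of three characters plus a 700 ms inactivity delay) is essentially a debounce. The real implementation is client-side JavaScript and isn't shown here, but as a hypothetical sketch of the same logic in Python (all names are made up):

```python
import threading

class DebouncedAutocomplete:
    """Hypothetical sketch: fire suggestions only after a minimum number
    of characters and a pause in typing (here modelled with a Timer)."""

    def __init__(self, suggest, min_chars=3, delay=0.7):
        self.suggest = suggest      # callback invoked with the final text
        self.min_chars = min_chars
        self.delay = delay          # seconds of inactivity before firing
        self._timer = None

    def on_input(self, text):
        if self._timer:
            self._timer.cancel()    # typing again resets the delay
        if len(text) < self.min_chars:
            return                  # too short: never schedule a suggestion
        self._timer = threading.Timer(self.delay, self.suggest, args=(text,))
        self._timer.start()
```

With this scheme, rapid keystrokes keep cancelling the pending timer, so only the last input after the user pauses actually triggers a suggestion request.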
awesome.. the browser freeze is strangely intermittent (sometimes it's fine) - what you describe sounds like a great compromise. I look forward to trying out the rest. Thanks for the other recent labels, incidentally.. things like steps & gate; great to be able to do those building details
While working on #188, another idea came to my mind. I think it would be kinda cool if you could browse images by the percentage that's already annotated, e.g. `annotated < 50%` -> show only those images where the total summed-up area of all polygons covers less than half of the image's area. I think this could be a pretty interesting way to query the dataset. It might not work for all types of images (e.g. imagine we have a picture like this: https://images.pexels.com/photos/220453/pexels-photo-220453.jpeg?auto=compress&cs=tinysrgb&h=750&w=1260 - if there is now a bounding rect drawn around the person, the algorithm would detect that almost 100% of the picture is annotated, although there are plenty of details left: ear, nose, eye, ...), but I think it could at least work for crowded scenes with a lot of details.
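A quick sketch of how such a coverage percentage could be computed; this is an assumption, not the project's actual code, and the function names are invented. Naively summing polygon areas (shoelace formula) also illustrates the caveat above: overlapping polygons and one big bounding rect inflate the number even when many details remain unannotated.

```python
def polygon_area(points):
    """Area of a simple polygon via the shoelace formula.
    points is a list of (x, y) vertices in order."""
    area = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

def annotation_coverage(polygons, width, height):
    """Fraction of the image covered by annotations.
    Note: sums areas rather than taking their union, so overlapping
    polygons are counted twice; the result is clamped at 1.0."""
    total = sum(polygon_area(p) for p in polygons)
    return min(total / (width * height), 1.0)
```

A more faithful metric would rasterize the polygons (or compute their geometric union) before dividing by the image area, so overlaps don't over-count.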