umyelab / LabGym

Quantify user-defined behaviors.
GNU General Public License v3.0

Environmental Features Labelling #33

Closed by WttBe 9 months ago

WttBe commented 1 year ago

Hello,

I was wondering if you planned to extend the software with the ability to label environmental features, such as different zones or compartments? For environmental changes, I suppose we could encode those as specific behaviours?

Thanks in advance,

WB

yujiahu415 commented 1 year ago

Hi, I think the current version of LabGym can already do this:

1. Use LabGym's 'Generate image examples' to extract some frames from your video.
2. Use Roboflow (https://roboflow.com) or any online annotation tool to annotate the outlines of the 'environmental features' (select the COCO instance segmentation format).
3. Input the annotated images into 'Train Detectors' to train a Detector in LabGym.
4. In 'Generate behavior examples', use the trained Detector to generate some 'behavior' examples and sort them.
5. Train a Categorizer in 'Train Categorizers' to identify the 'behaviors'.
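For step 2, the annotations are exported as a single COCO instance segmentation JSON file. As a rough sketch of what that structure looks like (the file name and the 'arena' category below are made-up examples, not anything LabGym prescribes), here is a minimal annotation and a quick sanity check you can run before feeding the file to 'Train Detectors':

```python
import json

# Hypothetical minimal COCO instance-segmentation export, as produced
# by tools like Roboflow. 'frame_0001.jpg' and 'arena' are examples.
annotations = {
    "images": [{"id": 1, "file_name": "frame_0001.jpg", "width": 640, "height": 480}],
    "categories": [{"id": 1, "name": "arena"}],
    "annotations": [{
        "id": 1,
        "image_id": 1,
        "category_id": 1,
        # Polygon outline of the feature: [x1, y1, x2, y2, ...]
        "segmentation": [[100, 100, 300, 100, 300, 300, 100, 300]],
        "bbox": [100, 100, 200, 200],  # [x, y, width, height]
        "area": 40000,
        "iscrowd": 0,
    }],
}

def check_coco(data):
    """Sanity-check cross-references and polygons in a COCO export."""
    image_ids = {img["id"] for img in data["images"]}
    category_ids = {cat["id"] for cat in data["categories"]}
    for ann in data["annotations"]:
        assert ann["image_id"] in image_ids, "annotation references a missing image"
        assert ann["category_id"] in category_ids, "unknown category id"
        for polygon in ann["segmentation"]:
            # A valid polygon needs at least 3 (x, y) points.
            assert len(polygon) >= 6 and len(polygon) % 2 == 0, "bad polygon"
    return True

print(check_coco(json.loads(json.dumps(annotations))))  # True
```

Catching a broken image id or degenerate polygon at this stage is much cheaper than debugging a failed Detector training run.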

Let me know if I understood your question correctly / if this is what you wanted to do. The current version only supports the detection of one category of 'environmental feature', but in the next update (which should be done in a couple of days), LabGym will support the detection of multiple categories of objects of interest.

WttBe commented 1 year ago

Hi,

Thanks a lot for your answer! Using Roboflow would indeed work for feature labelling. However, it seems harder to make it work for identifying different areas within the same experimental setting while also locating the individuals across those zones. Also (I may have missed this point in the paper), it would be interesting to be able to identify which area an individual is watching at a specific event.

Best regards,

WB

yujiahu415 commented 1 year ago

Sorry, it seems I didn't get what you meant previously. So is what you want to identify something like labeling certain areas and seeing whether the animal goes there / whether it is watching towards (or facing) there? If so, the current LabGym may have difficulty doing this, but 2.0 should be able to. 2.0 will let you label different kinds of objects / animals and identify the interactive behaviors among them, so I guess the different areas could be labeled as different 'objects' in this case. 2.0 will be released in early July.

WttBe commented 1 year ago

No problem, I should have been clearer!

That's exactly it! July is around the corner, so we'll be there in no time. I'll wait for version 2.

Best regards,

yujiahu415 commented 1 year ago

Hi,

Just so you know, LabGym 2 is out. It can now analyze complex interactive behaviors.

WttBe commented 1 year ago

Hi,

Thanks, I'll take a look in the next few days!

Cheers,