Closed jevillacres closed 4 years ago
Hi Enrique,
With the app, the angle doesn't matter anymore. I worked on a rat reach dataset with 16 points (digits, dorsum of hand, and nose) viewed from the front while the rat grabs the pellet (which means at times it's viewed from an angle). Hope this answers your question!
Alex
Thank you for the quick response! What about the MATLAB package? Should I expect to run into any issues there?
Yes, the MATLAB version does not work well with different angles, or with more or fewer points than the six used in the preprint. I've used it for top-down and bottom-up views with 6 points outlining the animal and it worked well (snout, front paws/shoulders, hind paws/hips, and tail-base). The app uses a Python-specific implementation, UMAP, that handles this sort of high-dimensional feature better, so I do not envision working on a MATLAB equivalent anytime soon.
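For anyone curious what "features" from 6 tracked points look like in practice, here is a minimal sketch in the spirit of pose-based featurization: pairwise inter-point distances plus per-point displacement speeds. This is a simplified illustration, not B-SOiD's exact feature set, and all names here are hypothetical.

```python
import numpy as np

def pose_features(xy):
    """Compute simple pose features from an (n_frames, n_points, 2)
    array of (x, y) coordinates. A sketch only, not B-SOiD's pipeline."""
    n_frames, n_points, _ = xy.shape
    # Pairwise distances between all tracked points, per frame.
    diffs = xy[:, :, None, :] - xy[:, None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    iu = np.triu_indices(n_points, k=1)
    pair_dists = dists[:, iu[0], iu[1]]   # (n_frames, n_pairs)
    # Displacement speed of each point between consecutive frames.
    speeds = np.linalg.norm(np.diff(xy, axis=0), axis=-1)  # (n_frames - 1, n_points)
    return pair_dists, speeds

# 100 frames of 6 tracked points (dummy data for illustration).
xy = np.random.rand(100, 6, 2)
d, s = pose_features(xy)
print(d.shape, s.shape)  # → (100, 15) (99, 6)
```

With 6 points this yields 15 distance features per frame, which is the kind of high-dimensional space a nonlinear embedding like UMAP handles well.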
Thanks for your issue! I'll close it now. Feel free to reopen if this does not solve your question!
Actually, I stand corrected @jevillacres. If you have the 6 points outlining the animal (snout, forepaws, hind paws, and tail-base), the features are still calculated correctly from the side, as long as you order them snout, forepaw1, forepaw2, hindpaw1, hindpaw2, tail-base. I've never tried it on a side view, but theoretically it should work. One issue might be that it groups together behaviors in which the animal faces away from the camera (obstruction is a feature in B-SOiD).
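If your tracking software exports the points in a different order, reordering the array before feeding it in is straightforward. A minimal sketch, assuming hypothetical label names and an (n_frames, n_points, 2) coordinate array:

```python
import numpy as np

# Hypothetical label order as exported by your tracking software.
tracked = ["tail-base", "snout", "hindpaw2", "forepaw1", "hindpaw1", "forepaw2"]
# The ordering suggested in the thread above.
wanted = ["snout", "forepaw1", "forepaw2", "hindpaw1", "hindpaw2", "tail-base"]

# Index map from the exported order to the wanted order.
idx = [tracked.index(name) for name in wanted]

xy = np.random.rand(500, 6, 2)   # 500 frames, 6 points, (x, y)
xy_ordered = xy[:, idx, :]       # columns now run snout ... tail-base
```

The same index list can be reused for any per-point array (e.g. likelihood scores), so the reordering stays consistent across all exported columns.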
Hello! Thank you for releasing this program.
I'm interested in using it to classify rat behavior in an open field; however, the videos were captured with cameras angled for a side view. I watched your demonstration on YouTube to get a better sense of how to implement B-SOiD, and I'm wondering if I should expect compatibility issues with our videos' angle, even after teasing out all possible covariant features. Do you have any suggestions?
All the best, Enrique