-
Hi!
First of all, thank you very much for this awesome work :)
I have been trying to understand in detail the prediction format of the boxes.
In the paper it is said:
![image](https://user-images.…
-
Is it possible to have the logout dialog at this place:
![logoutdlg3](https://github.com/seb3773/Q4OSseb/assets/99963207/fb098b99-c625-4c0e-b44d-6aa28dede4d9)
instead of the center of the screen ? (…
-
Pop growth would depend on fertility rate, marriages (or non-marital activities), and healthcare.
All fertility calculations are done on female pops.
Fertility would be affected by stuff like access/u…
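A minimal sketch of how the proposed model could be computed, assuming hypothetical names (`Pop`, `fertility_rate`, `healthcare`) since the actual implementation isn't shown; the only constraint taken from the proposal is that fertility is calculated on female pops only:

```python
from dataclasses import dataclass

# Illustrative sketch only; field names and scaling are assumptions,
# not actual game code.
@dataclass
class Pop:
    size: int             # total individuals in the pop
    female_share: float   # fraction of the pop that is female
    fertility_rate: float # expected births per female per tick
    healthcare: float     # 0..1 multiplier for birth/infant survival

def births_per_tick(pop: Pop) -> int:
    # Per the proposal, fertility calculations run on female pops only.
    females = pop.size * pop.female_share
    return round(females * pop.fertility_rate * pop.healthcare)

print(births_per_tick(Pop(size=1000, female_share=0.5,
                          fertility_rate=0.04, healthcare=0.9)))  # → 18
```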
-
Hi,
I read your paper with great excitement. I would like to ask if there is any chance that you can provide the trained model for the MVExoNet. I'm a student who works on action recognition and your MVExoN…
-
Hello,
Thank you for providing the code to generate the amazing plots!
I want to plot egocentric rate maps with a fixed object as the reference and use the animal's head direction defined from …
-
Thank you for your great work.
Building on your impressive work, I'm currently converting the images from the AGD20k dataset into depth maps and subsequently into point clouds. Would it be…
-
(Arxiv 24.09.05) MADiff: Motion-Aware Mamba Diffusion Models for Hand Trajectory Prediction on Egocentric Videos [Paper](https://arxiv.org/abs/2409.02638)
probably in the "diffusion" category. Than…
-
Hello,
Thank you for open-sourcing this amazing project!
I have a question about the convention for the transformation of the 3D box. EgoNet only produces an egocentric pose (i.e. camera coordinat…
-
Dear EGOEXO4D team,
Thanks for releasing the dataset and challenges.
After reading the documentation and playing with the code, I still find some confusing points.
1. EgoPose is annotated with fr…
-
Recently, I wanted to add a functionality to the `play` plugin, which would allow the user (aka egocentric me) to do something like `beet play --template awesome` in order to play preconfigured querie…