Open — nikunjsanghai opened this issue 4 weeks ago
Yes, the ordering is correct. The instructions for running the calibration and capture are in the README:
@loolirer can give you more details if something remains unclear.
@nikunjsanghai is your issue solved?
@debOliveira thank you for the support. Both @nikunjsanghai and I are attempting to recreate the results in your conference paper, and so far things are going smoothly. We are waiting for our IR LEDs to arrive this week, and we will let you know if any issues arise. In case you are curious, we are putting together the following experimental rig to evaluate a SLAM algorithm I have developed. As shown, we are using three Camera Module 2 NoIR cameras fixed to a 3D-printed plate, and we aim to track a drone ($F_R$) with respect to the camera cluster ($F_C$) in order to (ultimately) resolve frame $F_J$ in the world coordinate frame ($F_W$).
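For reference, the frame resolution described above amounts to chaining rigid-body transforms. Below is a minimal sketch (not code from this repository; all poses and the `make_transform` helper are illustrative assumptions) showing how $F_J$ could be expressed in $F_W$ by composing homogeneous transforms:

```python
# Hypothetical sketch: composing 4x4 homogeneous transforms to express
# the tracked frame F_J in the world frame F_W. Poses are made-up examples.
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Assumed example poses (identity rotations, arbitrary translations):
T_W_C = make_transform(np.eye(3), [1.0, 0.0, 0.5])  # camera cluster F_C in world F_W
T_C_R = make_transform(np.eye(3), [0.0, 2.0, 0.0])  # drone F_R in cluster frame F_C
T_R_J = make_transform(np.eye(3), [0.1, 0.0, 0.0])  # frame F_J rigidly attached to F_R

# Chain the transforms: pose of F_J expressed in the world frame F_W.
T_W_J = T_W_C @ T_C_R @ T_R_J
print(T_W_J[:3, 3])  # origin of F_J in F_W -> [1.1 2.  0.5]
```

With real data, `T_C_R` would come from the motion capture measurement and `T_W_C` from an extrinsic calibration of the camera cluster against the world frame.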
Hey @adthoms,
That is excellent work! Please keep me posted when the paper comes out; we may apply it. Contact us if issues arise or if you want to brainstorm some ideas.
@loolirer developed a digital twin for our arena to facilitate prototyping. Maybe it is useful for some of your visualizations.
**Is your feature request related to a problem? Please describe.**
No.

**Describe the solution you'd like**
From the existing documentation, it is not entirely clear how to operate the entire motion capture system. From our understanding, there is a central script, `mocaprasp.py`, which would call the following functions in order:

**Describe alternatives you've considered**
NA

**Additional context**
NA