cboulay opened this issue 8 years ago
As part of the DS4 changes that I'm pulling up to master, each tracker computes an "estimated optical pose" of the controller. Part of the info in this struct is a "projected pixel area". When filtering the position and orientation, we take the average pixel area (averaged across all trackers that can see the controller) and turn it into a "tracker quality" value. The min and max pixel areas needed to compute the tracker quality for the position and orientation filters are editable in the controller configs. I currently use the tracker quality value to blend between optical position tracking and IMU tracking on the DS4, and to weight the optical orientation (used to fight drift) on the DS4.
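For the sake of discussion, here's a minimal sketch of what that mapping and blend could look like. The struct and function names here are hypothetical, not the actual ones in the branch; the only things taken from the description above are the min/max pixel-area config values, the clamped [0, 1] quality, and the quality-weighted blend between IMU and optical data.

```cpp
#include <algorithm>

// Hypothetical stand-ins for the real controller-config fields.
struct OpticalQualityConfig {
    float min_pixel_area; // at or below this, optical data is fully distrusted (quality = 0)
    float max_pixel_area; // at or above this, optical data is fully trusted (quality = 1)
};

// Map the average projected pixel area (averaged over all trackers that can
// see the controller) to a [0, 1] tracker quality value.
inline float compute_tracker_quality(float avg_pixel_area, const OpticalQualityConfig &cfg)
{
    const float range = cfg.max_pixel_area - cfg.min_pixel_area;
    if (range <= 0.f)
        return 0.f;
    const float t = (avg_pixel_area - cfg.min_pixel_area) / range;
    return std::max(0.f, std::min(1.f, t));
}

// Blend an IMU-derived value toward the optical value by the quality weight
// (applied per component for position, or as a slerp weight for orientation).
inline float blend(float imu_value, float optical_value, float quality)
{
    return imu_value + quality * (optical_value - imu_value);
}
```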
This likely isn't the final word on the optical quality model, but I figure it's a starting point to experiment with, and it helps get the quality-rating framework in place.
As I mentioned in #13, there needs to be a good way to evaluate the quality of the optical position estimate so we know whether or not it should be used to correct the state-space model. We shouldn't implement this until #13 is done, because we'll use the model itself to help with the quality estimation. Just writing down a few thoughts for now...
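One way the model could help, as a rough illustration only (not a proposal for the actual implementation, and all names here are hypothetical): compare each optical position measurement against the filter's predicted position and reject measurements whose innovation is improbably large, e.g. a simple 3-sigma gate.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };

inline float distance(const Vec3 &a, const Vec3 &b)
{
    const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Returns true if the optical measurement is close enough to the model's
// predicted position to be used in the measurement update. `sigma` would be
// the predicted position standard deviation pulled from the filter covariance.
inline bool passes_innovation_gate(const Vec3 &optical, const Vec3 &predicted, float sigma)
{
    return distance(optical, predicted) <= 3.f * sigma;
}
```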