The ultimate test of how easily we can integrate our subsystems (vibe check FAILED)
removed outdated robot system, too theoretical
TX now considers rotation of turret in calculation
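The turret correction can be sketched as below. This is a minimal illustration, not the actual code: the names `tx` (camera's horizontal offset to the target) and `turretAngle` are assumptions about what the calculation uses.

```java
public class TurretAim {
    /**
     * Target angle relative to the chassis, in degrees.
     * Before this change, tx alone was used, which is only correct when the
     * turret is pointing straight ahead; adding the turret's rotation gives
     * the chassis-relative bearing to the target.
     *
     * @param tx          horizontal offset reported by the camera, degrees
     * @param turretAngle turret rotation relative to the chassis, degrees
     */
    public static double chassisRelativeTargetAngle(double tx, double turretAngle) {
        return turretAngle + tx;
    }
}
```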
added more values to the config describing the robot's structure and physics
NavigationSubsystem now needs a VisionSubsystem so it can see the target for the KF
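The dependency looks roughly like the sketch below. The `getTargetOffset()` method on `VisionSubsystem` is a hypothetical stand-in for whatever vision measurement the KF actually consumes.

```java
// Sketch only: the real subsystems have more members than shown here.
class VisionSubsystem {
    /** Horizontal offset to the target, degrees (stubbed for this sketch). */
    double getTargetOffset() {
        return 0.0;
    }
}

public class NavigationSubsystem {
    private final VisionSubsystem vision;

    /**
     * The navigation KF now needs vision measurements of the target, so a
     * VisionSubsystem is injected at construction.
     */
    public NavigationSubsystem(VisionSubsystem vision) {
        this.vision = vision;
    }

    /** Latest vision measurement, fed into the KF's update step. */
    double latestVisionMeasurement() {
        return vision.getTargetOffset();
    }
}
```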
drive now provides estimates of the voltages across its motors in order to simulate behavior. These estimates are probably not very accurate; 971 includes voltage as a state in their KFs and estimates it too, which might be worth looking into
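A simple way to produce such an estimate (a sketch under assumptions, not the actual implementation) is commanded duty cycle times measured battery voltage, which is also why the numbers are rough:

```java
public class DriveVoltageEstimator {
    /**
     * Rough estimate of the voltage applied across a motor: the commanded
     * duty cycle times the measured battery voltage. This ignores battery
     * sag under load, wiring resistance, and controller nonlinearity, which
     * is why the estimates are not very accurate.
     *
     * @param dutyCycle      commanded output in [-1, 1]
     * @param batteryVoltage measured battery voltage, volts
     */
    public static double estimateMotorVoltage(double dutyCycle, double batteryVoltage) {
        return dutyCycle * batteryVoltage;
    }
}
```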
Next step is to implement the Kalman filter in Java, and possibly run multiple KFs at once with different sensor inputs. When one KF diverges from the rest, that tells us something about the robot: for example, fusing vision + accelerometer could tell us we're being pushed sideways. Another thing to test is fusing accelerometer readings into the primary system, but that requires data that isn't currently available, so we'll have to make do with what we have. All of this should be backed by hardware sanity checks: if the encoders start giving 0, give up rather than ramming the robot into a wall.
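The divergence check and the encoder sanity check could look something like the sketch below. This is a hypothetical outline, not the planned implementation: it compares each filter's (scalar) estimate against the mean of all filters, which only works when the threshold is larger than the shift the outlier itself causes in the mean; comparing against the median would be more robust.

```java
import java.util.List;

public class FilterBank {
    /**
     * Index of the first filter whose estimate strays more than `threshold`
     * from the mean of all the filters' estimates, or -1 if they all agree.
     * Note the outlier also pulls the mean toward itself, so the threshold
     * must exceed that pull; a median comparison would avoid this.
     */
    public static int divergentFilterIndex(List<Double> estimates, double threshold) {
        double mean = estimates.stream()
                .mapToDouble(Double::doubleValue)
                .average()
                .orElse(0.0);
        for (int i = 0; i < estimates.size(); i++) {
            if (Math.abs(estimates.get(i) - mean) > threshold) {
                return i;
            }
        }
        return -1;
    }

    /**
     * Hardware sanity check: if we are commanding meaningful drive output
     * but both encoders read zero, something is broken, so stop instead of
     * driving into a wall.
     */
    public static boolean encodersLookDead(double leftTicks, double rightTicks,
                                           double commandedOutput) {
        return Math.abs(commandedOutput) > 0.1 && leftTicks == 0 && rightTicks == 0;
    }
}
```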