brianwu568 opened this issue 1 year ago
My understanding is that you only have 2 nodes (so only 2 anchors) in your setup and no other positioning sensor attached to the Crazyflie. If so, this is not enough to achieve 3D position estimation, which explains the drift.
With both TWR and TDoA, 2 anchors only let you measure 2 dimensions; you are missing one. With TWR, for example, the two range measurements only tell you that you are somewhere on the intersection of two spheres, which is a circle. There is no way to know where on that circle you are, so drift in the IMU will make the estimator wander along it.
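To make this concrete, here is a small numeric sketch (Python/NumPy, with made-up anchor and tag positions) showing that every point on that circle matches both range measurements equally well:

```python
import numpy as np

# Assumed anchor placement and "true" tag position, only for generating ranges.
a0 = np.array([0.0, 0.0, 0.0])   # anchor 0
a1 = np.array([2.5, 0.0, 0.0])   # anchor 1, 2.5 m away along x
tag = np.array([1.0, 0.8, 0.4])  # true tag position

r0 = np.linalg.norm(tag - a0)    # range measured to anchor 0
r1 = np.linalg.norm(tag - a1)    # range measured to anchor 1

# The two spheres intersect in a circle in the plane x = x_c,
# centred on the anchor axis with radius rho.
d = np.linalg.norm(a1 - a0)
x_c = (d**2 + r0**2 - r1**2) / (2 * d)
rho = np.sqrt(r0**2 - x_c**2)

# Sample a few points on that circle and check they match both ranges.
for theta in np.linspace(0, 2 * np.pi, 5, endpoint=False):
    p = np.array([x_c, rho * np.cos(theta), rho * np.sin(theta)])
    print(p, np.linalg.norm(p - a0) - r0, np.linalg.norm(p - a1) - r1)

# Both residuals print as ~0 for every sample: all of these 3D positions are
# equally consistent with the two measurements, which is the ambiguity that
# shows up as drift in the estimator.
```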
You need more data: more anchors for sure (3 is the minimum with TWR), but you could also add the Z-ranger or the Flow deck to the Crazyflie instead of more anchors. For example, we have been experimenting with 4 anchors on the ceiling plus the Z-ranger to sense the Z position more accurately.
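If it helps to quantify the drift rather than eyeballing the monitor, something along these lines (a minimal sketch using the Crazyflie Python library, cflib; the URI and logging period are placeholders to adjust for your setup) prints the estimated position while the Crazyflie sits still:

```python
import time

import cflib.crtp
from cflib.crazyflie import Crazyflie
from cflib.crazyflie.log import LogConfig
from cflib.crazyflie.syncCrazyflie import SyncCrazyflie
from cflib.crazyflie.syncLogger import SyncLogger

URI = 'radio://0/80/2M/E7E7E7E7E7'  # placeholder, use your Crazyflie's address

cflib.crtp.init_drivers()

# Log the state estimator's position at 10 Hz.
lg = LogConfig(name='Position', period_in_ms=100)
lg.add_variable('stateEstimate.x', 'float')
lg.add_variable('stateEstimate.y', 'float')
lg.add_variable('stateEstimate.z', 'float')

with SyncCrazyflie(URI, cf=Crazyflie(rw_cache='./cache')) as scf:
    with SyncLogger(scf, lg) as logger:
        t0 = time.time()
        for _, data, _ in logger:
            t = time.time() - t0
            print(f"{t:6.2f}s  x={data['stateEstimate.x']:+.2f}  "
                  f"y={data['stateEstimate.y']:+.2f}  "
                  f"z={data['stateEstimate.z']:+.2f}")
            if t > 30:  # log for ~30 s while the drone is stationary
                break
```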
Hello developer team,
I hope this message finds you well. I'm writing to report some issues encountered while attempting to fly a Crazyflie 2.1 with two Loco Positioning nodes. The only deck attached to the Crazyflie 2.1 is the Loco Positioning deck.
I flashed the two nodes with the latest firmware and ran two tests: one with the nodes configured as TWR nodes and one with them configured as TDoA2 nodes. I then launched cfclient, connected the Crazyflie, and navigated to the Loco Positioning monitor.
With TDoA2 there was a large amount of estimated-position drift, even though the two nodes were stationary, placed over 2 m apart, and the drone was sitting stationary on a table in full view of both nodes. When I switched the nodes to TWR in the same setup, the drift was smaller but still significant: the drone kept wandering around on the monitor despite being stationary. In addition, the position updates and refresh rate in the client were noticeably slower than in the LPS demo videos I have seen.
Would both of these problems be solved by switching to a setup with more nodes (e.g. 6 or 8)? I'm trying to get the drone to fly in a certain pattern around a specific location in a room. Since we are not using the camera on the AI deck for tracking at the moment, the user's position would have to be tracked with a different sensor. We are considering combining the altitude from the ToF sensor with the distance to the user measured via the LPS, but we're not sure that would let us determine the cardinal direction of the target relative to the user.
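As a rough sketch of what we have in mind (the numbers below are made up for illustration), the range plus the altitude only gives us a horizontal distance, which is why we are unsure about the direction:

```python
import math

# Made-up example values: one LPS range to the user plus the drone's altitude.
range_to_user = 3.0   # m, e.g. a range measurement to a single node near the user
drone_altitude = 1.2  # m, from the ToF / Z-ranger sensor
user_height = 0.0     # assume the node sits at floor level

dz = drone_altitude - user_height
horizontal_dist = math.sqrt(range_to_user**2 - dz**2)
print(f"horizontal distance to user: {horizontal_dist:.2f} m")

# Every bearing on the circle of this radius around the user is equally
# consistent with the two measurements, so the cardinal direction stays
# unresolved without additional anchors or another sensor.
```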
Any insights on this would be deeply appreciated!