JoeSchmeltzer opened 6 years ago
I suspect that the complexity of the calculations GPS requires, and the dynamic nature of everything that has to be measured and accounted for, mean the calculated location is not a precise point but a range of possible points. The fix is therefore returned as a circle: the center is the reported latitude and longitude, and the radius is the accuracy. Because GPS works in three dimensions, it reports both a horizontal accuracy and a vertical accuracy, and the two will most likely differ. That is my understanding of it, anyway.
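To make the "circle" interpretation concrete, here is a minimal Python sketch (the thread itself shows no code, so the function names and the use of a spherical-Earth haversine distance are my own illustration, not the library's API): a fix is "consistent" with your true position if the true point falls inside the circle whose center is the reported coordinates and whose radius is the reported accuracy.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance in meters between two lat/lon points,
    # using a mean Earth radius of 6,371 km (spherical approximation).
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def within_accuracy_circle(reported_lat, reported_lon, accuracy_m,
                           true_lat, true_lon):
    # The fix is a circle: center = reported coordinates, radius = accuracy.
    # A fix can be "honest" even when the radius is huge, as long as the
    # true position lies somewhere inside that circle.
    return haversine_m(reported_lat, reported_lon, true_lat, true_lon) <= accuracy_m
```

Note this is why a reading can be 14 m from your actual location while the accuracy field reports 12,019 m: the accuracy is the radius of the uncertainty circle the receiver is willing to guarantee, not the actual error of that particular fix.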
When you set the DesiredAccuracy, you are just telling the system "I want my location to be this accurate." The system will try to do what you want, but it's not guaranteed. When you receive a location, you can test its accuracy to see whether you can use it. Locations with better accuracy values can take longer to acquire.
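The "test its accuracy to see whether you can use it" advice can be sketched as a simple filter. This is a hypothetical illustration in Python (the real API here appears to be a .NET-style Geolocation API; the dict shape and the `usable_fixes` helper are assumptions of mine, not part of that library):

```python
def usable_fixes(readings, max_accuracy_m):
    # Keep only fixes whose reported uncertainty radius meets our threshold.
    # DesiredAccuracy is a request, not a guarantee, so the caller still has
    # to discard fixes whose reported accuracy is worse than it can tolerate.
    return [r for r in readings if r["accuracy"] <= max_accuracy_m]

# Hypothetical readings: each dict mimics one location fix and its
# reported accuracy radius in meters.
sample = [
    {"lat": 40.0, "lon": -74.0, "accuracy": 65.0},
    {"lat": 40.0, "lon": -74.0, "accuracy": 12019.0},
    {"lat": 40.0, "lon": -74.0, "accuracy": 1500.0},
]
```

With a 2,000 m threshold (the value mentioned below), the 12,019 m fix would be dropped and the other two kept.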
I would like to better understand what the Accuracy field represents. Ideally, I'd like to know when a reading is accurate and save only those values. My GPS readings are actually quite close (about 14 meters from my real location), yet the Accuracy field has consistently been over 10,000 (12,019 in my latest test). So is the Accuracy property merely saying that the returned coordinates are guaranteed to be accurate only to within give or take 12,019 meters?
And a follow-up: how does this relate to DesiredAccuracy? I have been setting my DesiredAccuracy to 2000 (because that's good enough for my purpose), but I'm still getting the Accuracy values mentioned above.
Thank you.