facebookresearch / OrienterNet

Source code for the paper "OrienterNet: Visual Localization in 2D Public Maps with Neural Matching"

Inquiry regarding the yaw range of images in the MGL dataset #20

Open wangerniuniu opened 1 year ago

wangerniuniu commented 1 year ago

While using the dataset, I noticed that the yaw range of the images is defined as -360° to 360° rather than the conventional 0° to 360°. I am puzzled by this design decision and would like to understand the reasoning behind it.

Could you kindly explain why you chose to define the yaw angle within the range of -360° to 360°? Are there any specific technical or practical requirements that led to this setting? Furthermore, what implications does this choice have for data processing and analysis compared to the traditional 0° to 360° range?

sarlinpe commented 1 year ago

The yaw predicted by OrienterNet should be in [0°, 360°]:
https://github.com/facebookresearch/OrienterNet/blob/71e89ebdfa489e8c3d28ec668d9dad7da2ca5988/maploc/models/voting.py#L244-L245

The GT data may be stored with a different range, but it doesn't matter because it is shifted to the same range when computing the evaluation error:
https://github.com/facebookresearch/OrienterNet/blob/71e89ebdfa489e8c3d28ec668d9dad7da2ca5988/maploc/models/metrics.py#L14-L17
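
For intuition, here is a minimal sketch (not the repository's code, which lives in the files linked above) of how a yaw stored in either [-360°, 360°] or [0°, 360°] can be wrapped to a common range before comparison; the function names are illustrative only.

```python
import numpy as np

def wrap_yaw(yaw_deg):
    # Map any yaw value (e.g. from [-360, 360]) into [0, 360).
    return np.asarray(yaw_deg) % 360

def yaw_error(yaw_pred_deg, yaw_gt_deg):
    # Absolute angular difference in [0, 180], independent of how
    # either angle was stored, since both are wrapped modulo 360.
    diff = (wrap_yaw(yaw_pred_deg) - wrap_yaw(yaw_gt_deg)) % 360
    return np.minimum(diff, 360 - diff)

# Example: -90° and 270° denote the same heading, so the error is 0.
print(yaw_error(-90, 270))  # 0.0
```

In other words, the range used for storage is only a convention; once both the prediction and the ground truth are wrapped modulo 360°, the evaluation is unaffected.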