Point correspondence-based display calibration and ArUco marker tracking for the HoloLens 2.
This is the accompanying GitHub repo for our paper investigating the perceptual accuracy of manual tasks using an optical see-through head-mounted display (OST-HMD) for guidance (full citation at the bottom of this page).
Folder setup for the project sample:

```
data/
... GroundTruths/
... MarkerConfig/
... SampleResults/
... ... Figures/
... ... Matlab/
... ... Traces/
... ... Truths/
... TraceTargets/
... TraceTemplate/
unity-sandbox/
... CustomArUcoBoards/
... HoloLens2-Display-Calibration/
... OpenCVRuntimeComponent/
```
We have included the ArUco board tracking configuration that we used in our study, along with the calibration marker used for per-user display calibration. There are several test cases of scanned user-trace results available in the `data/SampleResults/Traces/` folder. Additionally, we have included the relevant Matlab scripts used for processing the user-trace data and computing the relevant accuracy metrics.
To use this repo for tracking and display calibration, you first need to print off the required materials.

1. From the `data/TraceTemplate/` folder, open the `trace_template.pdf` document (or, from `data/MarkerConfig/`, the `board_config.pdf` document) and print it at 100% scale (actual document size). Attach/tape down the trace template onto a flat object to hold it in place during tracking.
2. From the `data/MarkerConfig/` folder, open the `calibration_marker_config.pdf` document and print it at 100% scale. Create a calibration marker as shown below using cardboard and a handle.

You now have the materials prepared for calibration and tracking.
`CustomArUcoBoards` is a sandbox environment set up to test the performance of custom ArUco board object tracking. To build and run it:

1. Open the `CustomArUcoBoards` solution in Visual Studio
2. Add the `OpenCV.Windows.3411.0.0.nupkg` NuGet package to the `CustomArUcoBoards` project:

   ```
   Install-Package ..\OpenCV.Windows.3411.0.0.nupkg -ProjectName CustomArUcoBoards
   ```

3. Run the project (`x64` or `x86`); a window will appear showing the created ArUco markers (which will be saved as images). Press escape to close each window.
4. See `data/MarkerConfig/board_config.pdf` for configuring the ArUco tracking board object.

Camera intrinsics are taken from the `VideoMediaFrame.CameraIntrinsics.UndistortedProjectionTransform`
utility, though you are welcome to try a standard camera calibration procedure to see if your accuracy improves. There are more details of how to do this in another repo of mine.
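To make the intrinsics note above concrete, here is a small numpy sketch of the pinhole model that such a projection transform encodes. The focal lengths and principal point below are made-up illustrative values, not numbers read from a device:

```python
import numpy as np

def make_intrinsics(fx, fy, cx, cy):
    """Build an OpenCV-style 3x3 pinhole camera matrix K."""
    return np.array([[fx, 0.0, cx],
                     [0.0, fy, cy],
                     [0.0, 0.0, 1.0]])

def project(K, point_cam):
    """Project a 3D point in camera coordinates to 2D pixel coordinates."""
    p = K @ np.asarray(point_cam, dtype=float)
    return p[:2] / p[2]

# Illustrative values only, sized for a 896x504 frame
K = make_intrinsics(fx=680.0, fy=680.0, cx=448.0, cy=252.0)
# A point on the optical axis projects to the principal point
print(project(K, [0.0, 0.0, 1.0]))
```

Better per-device estimates of these parameters (e.g. from a chessboard calibration) are exactly what the "standard camera calibration procedure" above would provide.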
`OpenCVRuntimeComponent` takes video frames from the C# Unity project and performs marker tracking (using the ArUco library) and rigid transform estimation for display calibration. To build it:

1. Open the `OpenCVRuntimeComponent` solution in Visual Studio
2. Add the `OpenCV.HoloLens.3411.0.0.nupkg` NuGet package to the `OpenCVRuntimeComponent` project:

   ```
   Install-Package ..\OpenCV.HoloLens.3411.0.0.nupkg -ProjectName OpenCVRuntimeComponent
   ```

3. Building the solution generates the `.winmd` and `.dll` files used by the HoloLens 2 app, placed in the `OpenCVRuntimeComponent/ARM64/(Release/Debug)/OpenCVRuntimeComponent/` folder.
4. Once `OpenCVRuntimeComponent` has been built from source, copy the `.winmd`, `.dll`, and `.lib` files from `OpenCVRuntimeComponent/ARM64/(Release/Debug)/OpenCVRuntimeComponent/` to the Unity plugins directory, `unity-sandbox/HoloLens2-Display-Calibration/Assets/Plugins/ARM64/`.
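The rigid transform estimation step runs inside the runtime component itself; purely as an illustration of the underlying math (not the component's actual code), here is a minimal numpy sketch of the standard SVD-based (Kabsch) least-squares solution:

```python
import numpy as np

def estimate_rigid_transform(src, dst):
    """Least-squares rigid transform: find R, t with dst ~ R @ src + t
    (Kabsch/SVD method). src and dst are (N, 3) arrays of paired points."""
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_d - R @ mu_s
    return R, t
```

Given corresponding 3D points in two coordinate frames (e.g. tracked marker corners and their model positions), this recovers the rotation and translation relating them.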
Open `HoloLens2-Display-Calibration` in Unity. There are several different scenes available for selection in the `Assets/Scenes/` folder; as detailed in our paper, these determine which tracing paradigm (adjacent, direct, or calibrated) will be used for the experiment.
- `hmd adjacent`: No calibration procedure is required. When looking at the printed trace template, you will see a virtual model augmented adjacent to the tracking target.
- `hmd calib`: An additional calibration procedure is required, with 10 point correspondences collected per eye. For the point collection process, align the tracked marker corners with the virtual on-screen reticle and press the spacebar on a Bluetooth or USB keyboard connected to the HoloLens 2 (or use the double-tap gesture) to collect that point correspondence. Virtual marker positions change slightly across calibration point correspondences. Begin calibration with your right eye (left eye closed), then calibrate the left eye (right eye closed). After completing calibration, set down the calibration object and look at the printed trace template; you should now see a virtual model augmented directly on the tracking target.
- `hmd`: No additional calibration is required. When looking at the printed trace template, you will see a virtual model augmented directly on the tracking target.
There are a number of settings which can be adjusted on the `ScriptHolder` gameobject in Unity, including:

- the camera frame resolution (`896x504` by default)
- the ArUco dictionary (`6x6_250` by default)
- the tracking type before calibration (`markers` by default) and after calibration (`custom board` by default; `markers` can also be selected)
- the camera intrinsics (`per frame` provided intrinsics are the default)

After selecting the desired tracking scene, we can now build the scene to the device.
1. In the Unity build settings, switch the platform to `Universal Windows Platform`, select `HoloLens` as the target device, and `ARM64` as the target platform.
2. Build and deploy the `appx` file to the HoloLens 2.

Evaluation of the user trace performance relative to ground truth points and contours was performed in Matlab.
1. From the `data/SampleResults/Matlab` folder, open the `Main.m` script.
2. Change the target trace image (`data/Traces/calib_gt_02.jpg`) to one of the included test samples (`Traces/calib_gt_02.jpg` uses `Truths/GT_02.jpg` as its ground truth template).
If you found this code repo useful, please consider citing the associated publication:
```
@article{jimaging8020033,
  author={Doughty, Mitchell and Ghugre, Nilesh R.},
  title={Head-Mounted Display-Based Augmented Reality for Image-Guided Media Delivery to the Heart: A Preliminary Investigation of Perceptual Accuracy},
  journal={Journal of Imaging},
  volume={8},
  year={2022},
  number={2},
  article-number={33},
  url={https://www.mdpi.com/2313-433X/8/2/33},
  issn={2313-433X},
  doi={10.3390/jimaging8020033}
}
```