
Tobii Eye Tracking for the HTC VIVE Pro Eye in Unity

ORCL_Sim - A System Architecture for Studying Bicyclist and Pedestrian Physiological Behavior Through Immersive Virtual Environments

Introduction

This repository contains the code for Tobii eye tracking integrated with the HTC VIVE Pro Eye in Unity. It is part of the projects of the Omni-Reality and Cognition Lab at the University of Virginia (https://engineering.virginia.edu/omni-reality-and-cognition-lab). More details and visualizations of our projects can be found at http://uvabrainlab.com/portfolio/mobility-and-infrastructure-design/


IMPORTANT UPDATE

VR support was deprecated in Tobii Pro SDK version 1.9 (https://developer.tobiipro.com/unity/unity-getting-started.html).

To make everything work, you can download our test scene with a compatible Tobii Pro SDK version already integrated here. If you choose to use our test scene, please skip the Tobii Pro SDK setup steps (the other steps are still necessary).

Prerequisites

  1. HTC VIVE Pro Eye with the Tobii eye tracking system
  2. Unity version 2018.4.16 or 2018.3.14
  3. Python 3.6.3 (Anaconda distribution recommended)
  4. SteamVR
  5. Complete the setup for the HTC VIVE Pro Eye
  6. Tobii Pro SDK for your platform
  7. Set up the eye tracking software (SR runtime) if needed

The HTC VIVE Pro Eye hardware (headset, controllers) comes from HTC VIVE, while the integrated eye tracker comes from Tobii; together they provide multiple ways to access the eye tracking data.

This repository includes sample code and tutorials for the Python and Unity APIs of the Tobii Pro SDK only.

Tobii Pro SDK data collection

Website of Tobii Pro SDK: http://developer.tobiipro.com/index.html

You can use either the Python API or the Unity API to get the eye tracking data.

Python API

Set up the Python API as described at http://developer.tobiipro.com/python/python-getting-started.html.

Then run TobiiEyeTracking.py from the repository to collect the data externally (not within Unity).

If an eye tracker is successfully found, data collection continues until the 'q' key is pressed (you can change this to another key in the code). An output .csv data file (named with the start and end time, like sample_output) will be exported to the output_dir defined in the code:

output_dir = 'C:/github/ORCL_VR_EyeTracking/Data/EyeTrakcing/TobiiProPython'
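
Below is a minimal sketch of the core of TobiiEyeTracking.py, assuming the official tobii_research package (pip install tobii_research) and a connected tracker; the callback body and the fixed 10-second duration are illustrative simplifications, not the repository's exact code (the script itself records until 'q' is pressed):

    import csv
    import time
    import tobii_research as tr

    output_dir = 'C:/github/ORCL_VR_EyeTracking/Data/EyeTrakcing/TobiiProPython'
    rows = []

    def gaze_callback(gaze_data):
        # With as_dictionary=True, each sample arrives as a plain dict
        rows.append((gaze_data['system_time_stamp'],
                     gaze_data['left_gaze_point_on_display_area'],
                     gaze_data['right_gaze_point_on_display_area']))

    trackers = tr.find_all_eyetrackers()
    if trackers:
        tracker = trackers[0]
        tracker.subscribe_to(tr.EYETRACKER_GAZE_DATA, gaze_callback, as_dictionary=True)
        time.sleep(10)  # illustrative fixed duration instead of waiting for 'q'
        tracker.unsubscribe_from(tr.EYETRACKER_GAZE_DATA, gaze_callback)
        with open(output_dir + '/sample_output.csv', 'w', newline='') as f:
            csv.writer(f).writerows(rows)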

Unity SDK (PREFERRED METHOD)

To start, read the documentation from the Tobii Pro SDK (http://developer.tobiipro.com/unity.html) and download the Tobii Pro SDK for Unity.

  1. Create a new project, or open an existing project, in Unity.

  2. Select Assets > Import Package > Custom Package... from the main menu, or right-click in the Project window.

  3. Browse to the downloaded Tobii Pro SDK package, named TobiiPro.SDK.Unity.Windows.

  4. In the next dialog, select to import all files.

  5. In the Project window, drag and drop the "TobiiPro\VR\Prefabs\[VREyeTracker]" prefab into the scene, and in the Inspector, select 'Subscribe To Gaze'.

  6. (Optional) Drag and drop the "TobiiPro\VR\Prefabs\[VRCalibration]" prefab into the scene. Select the [VRCalibration] prefab and, in the Inspector, select a key to be used to start a calibration.

  7. Drag and drop the "TobiiPro\VR\Prefabs\[VRSaveData]" prefab into the scene. Select the [VRSaveData] prefab and, in the Inspector, select a key to be used to start and stop saving data, then select 'Save Data/Save Unity Data/Save Raw Data'.

  8. Save the current project.

  9. Play the scene. The saved XML data can be found in the "Data" folder in the project root. Press the save-data key selected earlier to stop and save the data.

    More details can be found in the TobiiProVR_readme.txt in this repository.

If an XML file was created without any recorded data, open the Windows Task Manager, go to 'Services', and check whether 'Tobii Service' is running; try restarting it and collect data again.
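
The same check can be scripted if you prefer. A minimal sketch using the built-in Windows 'sc' tool, assuming the service is registered under the name 'Tobii Service' (verify the exact name in Task Manager) and that the script runs with administrator privileges:

    import subprocess

    SERVICE = 'Tobii Service'  # assumed name; verify in Task Manager > Services

    # Query the current state of the service
    state = subprocess.run(['sc', 'query', SERVICE], capture_output=True, text=True)
    if 'RUNNING' not in state.stdout:
        # Restart the service ('stop' fails harmlessly if it is already stopped)
        subprocess.run(['sc', 'stop', SERVICE])
        subprocess.run(['sc', 'start', SERVICE])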

2021.05.24 update: The correct experimental order is:

  1. Connect the VIVE Pro Eye to your computer (in our case over a wireless connection), and open VIVE Wireless and SteamVR;
  2. Run room setup and make sure the controllers and headset are in the right place;
  3. Run the eye calibration on the HTC VIVE Pro Eye;
  4. Open the Unity scenarios;
  5. Start or restart the SR runtime software right before playing the scene (wait until the small robot icon turns orange, as shown in the figure below); this ensures the data collection is working in case you get empty XML data;

(Figure: SR runtime status check)

  6. Play the scene in Unity;
  7. Stop the scene and check that all the data were collected.

Video Recording

There are two ways to record video: external screen recording or the internal Unity Recorder.

Internal Unity Recorder

Different versions of Unity require different steps.

For Unity 2018, search for "Unity Recorder" in the Unity Asset Store, then download and import it. This is a free package for recording gameplay.


In Unity 2019 or newer, the Unity Recorder is available in the Package Manager. Go to 'Window' - 'Package Manager', click 'Advanced' - 'Show preview packages', find 'Unity Recorder', and install it.


After importing or installing the Unity Recorder, select Window > General > Recorder > Recorder Window from the main menu.

After configuring the Recorder, press 'START RECORDING', or press 'F10' on the keyboard for a quick start. A frame rate of 24 is suggested.

Since we have already set up VR eye tracking data saving, the data collection process will start at the same time. The saved MP4 file can be found in the "Recordings" folder in the project root.

The advantage of this method is that the Unity Recorder captures exactly every frame in the scene. However, because the frame rate of Unity during gameplay is not fixed, while the output video has a pre-defined fixed frame rate, it is difficult to recover the experiment timestamps. For example, if the Unity Recorder frame rate is set to 30 Hz and the actual game frame rate is ~15 Hz, the output video will be half of the actual length. As of 2021.1, there has been no valid solution for this issue.
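
To make the mismatch concrete, here is a small numeric sketch; frame_times is a hypothetical list of per-frame wall-clock timestamps (in seconds) that you would need to log yourself during the session:

    # Illustration of the Unity Recorder frame-rate mismatch
    def recorder_distortion(frame_times, recorder_fps=30.0):
        actual_duration = frame_times[-1] - frame_times[0]
        # One rendered frame becomes one video frame, played back at recorder_fps
        video_duration = len(frame_times) / recorder_fps
        return actual_duration, video_duration

    # Example from the text: ~15 Hz gameplay recorded at a 30 Hz video rate
    frame_times = [i / 15.0 for i in range(600)]  # 600 frames over ~40 s
    print(recorder_distortion(frame_times))  # ~(39.9, 20.0): the video is half length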

External screen recording

Many programs can be used for screen recording; we use OBS Studio in our study. In SteamVR, select 'Display VR View', then drag and maximize the VR view window onto an idle display. Open OBS Studio and add that display as a new source, as indicated in the figure below. More settings (canvas size, frequency, file names) can be found under 'Settings'.

(Figure: OBS Studio display-capture setup)

The advantage of this method is that it can integrate different video collection systems (e.g., room cameras) with the same timestamp and frequency, as shown in our lab setup below.

(Figure: synchronized video collection in our lab)

A sample video on YouTube about our experiment: Pedestrian crossing using a smartphone app

Sample scene

So far, we have set up everything for data collection. For your convenience, I have also uploaded a sample Unity scene covering the whole process; the Google Drive link to it is here.

Process Eye Tracking Data

Suppose we have the XML data collected in the "Data" folder in the project root, as in the '\Data\EyeTrakcing\TobiiProUnity' folder in the repository, and the videos collected in the "Recordings" folder in the project root, as in the '\Data\Video\1.Raw Videos' folder in the repository. The goal of this part is to map the gaze data onto the videos.

The three Python scripts under the 'EyeTrackingProcess' folder provide a workflow for processing the eye tracking data.
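
To give a flavor of the core alignment step (assigning each gaze sample to a video frame by timestamp), here is a minimal sketch. The inputs (gaze_times, video_start, video_fps) are hypothetical names, not the actual interfaces of those scripts, which also handle the XML parsing and the gaze overlay itself:

    def map_gaze_to_frames(gaze_times, video_start, video_fps):
        """Assign each gaze timestamp (in seconds) to a video frame index."""
        mapping = []
        for t in gaze_times:
            frame = int((t - video_start) * video_fps)
            mapping.append((t, max(frame, 0)))  # clamp samples before video start
        return mapping

    # Example: gaze sampled at 120 Hz against a 24 fps recording
    gaze_times = [10.0 + i / 120.0 for i in range(5)]
    for t, frame in map_gaze_to_frames(gaze_times, video_start=10.0, video_fps=24.0):
        print(f'gaze at {t:.4f} s -> video frame {frame}')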

Citation

If you would like to explore more details, or if you find this repo useful, please cite our work: https://www.hindawi.com/journals/jat/2022/2750369/, https://arxiv.org/abs/2202.13468, and https://ascelibrary.org/doi/abs/10.1061/9780784483893.161.

  1. Guo, X., Angulo, A., Robartes, E., Chen, T. D., & Heydarian, A. (2022). ORCLSim: A system architecture for studying bicyclist and pedestrian physiological behavior through immersive virtual environments. Journal of Advanced Transportation, 2022, 2750369. https://doi.org/10.1155/2022/2750369
  2. Guo, X., Robartes, E., Angulo, A., Chen, T. D., & Heydarian, A. (2021). Benchmarking the use of immersive virtual bike simulators for understanding cyclist behaviors. In Computing in Civil Engineering 2021 (pp. 1319-1326).
  3. Guo, X., Tavakoli, A., Robartes, E., Angulo, A., Chen, T. D., & Heydarian, A. (2022). Roadway design matters: Variation in bicyclists' psycho-physiological responses in different urban roadway designs. arXiv preprint arXiv:2202.13468.

@Article{Guo2022,
  author={Guo, Xiang and Angulo, Austin and Robartes, Erin and Chen, T. Donna and Heydarian, Arsalan},
  title={ORCLSim: A System Architecture for Studying Bicyclist and Pedestrian Physiological Behavior through Immersive Virtual Environments},
  journal={Journal of Advanced Transportation},
  year={2022},
  month={Aug},
  day={04},
  publisher={Hindawi},
  volume={2022},
  pages={2750369},
  issn={0197-6729},
  doi={10.1155/2022/2750369},
  url={https://doi.org/10.1155/2022/2750369}
}

@incollection{guo2021benchmarking,
  title={Benchmarking the Use of Immersive Virtual Bike Simulators for Understanding Cyclist Behaviors},
  author={Guo, Xiang and Robartes, Erin and Angulo, Austin and Chen, T Donna and Heydarian, Arsalan},
  booktitle={Computing in Civil Engineering 2021},
  pages={1319--1326},
  year={2021}
}

@article{guo2022roadway,
  title={Roadway Design Matters: Variation in Bicyclists' Psycho-Physiological Responses in Different Urban Roadway Designs},
  author={Guo, Xiang and Tavakoli, Arash and Robartes, Erin and Angulo, Austin and Chen, T Donna and Heydarian, Arsalan},
  journal={arXiv preprint arXiv:2202.13468},
  year={2022}
}