czero69 / acomoeye-NN

Acomoeye-NN: NVGaze gaze estimation with upgrades
Apache License 2.0

How to label the ground truth of the collected image? #4

Open ZYX-MLer opened 1 year ago

ZYX-MLer commented 1 year ago

I'd like to build a similar dataset, but how do you label the ground truth of the collected images?

best wishes

czero69 commented 1 year ago

Hey, here is, in short, how I did it:

  1. I built my own eye-tracking environment into VR goggles (today you can purchase VR goggles with built-in eye tracking, to mention a few but not all: Meta Quest Pro, HTC Vive with eye tracking, Varjo).

[Setup photos: into_dk2, into_dk2_2, my_setup, data_coll_app]

  2. I implemented an application in UE4 that renders points, and the user is asked to look at these points (after a few blinks). Having the center point of the eyeball (taken directly from the UE layer that manages the VR goggles) and the rendered dots, I created GT gaze vectors. The heart of it is this piece of code (how to take the middle of the eyes and construct gaze vectors from the rendered dots); a sketch of the inverse step follows after the snippet:
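/* Note (assumption): camRotation, cameraLocationLeft/Right, cameraRotationLeft/Right,
   m_cameraPlayerRotationVector, m_leftEyePos, m_rightEyePos and m_unitCameraFaceVec
   are member variables of the actor/component class; getGazeX()/getGazeY() return
   the target gaze angles in radians. */
/* query the current HMD pose (orientation + position) in tracking space */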
FVector Loc;
FQuat quat = FQuat::Identity;
GEngine->XRSystem->GetCurrentPose(IXRTrackingSystem::HMDDeviceId, quat, Loc);
/* get player controller and location */ 
auto Controller = UGameplayStatics::GetPlayerController(GetWorld(), 0);
APlayerController* PlayerController = Cast<APlayerController>(Controller);
auto camLocation = PlayerController->PlayerCameraManager->GetCameraLocation();
camRotation = PlayerController->PlayerCameraManager->GetCameraRotation();
cameraLocationLeft = cameraLocationRight = Loc;
cameraRotationLeft = cameraRotationRight = camRotation;
GEngine->XRSystem->GetXRCamera()->CalculateStereoCameraOffset(
    eSSP_LEFT_EYE, cameraRotationLeft, cameraLocationLeft);
GEngine->XRSystem->GetXRCamera()->CalculateStereoCameraOffset(
    eSSP_RIGHT_EYE, cameraRotationRight, cameraLocationRight);
m_cameraPlayerRotationVector = camRotation.Vector();
/* ... elsewhere in the code, set the eye positions. */
m_leftEyePos = cameraLocationLeft - Loc + camLocation;
m_rightEyePos = cameraLocationRight - Loc + camLocation;
/* FRotator takes (Pitch, Yaw, Roll), i.e. rotations around the (Y, Z, X) axes in UE coordinates */
FRotator myRotatorX(FMath::RadiansToDegrees(getGazeY()), FMath::RadiansToDegrees(getGazeX()), 0.0f);
/* unit vector rotated according to the chosen x, y gaze angles */
FVector rotatedUnitVec = myRotatorX.RotateVector(m_unitCameraFaceVec);
/* rotate according to the current camRotation (HMD orientation) */
rotatedUnitVec = camRotation.RotateVector(rotatedUnitVec);
/* visualize the gaze ray: draw a debug point 200 units from the right eye along the gaze vector */
DrawDebugPoint(GetWorld(), m_rightEyePos + 200 * rotatedUnitVec, 3, FColor(250, 0, 0));
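
For reference, here is a minimal sketch (not the exact project code) of the inverse step: given the world position of a rendered dot plus the eye position and HMD rotation from above, recover the ground-truth gaze angles that get paired with the captured eye image. `DotWorldPos` is a hypothetical input variable holding the dot's world-space location.

/* assumption: DotWorldPos is the world-space position of the currently rendered dot */
FVector GazeDir = (DotWorldPos - m_rightEyePos).GetSafeNormal();   // world-space gaze direction towards the dot
FVector GazeLocal = camRotation.UnrotateVector(GazeDir);           // express the direction in the HMD/camera frame
/* UE is X-forward, Y-right, Z-up: yaw (horizontal) is rotation around Z, pitch (vertical) around Y */
float GazeX = FMath::Atan2(GazeLocal.Y, GazeLocal.X);              // horizontal gaze angle in radians
float GazeY = FMath::Atan2(GazeLocal.Z,
    FMath::Sqrt(GazeLocal.X * GazeLocal.X + GazeLocal.Y * GazeLocal.Y)); // vertical gaze angle in radians
/* (GazeX, GazeY) would then be stored as the GT label for the eye image captured at this frame */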