Closed: rtrimana closed this issue 1 year ago.
The structures you mentioned are used for Rust-C++ interop, hence the FFI prefix (foreign function interface). The structures used for intercommunication with the device are stored in the crate `alvr_sockets`, in `alvr/sockets/src/packets.rs`. Furthermore, a switch to OpenXR is imminent, and some of these structures will be reformatted.
Thank you so much for your response @zarik5. I understand that they are used for Rust-C++ interop. I realized that when I looked at `cpp_main.cpp` in the Android app code on the ALVR client side, but in ALVR Release 19, I think the structures were named something like `AlvrDeviceMotion` instead of `FfiDeviceMotion`. And the struct is used to store and send the tracking data (from the headset and hand controllers), as performed in this line by `alvr_send_tracking`, which is a Rust function.

Do you have any insights on the meaning of the struct members, though? I am interested in understanding what they mean. For instance, after searching for a bit, I was guessing that `AlvrQuat`, or `FfiQuat` as it is currently called, is basically a quaternion used to represent rotation/orientation data in Unity. I wanted to double-check if my understanding is correct.
Ok, so:
```cpp
struct FfiDeviceMotion {
    unsigned long long deviceID;
    FfiQuat orientation;
    float position[3];
    float linearVelocity[3];
    float angularVelocity[3];
};
```
`deviceID` is the universal identifier for the device. These are defined as constants on the Rust side, calculated by hashing their OpenXR-style paths. These paths can be: `/user/head`, `/user/hand/left`, `/user/hand/right`. `orientation` and `position` should be self-explanatory: they are the orientation and position in the headset frame of reference (usually the origin is on the floor). `linearVelocity` and `angularVelocity` are also self-explanatory.

```cpp
struct FfiHandSkeleton {
    bool enabled;
    FfiQuat boneRotations[19];
};
```
This structure holds joint rotations for hand tracking. In the OpenXR branch this was replaced by `jointRotations[26]` and `jointPositions[26]`. Originally the joints were in the Oculus convention; after the OpenXR switch they will be in the SteamVR convention (which changes slightly).
```cpp
struct FfiFov {
    float left;
    float right;
    float up;
    float down;
};
```
This is the field of view relative to one view (one eye). `left`, `right`, `up` and `down` are relative to the center of the screen, i.e. where you look when facing straight ahead, and are measured in radians (`left` and `down` are always negative). This is a concept new to VR: classic games are concerned only with vertical and horizontal FoV.
Thanks so much @zarik5 ! This is really helpful.
@zarik5 One thing I forgot to ask: it looks like, from the README, that ALVR supports the Meta Quest Pro (the new headset from Meta). In that case, does it also support the new eye tracking and facial expression tracking features?
Not yet. There is a fork called ALXR that supports them: https://github.com/korejan/ALVR
@zarik5 Thanks. I was also wondering if the ALVR team has any immediate plans to integrate the eye/facial tracking features of the Quest Pro into the official version of ALVR as well? And if so, when would that be?
I have plans for eye and facial tracking, but no ETA. The main issue I see for wide adoption is that SteamVR does not have a driver API for submitting eye and face tracking information to the game. Some games work around this by implementing their own custom API, like VRChat, which uses OSC.
@zarik5 Thanks so much for providing all this information. I was wondering about the current Android app code in this line. Is it intentional that we have a `Vec3` with all-zero values for both `linear_velocity` and `angular_velocity`? I was wondering about that since ALVR `v20.0.0` seems to work well even without tracking the velocity information. Or is this something that is currently hardcoded and will be implemented later as part of the `context`/`ctx` object?
This is correct. OpenXR does not provide velocity information for the head, but we also don't need it. Velocities are needed only for implementing reprojection at the SteamVR level, but the ALVR driver does not implement that; instead we use the client runtime for reprojection.
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
Hi ALVR Team,
I was wondering if someone on the ALVR team could provide any insights into the data structures used for sending the tracking information from the device. For instance:

- In the `FfiDeviceMotion` struct, why do we use an array of 3 elements to store `position`, `linearVelocity`, and `angularVelocity`?
- What is `FfiHandSkeleton`? How do we use the 19 values in the `boneRotations` array? What do they represent exactly?
- What do `left`, `right`, `up`, and `down` represent in `FfiFov`?

Thanks so much in advance.