weibao1001 closed this issue 1 year ago
Hello, for the initial proof-of-concept tests I had my C "web-server" simply stream the visualization JPEG to the headset as a "website". Video of this here
This was based on my DIY Linux remote-desktop web-server. Video of streaming my Linux desktop to the Quest
This was a pretty easy implementation, but needless to say, rendering the image on the PC, encoding it, and transmitting it (as seen in the first video) introduced delay. The second version used WebXR, only emitting the pose and handling it on the headset side in JavaScript. Despite being moderately easy to write for, WebXR unfortunately requires HTTPS for non-localhost sites, so I had to abandon it.
The final step was to modify the VrHands application from the Oculus SDK (by the way, despite the Oculus SDK being aimed at Windows, I managed to compile it on Linux; my modified version :P ).
So I have my PC producing BVH frames, connected to an Ethernet/Wi-Fi switch; the frames are sent through a TCP/IP socket to the Quest 2 over its Wi-Fi connection. This is a viable solution delay-wise, but it is not yet very useful for a real application.
I wanted to make some "driver" for an actual application, such as the VRChat API, to have a real use-case. However, since my research institute has a different focus from VR at the moment, and large VR companies like Meta are going through a tough phase, I unfortunately do not have a good out-of-the-box demo/experience for the Quest 2. I do want to eventually make something like this at some point :)
Hello, your work is great. I saw you use it in Oculus Quest VR applications in the paper and demo video. I would like to know how you calibrate the skeleton and the VR headset together.