Vesper0704 opened 3 years ago
Hello, to deploy it you need to have your Unity VR scene ready, or you can just download the sample scene from the documentation. If you choose the Unity SDK (the preferred way), then after setting everything up all you need to do is run the scene in Unity by hitting the Play button as you normally would; the saved XML data can be found in the "Data" folder in the project root directory. If you are using the Python API, you need to run the Python program outside Unity first, then run the Unity scene.
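As a rough illustration of consuming that saved XML afterwards, here is a minimal sketch using only Python's standard library. The element and attribute names (`GazeData`, `Sample`, `timestamp`, `x`, `y`) are invented for illustration; check a real file in the "Data" folder for the actual schema:

```python
# Hypothetical sketch: load an XML gaze log like the one the Unity SDK saves.
# The schema below is made up for illustration; inspect your own file first.
import xml.etree.ElementTree as ET

sample = """<GazeData>
  <Sample timestamp="1001" x="0.50" y="0.40"/>
  <Sample timestamp="1002" x="0.52" y="0.41"/>
</GazeData>"""

root = ET.fromstring(sample)
# Collect (x, y) gaze points from every <Sample> element.
points = [(float(s.get("x")), float(s.get("y"))) for s in root.findall("Sample")]
print(len(points))  # 2
```

In a real script you would call `ET.parse(path)` on a file from the "Data" folder instead of parsing an inline string.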
Thanks a lot for your answer! But I am not quite clear about the Unity part. I have connected and set up the Vive Pro Eye device and can already watch 360 videos on PC via the HMD. All I want to do is collect gaze data, so what is the point of running the Unity scene instead of only the Python script? Sorry, I am really a rookie in this field, so I am desperately looking forward to your reply! (P.S. The .csv data in Data/EyeTrakcing/TobiiProPython is confusing to me: why are most values NaN rather than meaningful statistics?) Many thanks for your patience and time!
There are two APIs: Python and Unity. The reason I am using Unity is that our VR scene is built in Unity; using the Unity SDK avoids additional data-collection steps and gives the same timestamps as our other data-collection scripts in Unity (such as the bike sensor).
In your case, if I understand correctly, you don't have to use Unity at all, so just use the Python API. Here are a few solutions for you to try:
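On the NaN question raised earlier: rows of NaN in an eye-tracking CSV usually just mean the tracker temporarily lost the eyes (blinks, looking away, poor calibration or fit), not that the recording failed. Here is a minimal sketch for keeping only valid samples; the column names `gaze_x`/`gaze_y` are hypothetical, so substitute the headers from your actual file:

```python
# Sketch: drop rows whose gaze columns are NaN or unparsable.
# Column names here are assumptions; check your CSV header.
import csv
import io
import math

def drop_invalid(rows, cols):
    """Keep only rows where every named gaze column is a finite number.
    NaN rows typically mean the tracker lost the eyes for that sample."""
    kept = []
    for row in rows:
        try:
            vals = [float(row[c]) for c in cols]
        except (KeyError, ValueError, TypeError):
            continue
        if all(math.isfinite(v) for v in vals):
            kept.append(row)
    return kept

# Usage with a small in-memory example standing in for a real file:
sample = io.StringIO(
    "timestamp,gaze_x,gaze_y\n"
    "1,0.50,0.40\n"
    "2,nan,nan\n"
    "3,0.52,0.41\n"
)
valid = drop_invalid(csv.DictReader(sample), ["gaze_x", "gaze_y"])
print(len(valid))  # 2
```

If nearly every row is NaN, that points to a calibration or headset-fit problem rather than a parsing one.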
Thanks for your sharing! I am using an HTC Vive Pro Eye and want to retrieve data in real time while watching VR videos (connected to Windows, on the SteamVR platform). Would you mind telling me how to run the program: on Windows, or integrated into the HMD? And what if I choose Python as the only language?
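For the Python-only route, a minimal sketch of live gaze collection with the Tobii Pro SDK Python bindings (`tobii_research`, installed via `pip install tobii-research`) could look like the following. The `gaze_to_row`/`record` helpers and the CSV layout are my own assumptions, it needs a connected tracker to actually run, and for the Vive Pro Eye specifically you may need HTC's SRanipal SDK instead, so treat this only as a starting point:

```python
# Assumed sketch, not the project's actual script: stream gaze data with
# the Tobii Pro SDK Python API and dump it to CSV on Windows.
import csv
import time

def gaze_to_row(gaze):
    """Flatten one gaze-data dict (delivered with as_dictionary=True) into
    a list: system timestamp plus left/right normalized gaze points."""
    lx, ly = gaze["left_gaze_point_on_display_area"]
    rx, ry = gaze["right_gaze_point_on_display_area"]
    return [gaze["system_time_stamp"], lx, ly, rx, ry]

def record(path, duration_s=10.0):
    """Subscribe to the gaze stream for duration_s seconds, then write a CSV.
    Requires the Tobii Pro SDK and a connected eye tracker."""
    import tobii_research as tr
    trackers = tr.find_all_eyetrackers()
    if not trackers:
        raise RuntimeError("no eye tracker found; is the device connected?")
    tracker = trackers[0]
    rows = []
    tracker.subscribe_to(tr.EYETRACKER_GAZE_DATA,
                         lambda g: rows.append(gaze_to_row(g)),
                         as_dictionary=True)
    time.sleep(duration_s)
    tracker.unsubscribe_from(tr.EYETRACKER_GAZE_DATA)
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["system_time_stamp", "lx", "ly", "rx", "ry"])
        writer.writerows(rows)
```

This runs as a normal Windows script, not inside the HMD; the headset only needs to be connected so the SDK can discover the tracker.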