HipsterSloth / PSVRTracker

A sample app demonstrating position and orientation tracking for the PSVR headset
MIT License

Is this supposed to be merged back into psmoveservice? #4

Closed opendata26 closed 5 years ago

opendata26 commented 5 years ago

Just came across this project while looking into psmoveservice with PSVR. While looking through the commits I saw one that said you will be adding the PSMove controller. Is this just for the config tool? Or will you be adding PSMove controller support and then merging back?

Also, as a side note, how hard would it be to get the psmove steamvr bridge working with this? EDIT: oh wow, this looks really easy to port psmovefreepiebridge to; the API is basically the same. Will have a go tomorrow. Kinda surprised no one has done this before :)

HipsterSloth commented 5 years ago

I wouldn't port this over to the psmove steamvr bridge just yet. The controller support is still under development, as is the multi-camera tracking. As I get relevant features completed I will port them back over to psmoveservice. For example, the new multithreaded filter work I started here was ported over to psmoveservice for the ZCM2 controller release.

The long term goal for this library is to integrate it into https://github.com/alatnet/OpenPSVR as a SteamVR driver. Before I do that, though, I need to finish multi-camera tracking.

opendata26 commented 5 years ago

Ah, thanks for the info. I managed to get my hacked-up FreePIE bridge working okay. Can you give any pointers on where to look for multiple camera tracking?

HipsterSloth commented 5 years ago

I assume you mean "multiple camera tracking" for the PSVR headset? The best bet for the moment would be to use Trinus + FreePIE + PSMoveService. See the tutorial here:

https://github.com/cboulay/PSMoveService/wiki/PSVR-Positional-Tracking-setup

You'll need to attach a glowing ping pong ball to the PSVR headset as outlined here:

https://github.com/HipsterSloth/PSMoveSteamVRBridge/wiki/Virtual-Controller-Setup

opendata26 commented 5 years ago

Yeah, I know. I mean code-wise, how would I go about adding support?

HipsterSloth commented 5 years ago

Do you mean adding code support for multiple camera tracking in PSVRService or something else?

If you mean what code you would have to modify in PSVRService, it's basically just finishing getting the calibration mat utility ported over from psmoveservice. It's mostly there, but there's lots of debugging left too. I'm in the middle of tracking down an issue with the new PS3 Eye code not setting the gain and exposure correctly.

opendata26 commented 5 years ago

Adding code, ah, I will wait for you then. Also, what is the correct way to get yaw, roll, and pitch? The calibrated sensor data seems to drift easily, and the orientation call (once converted to Euler) gives me a constant velocity that seems to decrease over time and increases after every movement. This may be a math error on my part though :)

Edit: for reference, I have my modified psvrfreepie bridge getting the orientation, turning it into yaw/roll/pitch, then sending it to FreePIE; Trinus is set up to get rotation input from FreePIE.

opendata26 commented 5 years ago

Oh, just noticed this can't work at the same time as psvrservice. Any suggestions of stuff that needs doing that isn't too hard, @HipsterSloth?

HipsterSloth commented 5 years ago

> Also, what is the correct way to get yaw, roll, and pitch? The calibrated sensor data seems to drift easily, and the orientation call (once converted to Euler) gives me a constant velocity that seems to decrease over time and increases after every movement.

Which controller are you using? The older PSMove controller with the Mini-USB port or the newer PSMove controller with the Micro-USB port? The newer controller doesn't have a magnetometer, which makes it more prone to drift. I'm still working on an optical drift correction for the newer model. Also, I wouldn't recommend trying to compute Euler angles from the sensor data unless you want to write your own orientation filter, and even then you would be duplicating the work that PSMoveService already does in its sensor fusion code. The best bet is to get the orientation quat from the controller pose:

https://github.com/cboulay/PSMoveService/blob/master/src/psmoveclient/PSMoveClient_CAPI.h#L241

and convert that quaternion to euler angles:

http://www.euclideanspace.com/maths/geometry/rotations/conversions/quaternionToEuler/
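For what it's worth, here's a minimal sketch of that conversion, assuming a plain quaternion struct with w/x/y/z fields (just a stand-in for whatever quat type the client API hands back) and the common aerospace roll/pitch/yaw convention. Note the euclideanspace page uses a heading/attitude/bank (Y-up) convention, so pick whichever axis ordering matches what Trinus/FreePIE expect on their end:

```cpp
#include <cmath>

// Hypothetical stand-in for the client API's quaternion type (w, x, y, z).
struct Quatf { float w, x, y, z; };

// Tait-Bryan angles in radians: roll about X, pitch about Y, yaw about Z.
struct EulerAngles { float roll, pitch, yaw; };

EulerAngles quatToEuler(const Quatf &q)
{
    EulerAngles e;

    // Roll (rotation about the X axis)
    e.roll = std::atan2(2.f * (q.w * q.x + q.y * q.z),
                        1.f - 2.f * (q.x * q.x + q.y * q.y));

    // Pitch (rotation about the Y axis); clamp to avoid NaN from asin
    float sinp = 2.f * (q.w * q.y - q.z * q.x);
    sinp = std::fmax(-1.f, std::fmin(1.f, sinp));
    e.pitch = std::asin(sinp);

    // Yaw (rotation about the Z axis)
    e.yaw = std::atan2(2.f * (q.w * q.z + q.x * q.y),
                       1.f - 2.f * (q.y * q.y + q.z * q.z));

    return e;
}
```

Convert to degrees before handing the values to FreePIE if that's what your Trinus setup expects.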

Also doesn't Trinus support getting the PSVR orientation from the headset itself, rather than from FreePIE? All you really want to get from PSMoveService via FreePIE is the position of the headset (the glowing bulb you have attached to the headset).

> Oh, just noticed this can't work at the same time as psvrservice.

Yeah, PSVRTracker will eventually be a PSVR-specific alternative to PSMoveService, so you won't run both. It would actually be a hassle to make both run at the same time because there isn't a good way to access the hardware in a non-exclusive way.

> Any suggestions of stuff that needs doing that isn't too hard, @HipsterSloth?

Thanks for offering! At the moment, one thing I can think of would be converting the old test_camera program over from the old PSEyeVideoCapture class to the new PS3EyeVideoDevice class:

https://github.com/HipsterSloth/PSVRTracker/blob/master/src/tests/test_camera.cpp

The old PSEyeVideoCapture class was a bridge between the PS3EYEDriver library and OpenCV's cv::IVideoCapture interface. I rewrote my own PS3Eye capture class using my own USB manager, which can use either LibUSB or WinUSB.

The main PSVRTracker library now uses the newer PS3EyeVideoDevice class, but I haven't gotten around to updating the test_camera tool yet. This is used as a quick test app to verify that all connected PS3 cameras are working correctly without spinning up the full PSVRTracker library.
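Very roughly, the conversion amounts to swapping the cv::VideoCapture-style open/grab calls for the new device class and only keeping OpenCV for display. Here's a rough sketch of what the loop might look like; the PS3EyeVideoDevice method names below (open, poll, getFrameBuffer, getWidth, getHeight) are placeholders for whatever the real class exposes, so check the actual header before copying anything:

```cpp
#include <cstring>
#include <opencv2/opencv.hpp>
// #include "PS3EyeVideoDevice.h"  // actual header/path in PSVRTracker may differ

int main()
{
    // Hypothetical usage -- method names are placeholders for the real API.
    PS3EyeVideoDevice device(0);          // first connected PS3 Eye
    if (!device.open())
        return -1;

    cv::Mat frame(device.getHeight(), device.getWidth(), CV_8UC3);

    while (cv::waitKey(1) != 27)          // ESC to quit
    {
        if (device.poll())                // new frame available?
        {
            // Copy the raw BGR frame buffer into the cv::Mat for display
            std::memcpy(frame.data, device.getFrameBuffer(),
                        frame.total() * frame.elemSize());
            cv::imshow("test_camera", frame);
        }
    }

    device.close();
    return 0;
}
```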

opendata26 commented 5 years ago

Hey, I have the newer controller, but I was talking about the headset there, as the orientation test seemed to have less drift than Trinus. I couldn't understand how exactly the gethmdorientation quat was applied, though. When I converted it to Euler, it moved when I moved my head but didn't stop when I stopped moving my head; instead it just slowed down gradually.

Sure, I will have a go at the test_camera thing. I did notice that it didn't seem to work with iPi Soft when I was debugging some camera issues.

@HipsterSloth

opendata26 commented 5 years ago

@HipsterSloth how do you think it would be best to do this? Should I just duplicate the camera code like it was done before, or maybe split the camera part out into a lib so both can use it?