gusmanb / PSVRFramework

GNU Affero General Public License v3.0

Sensor fusion data #9

Open gusmanb opened 8 years ago

gusmanb commented 8 years ago

I'm having a really bad time trying to get a fusion algorithm to work. I've found the sensor datasheet and I believe I got the correct settings through experimentation (everything is on the wiki), but it always drifts a ton, so there must be something wrong.

There is this player: https://github.com/Swordfish90/PSVRVideoPlayer

It has the fusion code implemented. I would like to check it and see whether it drifts or not, but it doesn't work on my computer.

@mungewell could you try it please? If it works, can you sample raw sensor data (with the headset steady and the visor flat) and the parsed sensor data before it is sent to the fusion algorithm? With that I can compare against my data and see what's wrong.

cercata commented 8 years ago

What sampling rate are you using for gyroscope sampling?

Also, have you looked at how FreePIE does it for different devices (Android, WiiMote, etc.)? https://github.com/AndersMalmgren/FreePIE/tree/master/FreePIE.Core.Plugins/SensorFusion

cercata commented 8 years ago

From what I'm seeing, you are reading the sensors from USB in one process and sending the raw data over UDP, and the VideoPlayer is reading it from UDP.

Before trying to compensate for the drift, you have to get it as small as possible, and for that you need to integrate at the maximum frequency to minimize the integration error.

I would do the integration at a high frequency (1000 Hz) and then send the integrated data over UDP (and the raw data too if you want).

I guess after that you will have much less drift, and then you can start thinking about the fusion.
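The idea above can be sketched in Python (illustrative only; the function name is hypothetical): the shorter the time between samples, the less error the integration accumulates.

```python
def integrate_gyro(rates_dps, dt):
    """Accumulate angular-rate samples (deg/s) into an angle (degrees).

    rates_dps: gyro readings in degrees per second
    dt: time between samples in seconds (e.g. 0.001 at 1000 Hz)
    """
    angle = 0.0
    for rate in rates_dps:
        angle += rate * dt  # simple rectangle-rule integration
    return angle

# A constant 90 deg/s turn sampled for one second at 1000 Hz:
angle = integrate_gyro([90.0] * 1000, 0.001)  # ~90 degrees
```

In practice the gyro rate is never constant, which is why sampling (and integrating) as fast as the hardware allows matters.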

gusmanb commented 8 years ago

Hi @cercata.

Thanks for the info. The thing is, the process is fast enough; it's being updated at 1 kHz even after the integration, so I suppose that's not the problem. And no, I don't do the integration on the player, I'm doing it on the toolbox side; the code isn't published as it's not working.

I believe something is wrong with the sensor data. Yesterday, after finding which sensor it uses, I saw the reading is the raw sensor reading. I mean, that sensor returns a 12-bit two's-complement value for the accelerometer data, left-aligned in 16 bits, so to interpret it as two's complement it has to be shifted back four bits, and none of the code I've found anywhere takes that into account. I'm a bit puzzled.

I think I got the accelerometer settings right. I've checked it: when the device is flat the sensor reads an acceleration of ~+1024 units plus a bit of noise (max ±20 units) on the X axis. I watched a unit teardown and the sensor is mounted vertically, which matches the ±2 g / 1024 LSB/g configuration perfectly (all the info I got is on the wiki page about the sensor report). But I'm 100% lost with the gyro data. According to the datasheet it's a 16-bit two's-complement value, but I have no clue how to check which configuration is being used, as I can't apply a constant known torque to verify it. I believe it's configured for ±250 °/s at 131.2 LSB/°/s; that makes sense, as a person will not turn faster than 250º per second, and this way the sensor has very good resolution.
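For reference, the scale factors described above can be applied like this (a Python sketch; the accel constant comes from the checked configuration, while the ±250 °/s gyro range is still an assumption at this point):

```python
ACCEL_LSB_PER_G = 1024.0    # checked: ±2 g accelerometer configuration
GYRO_LSB_PER_DPS = 131.2    # assumed: ±250 °/s gyroscope configuration

def accel_to_g(raw):
    """Convert a parsed 12-bit accel sample to g."""
    return raw / ACCEL_LSB_PER_G

def gyro_to_dps(raw):
    """Convert a parsed 16-bit gyro sample to degrees per second."""
    return raw / GYRO_LSB_PER_DPS

# Headset lying flat: ~+1024 counts on X, i.e. ~1 g of gravity.
g_x = accel_to_g(1024)  # 1.0
```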

Something strange about the reports is that each report has two sets of values, which doesn't make sense at all. At first I thought it was because the device was reading the sensors faster than USB could send reports, and they decided to send two readings per report to achieve 2 kHz, but that's not possible as the device has a max frequency of 1 kHz.

The most curious thing is that the drift only happens on one axis, the X axis. It's as if gravity is affecting the gyro readings, which doesn't make sense at all, as the gyroscope should not be affected by gravity as far as I know; maybe I'm wrong.

For the integration I've tried a ton of implementations, with the one in the linked player yielding the best results. With it I got something nearly stable, but it drifts on the X axis. I extended it with a calibration routine to compensate for the sensor inaccuracies, but that did not make any difference at all. Also, the user who created the player set the update interval to 120 Hz in the integration code, which is crazy; on my side I replaced the fixed value with a measurement from a stopwatch, which is how I know it's doing the integration at 1 kHz. They're reading the sensor data in real time, and it sends updates at 1 kHz with two readings per report, a total of 2 kHz, so that makes me think that code will not work at all. I can't try the player as it doesn't work on Windows (at least for me), so there I'm also a bit lost.

Any info would be really appreciated.

I will doublecheck it today if I have time.

cercata commented 8 years ago

Why do you say the max frequency of the device is 1 kHz? That would be in filtered mode, but in unfiltered mode it can reach 2 kHz. How do you know which mode the PSVR is using the sensor in?

Could the drift be related to the "gyroscope offset compensation" integrated in the BMI055?

I'm lost man, good luck !!!!

For the accelerometer data, I wouldn't worry. If people don't shift it, they get the whole 16 bits; they will be getting 16384 LSB/g with some random noise from the undefined lower 4 bits. They just aren't as methodical as you.

mungewell commented 8 years ago

Nothing much to contribute, other than to say awesome post! Love the tech details.

Q. Are you reading the sensor data in Cinematic Mode or VR Mode? Slight chance that 'they' are messing with it.

The PSMoves contain a calibration data section, which might be used on the PC to compensate for per-device flaws. I don't know if the PSVR has a similar table, but they certainly need some way to tune/improve behaviour at a per-device level. https://github.com/nitsch/moveonpc/wiki/Calibration-data

gusmanb commented 8 years ago

@cercata well, the BMI tech specs on the Bosch page specify a max frequency of 1000 Hz: "Bandwidths (programmable) 1000Hz … 8 Hz". Maybe I misunderstood it, or it's wrong in the tech specs?

@mungewell That's really nice info! I'm sure there must be a report like the one on the Move to read the calibration. I will try to read the reports tomorrow (today I'm extremely busy); that may be just the missing bit of info.

cercata commented 8 years ago

The question is, does the data published on USB come from the unfiltered or from the filtered stream?

"Two different streams of acceleration data are available, unfiltered and filtered. The unfiltered data is sampled with 2kHz"

"IMU Data Gyro ... ... Unfiltered (high-bandwidth) data can be read out through the serial interface when the data_highbw (GYR 0x13 bit 7) is set to '1'."

"The gyro processes the 2kHz data out of the analog front end with a CIC/Decimation filter, followed by an IIR filter before sending this data to the interrupt handler. The possible decimation factors are 2, 5, 10 and 20. It is also possible to bypass these filters, and use the unfiltered 2kHz data."

http://www.mouser.com/ds/2/783/BST-BMI055-DS000-08-786482.pdf

MaVe-64 commented 8 years ago

Just my 2 cents, but maybe HipsterSloth can be of some help with this; PSMoveService uses gyroscope, magnetometer and accelerometer data from the Move controllers.

gusmanb commented 8 years ago

@cercata ok, I missed that part. We still can't know if it's filtered or not; maybe if we find a calibration status report there will be something. Anyway, that makes sense: it could be set at 2 kHz and sending two reads per packet. That would give a meaning to a byte sequence sent before each sensor status; it's just one byte, so it may well be an offset relative to the previous reading. I will test it tomorrow.

@MaVe-64 Good to know, I will try to contact him tomorrow.

gusmanb commented 8 years ago

Did another try yesterday and today with zero luck. I also tried to contact HipsterSloth to see if he can help a bit, but he must be busy as he didn't answer.

I've uploaded the tests to the mouse_emulation branch; if someone with knowledge of this matter can check it, the code is here: https://github.com/gusmanb/PSVRFramework/tree/mouse_emulation

The important code is in BMI055Parser.cs (correct sensor data parsing), PSVRMouseEmulator.cs (base class for mouse emulation) and Mahony.cs (AHRS algorithm).

mungewell commented 8 years ago

I looked at the HID sensor data and think I know where you are going wrong.

HID does weird things with numbers due to the bit sequence in the stream... the accels are 12-bit with 4-bit padding (the padding is constant regardless of rotation).

This shows up as:

41 43 7f fc 41 0e
^^ ^^             Accel = 0x0434
      ^^ ^^       Accel = 0xffc7 (leading f implied by sign extension)
            ^^ ^^ Accel = 0x00e4

I also noticed that the timestamps are rather 'routine', so I think they are in microseconds. sensor.txt

gusmanb commented 8 years ago

Unless there's something I don't see, that's exactly how I am parsing the accel data:

accel = ((short)(((short)RawData[AccelOffset + 1] << 8) | RawData[AccelOffset]) >> 4) ;

I've fed the parsing function the example values you posted and the results are the same: 0x0434, 0xffc7 and 0x00e4.
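A standalone Python sketch of the same parse (assemble the little-endian word, sign-extend, then arithmetic-shift out the 4 padding bits) reproduces those values from the bytes in the previous comment:

```python
def parse_accel(lo, hi):
    """Parse a 12-bit two's-complement sample left-aligned in a 16-bit LE word."""
    raw = (hi << 8) | lo
    if raw & 0x8000:       # sign-extend the 16-bit value
        raw -= 0x10000
    return raw >> 4        # arithmetic shift drops the 4 padding bits

samples = [parse_accel(0x41, 0x43),   # 0x0434 ->  1076
           parse_accel(0x7F, 0xFC),   # 0xFFC7 ->   -57
           parse_accel(0x41, 0x0E)]   # 0x00E4 ->   228
```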

Also, the accelerometer values I get seem to match gravity: when the HMD is "flat" I get a 1 g force on +X, if I put it upside down I read a 1 g force on -X, and so on.

mungewell commented 8 years ago

I can't find that section of code, perhaps I'm looking in the wrong place.... https://github.com/gusmanb/PSVRFramework/blob/mouse_emulation/PSVRFramework/PSVR.cs#L100

But if you have already fixed it, that's great. Simon.

gusmanb commented 8 years ago

At lines 135 and 136 of PSVR.cs I call a class named BMI055Parser; that class parses the raw byte reading and applies the bit shift and scaling to the values. It's here.

As these were only tests, I preserved the raw readings and added these "filtered" (they should be called "scaled") readings to the report.

gusmanb commented 8 years ago

HipsterSloth has been really kind, took a look at the code and found some terrible mistakes of mine, so we may finally be on the right track to get this working; I hope to have something working next week. I've made a lot of progress on the VR player (corrected projections; added support for 180º mono, 360º mono, 180º stereo and 360º stereo videos, multiple screen support, etc.), and with this the player can finally be useful.

mungewell commented 8 years ago

Excited to try it; a simple/small utility to bring 360 pictures/videos to the masses on PSVR will be awesome.

mungewell commented 8 years ago

Maybe I'm jumping the gun, but once you get the sensor fusion working, what protocol are you intending to use to communicate it?

I see that there is UDP for FaceTrackNoIR/OpenTrack... would that make sense?

gusmanb commented 8 years ago

That's the idea: OpenTrack. I already have the UDP broadcaster for it on my computer; I'm waiting until the data works to commit it to the repository. I hope to get enough time tomorrow to check HipsterSloth's hints; I'm now implementing a barrel shader on the player to correct the lens distortion.

gusmanb commented 8 years ago

Finally got it!!!

I need to tweak the calibration procedure, but it works: very stable and fast. There were two main reasons why I wasn't getting this to work. First of all, I made the very big mistake of taking the sensor data and feeding it to the Madgwick filter as degrees instead of radians :(. And also, Microsoft sometimes is... well, Microsoft; we all know what they did with IE, ignoring every standard. A quaternion is defined as {w, x, y, z}, a scalar and a vector, but for some reason the person who wrote the numerics add-on decided it was nicer to order the constructor params as (x, y, z, w). Very fun...
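Both pitfalls are easy to guard against with tiny helpers, sketched here in Python (illustrative; the helper names are made up):

```python
import math

def dps_to_rads(rate_dps):
    """Madgwick/Mahony filters expect angular rate in rad/s, not deg/s."""
    return rate_dps * math.pi / 180.0

def make_quaternion(w, x, y, z):
    """Keep the mathematical {w, x, y, z} order in one place.

    System.Numerics.Quaternion's constructor takes (x, y, z, w), so a
    wrapper like this avoids silently swapping components when porting.
    """
    return (w, x, y, z)

rate = dps_to_rads(250.0)                      # ~4.363 rad/s, full scale
identity = make_quaternion(1.0, 0.0, 0.0, 0.0)
```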

So I will finally sleep very deeply tonight; this issue was killing me XD.

gusmanb commented 8 years ago

I've corrected the wiki about the timestamp fields: it's 4 bytes, not three, but the fourth byte is unused. The counter rolls over at 0x00FFFFFF; I think that's because the unit internally uses a signed integer and they avoid the last byte to skip negative numbers. It has been really useful for the integration: even without the Madgwick filter, just building a quaternion with the correct scales and these timestamps, the control gets totally steady. And it's very curious to see the real time lapses: it's not a perfect 2 kHz, it varies between 435 µs and 504 µs, so those little differences in time make a big difference in the steadiness of the integration.
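With the rollover described above, the time step between two samples can be recovered like this (a Python sketch; the wrap value follows from the counter rolling over at 0x00FFFFFF):

```python
WRAP = 0x01000000  # the counter uses 24 bits and rolls over at 0x00FFFFFF

def delta_us(prev, curr):
    """Microseconds between two 24-bit timestamps, handling rollover."""
    return (curr - prev) % WRAP

dt1 = delta_us(1000, 1504)               # 504 us, a typical sample gap
dt2 = delta_us(0x00FFFFF0, 0x00000010)   # 32 us, across the rollover
```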

mungewell commented 7 years ago

So do I get to demo this to my kids at the weekend? hint, hint...

gusmanb commented 7 years ago

I got it implemented on the VR player and it works fine; I will try to upload it tomorrow (player + toolbox). I'm now in the middle of a very big revamp: I decided to get into creating a SteamVR driver, as that is going to be the best integration with other apps (and games :D ), and it's easier than I thought. I already have the skeleton working; it is recognized by Steam and tries to initialize the device. With the distortion and the pose already done it seems very simple to create, but for a good integration among everything I need to separate the control from the toolbox. Now there is going to be a Windows service (also compatible with Linux) controlling the PSVR, and any application willing to access it connects to the service through TCP (and no more JSON; this time pure binary, to be efficient). This way the toolbox can be connected, the VR player can be connected, the SteamVR driver can be connected, and any third party can connect to it.

For the toolbox I'm going to have two types of builds: one with the server communication and another standalone, in case someone doesn't want to install the service (but they will lose the ability to use the player and the SteamVR driver, so I doubt many people will use it :D).

If you want to try the VR player before I upload the binaries, the published master branch contains all the changes; compile the toolbox for Any CPU and the player for x64 and it should work. Note that when you start the toolbox or power on the headset it will blink for ~3 seconds; during those the headset must be kept still, as it calibrates the sensor error range. After that you can reset the initial pose using the recenter command on the toolbox or by double tapping the HMD (the tapping doesn't work too well since I changed the parsing routine; I need to adjust it a bit better, just try a few times if it fails ;) ).

mungewell commented 7 years ago

Unfortunately I don't have a build environment for C# set up... if only you had released before the big revamp. ;-) Don't forget the mantra: "release early, release often".

So, whilst acknowledging that this is your project, I think a total jump to SteamVR would be a mistake and would make the "simple utility" a little too complex. I believe there are plenty of people who would just want rotation tracking for watching 360 movies or playing conventional games à la the TrackIR protocol.

Without positional tracking (and controllers) the SteamVR experience is going to be somewhat lacking (and potentially nauseating), as there will not be any parallax effect as your head moves side to side.

Anyhow, I'm still here to see if we can find any more useful details about driving the PSVR.

gusmanb commented 7 years ago

Ok, I think I didn't explain myself clearly. The toolbox isn't going to lose any of its capabilities; one of those is the mouse emulator. I've been thinking about how to implement it right, and I just need to find the way to compensate for the roll. I think it must be done either by first creating a vector (X = pitch max view º, Y = yaw max view º, 0) and then transforming it with a quaternion from quat.FromEulerAngles(0, 0, roll), or by just creating a vector (0, 0, 1000), transforming it with the orientation quaternion, tracing a ray from (0, 0, 0) and intersecting it with a plane representing the screen. It's a bit more complicated than it seems. I don't want a sensitivity setting to trim the mouse position/speed; I want to achieve a "look at" effect: wherever you look, the mouse goes there. That can't be achieved just by using pitch/yaw as X/Y like everybody is doing (the view is spherical, so mapping rotation directly to X/Y distorts the position; in plain words, it will be too slow at the center and too fast at the edges in cinematic mode). The only thing that's going to be removed from the toolbox is the sensor broadcasting, but I will think of a way to intercommunicate the toolbox with the VR player if the installed version is the standalone one (no service, only the toolbox with direct access to the device).
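The second approach described above (rotate a forward vector by the orientation quaternion, then intersect the ray with a screen plane) can be sketched like this in Python (illustrative only; quaternions are (w, x, y, z) and the screen is assumed to be the plane z = screen_z):

```python
def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    vx, vy, vz = v
    # v' = v + w*t + q_vec x t, where t = 2 * (q_vec x v)
    tx = 2.0 * (y * vz - z * vy)
    ty = 2.0 * (z * vx - x * vz)
    tz = 2.0 * (x * vy - y * vx)
    return (vx + w * tx + y * tz - z * ty,
            vy + w * ty + z * tx - x * tz,
            vz + w * tz + x * ty - y * tx)

def look_at_point(q, screen_z=1000.0):
    """Intersect the rotated forward ray from the origin with the screen plane."""
    dx, dy, dz = quat_rotate(q, (0.0, 0.0, 1.0))
    if dz <= 0.0:
        return None                   # looking away from the screen
    t = screen_z / dz
    return (dx * t, dy * t)           # where the gaze hits the plane

center = look_at_point((1.0, 0.0, 0.0, 0.0))  # (0.0, 0.0): looking straight ahead
```

Because the ray is intersected with a flat plane, the cursor speeds up naturally toward the edges, which is exactly the "look at" behaviour a plain pitch/yaw-to-X/Y mapping cannot give.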

Also, I'm not leaving the player or the toolbox behind in favor of SteamVR, but I want to have at least the orientation matching the SteamVR world setup. I don't want to just use my own coordinate system as I'm doing now; that would lead to the need to transform my world setup to other coordinate systems, and that's time consuming. So the idea is to get the orientation right in the Steam driver, correct my world setup, and then finish the toolbox with the mouse emulator and VR player.

About the SteamVR experience, it's the same experience you get with cinematic mode: you have the orientation from the HMD and the distortion corrected. I usually play that way, sitting in a chair and using the gamepad for movement. It's true some people find it nauseating, but it's a matter of training yourself (I am one of those who gets no nausea at all, lucky me; one of the games I love to play is Windlands and I do sessions of hours, and if anyone handles that without nausea then they're totally immune to dizziness XD ).

Last but not least, this isn't only my project. I'm the one coding it, but without your help it wouldn't have got this far, so it's our project ;)

I'm attaching the toolbox + player with orientation; I just compiled it and have no time to test it right now as I'm at work :). For the player, uncompress an old version and place this new one over it; I'm not including the VLC plugins folder as it's too heavy to attach in a post.

Some notes: the initial orientation can be wrong, so you may need to look to your side, recenter the view, and then look to the front. Also, there's no roll; that's something I need to investigate, as I'm not sure it can be achieved on stereoscopic videos (on monoscopic videos it works like a charm, but I've removed it from all the modes as this is just a test). I will open an issue to see if someone can help a bit with that. On stereoscopic videos, as each eye looks at a different point in space, if you roll the camera the two views start to diverge; that's normal, and I'm not sure if it can be compensated. I've looked at other players like MacMorpheus and they do the same, remove the roll component, but I'm not happy with it. I think there must be a way to correct it, something like using roll combined with sin/cos to compensate pitch/yaw and converge the views again.

Hope this works; if not, I will try to upload a new build tomorrow.

Toolbox.zip Player with orientation.zip

yanvv commented 7 years ago

I also don't get motion sickness in games that let you move with a controller while seated, and I love it. This is why I really look forward to being able to play Elite Dangerous with the PSVR, and thanks to you guys we are getting there, so thank you for spending your time on this! :D I think a SteamVR plugin will be a great way to do this easily, as you won't need to do all these things like using ReShade and such; it would basically become almost plug and play. So awesome!

gusmanb commented 7 years ago

@yanvv Yeah, I also want to see myself inside my Cobra MkIII. I have ED Horizons and it's one of the games I most want to play in VR, that and Skyrim Special Edition.

cercata commented 7 years ago

Awesome!!!! Fucking amazing!!!!! I understand Project CARS would work as well????

If someone had told me this on the 13th of October, I wouldn't have believed them.

mungewell commented 7 years ago

So I showed this to my kids... getting some stutter on the sensor fusion, but otherwise it's really impressive. Audio is still not working and there are a few other glitches, but wow!

Might have told a few people ;-) https://www.youtube.com/watch?v=UsDIli79Qss

gusmanb commented 7 years ago

Cool video Simon!

I see a lot of stuttering on your PC; on mine the orientation is very fluid, but that may be VLC or just my PC being faster. I'm not sure what specs your laptop has; mine is an i7 2660K @ 4.0 with 16 GB and a 1070 Xtreme, so that's why I was asking for performance reports, my machine is not a good one to test it on XD. In that version I introduced a mechanism in the VR player which syncs the OpenGL buffer reads/writes with the VLC writes, and it works very badly; in the previous version there was no lock at all, but that caused tearing, and when seeking advice on how to solve this, all the answers have been "don't use LibVLC". If it's the sockets or CPU usage, the new system is a lot faster and more lightweight.

I already rewrote the toolbox (and slimmed down the interface a lot), the framework and the service, and I even have the SteamVR driver coded (I'm going to upload it next week; I don't want to publish it until it works), but now I need to test everything and check whether all my assumptions were right :).

The new system is extremely efficient compared to the previous one. First of all, the connection is no longer UDP but TCP, to avoid losing any update. The update rates are controlled by the clients, from 1 to 255 ms; for a game/VR player 150-200 updates per second are more than enough, which is 10 times less than what is being broadcast now. Last but not least, instead of using JSON I'm serializing the data manually in binary format: commands are just 2 bytes, a status report is 3 bytes plus the serial number chars, and an input report (buttons + pose) is 14 bytes. Small and efficiently composed.
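As a rough illustration of how compact such manual binary serialization can be, here is a Python sketch with a hypothetical layout (not the real wire format) that matches the sizes mentioned above: a 2-byte command, and a 14-byte input report made of a button bitmask, a scaled quaternion and a timestamp.

```python
import struct

def pack_command(cmd_id, arg):
    """A 2-byte command: one opcode byte plus one argument byte."""
    return struct.pack("<BB", cmd_id, arg)

def pack_input_report(buttons, quat, timestamp):
    """A 14-byte input report: uint16 buttons, 4 x int16 quaternion
    components scaled by 32767, and a uint32 timestamp."""
    scaled = [round(c * 32767) for c in quat]
    return struct.pack("<H4hI", buttons, *scaled, timestamp)

cmd = pack_command(0x01, 0x00)                                  # 2 bytes
report = pack_input_report(0x0003, (1.0, 0.0, 0.0, 0.0), 1234)  # 14 bytes
```

Compare that with a JSON encoding of the same pose, which easily runs past 100 bytes per update.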

If it works, then I will return to the VR player and rewrite the video code to use GStreamer and see if it works better; I got some recommendations about it and it seems easy to use.

mungewell commented 7 years ago

The jittering comes and goes, so maybe something else on the system is affecting it. I also had a thought about the dual graphics cards: is this "just" handled by the system, or would your code have to do something special to cope?

gusmanb commented 7 years ago

In the VR player it's already handled: when you choose the output monitor, GLEW opens on the graphics card the monitor is attached to. On the SteamVR driver I'm still not sure; it's one of the points I must check. The documentation is extremely poor (at least what I found), and the examples only include a null driver which does a pass-through to the screen. If we could find OpenVR driver code for the Oculus or the Vive, that would be of great help.

MaVe-64 commented 7 years ago

I can't wait for an update with SteamVR tracking; I've already got the SteamVR screen pass-through working. If only you could upload a temporary tracking driver for SteamVR.

gusmanb commented 7 years ago

Hey @MaVe-64, so you got the driver with the video set up? Could you send it to me to compare with mine? I have found info on how to implement the tracking but not the video; what I programmed for video has been more intuition than knowledge, and any example would be great. I'm starting to test the new service and toolbox right now, and after that I will test the driver (I programmed everything at once with no tests... XD). Also, if I get it working in some way today, having someone besides me to test it would be of great help; if I get there I will try to contact you privately to conduct some tests, if you're available.

JonnyMnemonic commented 7 years ago

@gusmanb Hi, I tried the player with orientation from the post above and the orientation doesn't work. The driver is WinUSB via Zadig. Everything else works fine.

MaVe-64 commented 7 years ago

Well, I only know that since I tried the hack from Trinus I now have a full screen of the SteamVR SBS output; for details I guess the Trinus guy can tell you more. I would love to do some testing for you. I have a PSVR with the original 1.5 firmware.

I'm trying to attach the drivers but it won't let me, even though it's a zip file.

gusmanb commented 7 years ago

@MaVe-64 ok, I thought you had coded the driver. No problem, I will try to find the info somewhere else.

PomanoB commented 7 years ago

@JonnyMnemonic try enabling the UDP sensor broadcast server in the PSVRToolbox settings. Everything works for me, but with the same jittering problem. And no sound, even if I change setVolume(vctx, 0); to setVolume(vctx, 100); in Engine::run.

JonnyMnemonic commented 7 years ago

@PomanoB Tried it, all defaults, 255.255..., nothing. Maybe the router blocks it? But according to the wiki the sensor data goes through USB.

gusmanb commented 7 years ago

@JonnyMnemonic Sensor data comes into the system through USB, but the toolbox broadcasts it over UDP so any program can read it without messing with the USB. Also, the router can't block this, it's on your local IP; the only thing that can block these packets is the Windows firewall.

Also, this was just an experimental build meant for mungewell; if it doesn't work for you, just don't use it and wait for a release. I'm rewriting everything, so please be patient.

MaVe-64 commented 7 years ago

It didn't work for me either. After uninstalling and reinstalling the drivers it worked. Do you have the blinking HMD LEDs when you start up the tool?

JonnyMnemonic commented 7 years ago

@MaVe-64 No, I reinstalled the drivers with Zadig, same result. It just lights the 2 LEDs at the back, like it always did from the early builds. Also, I have PSVR firmware 2.0; why didn't you update? So I'll wait for the release.

Izzard-UK commented 7 years ago

@gusmanb Did you have a play with TrinusVR for PSVR? https://www.google.co.uk/amp/s/amp.reddit.com/r/PSVR/comments/5e7xkb/trinus_psvr_play_steamvr_games_on_psvr/

One of the commenters there has suggested how to tweak the SteamVR driver to better suit the PSVR distortion etc. I know you've figured out some pretty accurate measurements yourself, and I'm wondering if this helps you fold them into your SteamVR driver.

mungewell commented 6 years ago

@mick-p1982 Glad you have an inquisitive mind and are playing with the PSVR.

BTW, did you play with the VRidge API? I am messing with my own project to inject the data from the PSVR into VRidge...

m-7761 commented 6 years ago

@mungewell Glad to see you active here. Gusmanb rebuffed my inquiries through email, making every attempt to dissuade me from working with the PlayStation VR, as if I could just trash it and spend thousands of dollars on toys! But I can still leave notes here to help others out...

For example, I should probably open an issue about the wiki here needing to be clearer that the 2 sensor reports are samples, not 2 sensors. This was a misconception that stayed with me ever since I began working with it, weeks or months ago.

I've had good success with it, though I have misgivings about the OLED display. I think I could not afford to have experience with VR any other way.... I mean I could buy a lot of bourgeois crap, but I would never be able to respect myself, or make good use of it! I am completely down with carrying on the work here, and unimpressed that more has not been done to improve its compatibility with PC...

https://dylanmckay.io/psvr-protocol/ is doing recent work. I'm adding an early VR demo/feature set to Sword of Moonlight, that is for making 3D first-person stories for video games. It has a long lineage back to possibly the first modern 3D video game. Today its legacy is Dark Souls, and the samurai game From Software is making for the PS4. I have everything working, except for camera based tracking, and I don't know if the color will ever be satisfactory for recreational use, but artists can sure benefit from seeing their work really come to life.

The PSVR has swallowed up a lot of my work hours, so there's no way I could cover all product lines that exist, and I suspect that most will not work with my workstation. Whereas, I knew the PSVR would because it's more like a commonplace PC. It would be perfect if it had better color reproduction, and would do reprojection in hardware, since normal people don't own 120hz monitors or video adapters with that bandwidth. But it's for a console after all, and I just don't know of any PC companies taking comparable products to market.

I think VR is many years off, and I hope OpenXR works out to be more like OpenGL than EGL, i.e. able to make VR devices as software friendly as video cards. In the meantime, I will settle for the PSVR or anything that is friendly to general audiences. I think Sony has a lot more muscle to flex, and so can do things that small PC companies are unable to.

gusmanb commented 6 years ago

Sorry, but I have blocked mick; that guy has been insulting me in private emails and I don't want that kind of person using my work.

Cheers.