Glad to hear there is another dev interested in looking at this!
As I mentioned in the PSMoveService issue, this project was created as a testbed for trying to get PSVR tracking working and to serve as a reference for other devs trying to do something similar, or to use this project as a 3rd party library (whatever makes more sense for their target project). To be clear, there is still quite a bit of work to do, but I think most of the fundamentals are in place at this point.
PSMoveConfigTool has an HMD section, but doesn't see the PSVR. How does its discovery work? Can it be lightweight, like PSVRToolbox?
Mostly. In order to communicate with the PSVR headset over USB you need to install either a LibUSB or WinUSB driver for the two command interfaces (4 and 5). You can use Zadig to do this or make a prebuilt installer using libusbK-inf-wizard. I'm trying to make PSVRTracker support either WinUSB or LibUSB so that in theory the library could be cross-platform, though in practice I'm focusing on Windows functionality first.
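If it helps to see what that driver requirement translates to in code, here's a minimal libusb-1.0 sketch (just an illustration, not PSVRTracker's actual code) that opens the headset by the VID/PID from the error message quoted later in this thread and claims interfaces 4 and 5. If no LibUSB/WinUSB driver is bound to those interfaces, or another program already holds them, the open/claim calls fail:

```cpp
// Minimal libusb-1.0 sketch of claiming the PSVR's command interfaces once a
// LibUSB/WinUSB driver is bound to them. Interface numbers and VID/PID are
// taken from this thread; error handling is abbreviated.
#include <libusb.h> // libusb-1.0
#include <cstdio>
#include <initializer_list>

int main()
{
    libusb_context* ctx = nullptr;
    libusb_init(&ctx);

    // PSVR (Morpheus) vendor/product IDs, as seen in the error message above.
    libusb_device_handle* hmd =
        libusb_open_device_with_vid_pid(ctx, 0x054c, 0x09af);
    if (!hmd) {
        std::printf("No driver bound, headset missing, or another app holds it.\n");
        libusb_exit(ctx);
        return 1;
    }

    // Interfaces 4 and 5 are the command/sensor interfaces discussed here.
    for (int iface : {4, 5}) {
        if (libusb_claim_interface(hmd, iface) == 0) {
            // ... read/write reports on this interface here ...
            libusb_release_interface(hmd, iface);
        } else {
            std::printf("Could not claim interface %d (wrong driver installed?)\n", iface);
        }
    }

    libusb_close(hmd);
    libusb_exit(ctx);
    return 0;
}
```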
Does your code work with PS3 cameras? (Some recent commits say PSEye code is removed.) I happen to have two. But don't have a PS4 camera.
It works with both PS3 cameras (via WinUSB) and PS4 cameras (via Windows Media Foundation, once firmware is bootloaded onto the camera).
The big thing I'm still working on is getting multi-camera tracking working again. PSMoveService's multi-camera tracking works by triangulation, but that assumes each camera frame arrives at the same time, which isn't quite true. I'm trying to accomplish the same result but instead rely on the sensor fusion code to do the job.
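Roughly, the idea looks like this (a sketch of the concept, not PSVRTracker's actual filter): each camera pushes its observation into a shared state the moment its frame arrives, so frames never need to be synchronized the way triangulation requires.

```cpp
// Conceptual sketch: asynchronous per-camera measurement updates on a shared
// state. The state here is position-only for brevity; real sensor fusion also
// carries orientation, velocity, and proper noise modeling.
struct Vec3 { double x = 0, y = 0, z = 0; };

struct FusedState {
    Vec3   position;
    double last_time = 0.0;
};

// One asynchronous measurement: where a given camera thinks the HMD is in
// world space (after that camera's own pose has been applied), at its own timestamp.
void applyCameraMeasurement(FusedState& s, const Vec3& observed,
                            double timestamp, double trust /*0..1*/)
{
    // This sketch simply drops frames older than the current estimate;
    // a real filter would handle out-of-order frames more gracefully.
    if (timestamp < s.last_time)
        return;

    // Blend toward the observation instead of triangulating two synced frames.
    s.position.x += trust * (observed.x - s.position.x);
    s.position.y += trust * (observed.y - s.position.y);
    s.position.z += trust * (observed.z - s.position.z);
    s.last_time = timestamp;
}
```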
I'm happy to answer any questions you have about progress, implementation details, etc!
Great! Just for context, I maintain (it's a full-time job) an impressive 3D RPG Maker-style program that From Software published in 2000. (It's not really professional grade by itself, but it's for Windows, so expansion is relatively straightforward.) From Software makes the Dark Souls games today. The maker software is unique because it comes with a tacit license to print/sell King's Field games, which is the original game series for the PlayStation.
I'm in the process of cobbling together a port of https://en.wikipedia.org/wiki/King's_Field_II that is VR-enabled, with the goal of a 2020 release. That will be the 20th and 25th anniversaries of the two products, respectively.
VR works via the PSVR/PSVRToolkit, but I'm going to have to develop a standalone version of PSVRToolkit because the author is not uploading new builds, and the current release sends wrong sensor data over the UDP server feature. I want to enable the use of Sony's peripherals on Windows and roll it all into a configurator program that also provides a UI for the various INI files that are involved. I exchanged some emails with its author, who had soured on the idea of the PSVR. From what I can gather, they believe it's not "real VR" and see no future for it. I, on the other hand, have always aimed to make 3D software available to the general public on inexpensive computers, so I see the PSVR as the most likely way to reach the widest audience possible: it doesn't need an exotic computer to host it, and it only costs $150.
Anyway, I'm unimpressed with the OLED display, but I have been able to make the picture and sensors work very well. I thought tracking would be necessary, but I was able to stabilize the sensors by using a much higher beta factor for the Madgwick solver. It creates a very chaotic picture but does not drift. I haven't yet tackled decoupling the picture from the raw sensor data, but in theory it should be trivial. (VR gets me sucked in, and I have to swear it off for a while, so I work with it off and on. Lately it's been too warm to don the headset.)
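For anyone following along, here's roughly where that beta factor sits in a Madgwick-style update. This is a condensed paraphrase of the public-domain filter, not the demo's exact code: beta scales how hard the accelerometer-derived correction fights gyro drift, which is why raising it kills drift but makes the picture jumpy.

```cpp
// Condensed Madgwick-style step: integrate the gyro, then subtract a
// gradient-descent correction scaled by beta.
#include <cmath>

struct Quat { float w, x, y, z; };

void madgwickStep(Quat& q,
                  float gx, float gy, float gz,   // gyro, rad/s
                  float corrW, float corrX,       // normalized gradient-descent
                  float corrY, float corrZ,       // correction from the accelerometer
                  float beta, float dt)
{
    // Rate of change of orientation from the gyroscope alone.
    float qDotW = 0.5f * (-q.x * gx - q.y * gy - q.z * gz);
    float qDotX = 0.5f * ( q.w * gx + q.y * gz - q.z * gy);
    float qDotY = 0.5f * ( q.w * gy - q.x * gz + q.z * gx);
    float qDotZ = 0.5f * ( q.w * gz + q.x * gy - q.y * gx);

    // beta decides how aggressively the accelerometer reference pulls the
    // estimate back toward gravity (anti-drift vs. jitter trade-off).
    qDotW -= beta * corrW;
    qDotX -= beta * corrX;
    qDotY -= beta * corrY;
    qDotZ -= beta * corrZ;

    // Integrate and renormalize.
    q.w += qDotW * dt; q.x += qDotX * dt;
    q.y += qDotY * dt; q.z += qDotZ * dt;
    float n = std::sqrt(q.w*q.w + q.x*q.x + q.y*q.y + q.z*q.z);
    q.w /= n; q.x /= n; q.y /= n; q.z /= n;
}
```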
WinUSB driver for the two command interfaces (4 and 5).
I have WinUSB installed via PSVRToolkit. I assumed that "Failed to open MorpheusHMD(USB\VID_054c&PID_09af" meant that some API like SteamVR was not installed. What am I missing?
I'm hopeful position tracking is within reach. PSVRConfigTool doesn't find the headset. The wiki says:
Open %appdata%/PSVRSERVICE/HMDManagerConfig.json
Set "virtual_hmd_count" to 1
Now I see a menu for the HMD button. (Never mind, why not default to 1?) Is following https://github.com/HipsterSloth/PSVRTracker enough to get a demo going? I will definitely be poring over the source code. I was going to try to modify the PointTracker code from FreeTrackNoIR. The OpenCV trackers did not look useful to me. I know everyone uses OpenCV, but I felt like I could probably get further programming something from scratch than by navigating its API/code base. That may seem naive, but the basic problem of filtering a black/blue image seems more approachable to me than finding a needle in OpenCV's proverbial haystack! So obviously I'm pretty relieved to learn of your effort :)
EDITED! Am I crazy, or does PSVRTracker not need to access the PSVR's USB device at all? I mean, in theory it just needs to see lights in the camera? Is this right? I will focus on cameras now :)
I'm trying to accomplish the same result but instead rely on the sensor fusion code to do the job.
Editing: So the tracker does need input from the USB device/sensors? Or just that it can? BTW: I am wrestling with how to set up both cameras in the meantime. Just for the record, does the current implementation use only 1 camera? ("The big thing I'm still working on is getting multi-camera tracking working again.")
I have WinUSB installed via PSVRToolkit. I assumed that "Failed to open MorpheusHMD(USB\VID_054c&PID_09af" meant that some API like SteamVR was not installed. What am I missing?
That error message (assuming it's coming from PSVRService) means that it can't open the sensor or command interface for the PSVR headset using the WinUSB driver. There are a few possible reasons for this.
If PSVRToolkit is running at the same time and is connected to the PSVR headset, then PSVRService won't be able to connect to the USB interface (only one program can connect to the USB device at a time).
If that's not the case then we should verify that WinUSB is actually the driver you have installed for the PSVR interfaces (and not LibUSB).
1) Connect and turn on your PSVR headset
2) Launch Zadig (https://zadig.akeo.ie/)
3) Select Options > List All Devices
4) Make sure both PS VR Interface 4 and Interface 5 have WinUSB installed.
Now I see a menu for the HMD button. (Never mind, why not default to 1?) Is following https://github.com/HipsterSloth/PSVRTracker enough to get a demo going? I will definitely be poring over the source code.
Sorry about this. That wiki is super out of date. The virtual HMD config was intended for use back when I didn't have USB communication working with the PSVR headset and was experimenting with purely optical tracking. I will probably be removing that option at some point in the future. In theory you could just do the purely optical tracking option, but there are some trade-offs. More on that in a sec.
I was going to try to modify the PointTracker code from FreeTrackNoIR. The OpenCV trackers did not look useful to me....
It depends on what you are trying to do. There is definitely a lot to the OpenCV API. The "Learning OpenCV" book plus many StackOverflow articles were a huge help to me. If you ultimately decide not to use OpenCV, you'll need an image processing library that can do HSV color filtering, camera image undistortion and calibration, image segmentation and contour generation, and finally SolvePnP.
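For a sense of scale, the OpenCV side of that pipeline can be fairly compact. Here is a rough single-frame sketch of the undistort / HSV filter / contour steps, with placeholder thresholds and calibration rather than PSVRTracker's actual settings:

```cpp
// Rough per-frame LED segmentation: undistort, HSV color filter, then contour
// extraction. Threshold values and calibration matrices are placeholders.
#include <opencv2/opencv.hpp>
#include <vector>

std::vector<std::vector<cv::Point>> findLedBlobs(const cv::Mat& frameBGR,
                                                 const cv::Mat& cameraMatrix,
                                                 const cv::Mat& distCoeffs)
{
    // 1) Undo lens distortion using the camera's intrinsic calibration.
    cv::Mat undistorted;
    cv::undistort(frameBGR, undistorted, cameraMatrix, distCoeffs);

    // 2) HSV color filter: keep only pixels near the tracking LED's hue.
    cv::Mat hsv, mask;
    cv::cvtColor(undistorted, hsv, cv::COLOR_BGR2HSV);
    cv::inRange(hsv, cv::Scalar(100, 120, 120), cv::Scalar(130, 255, 255), mask); // blue-ish, placeholder range

    // 3) Segment the mask into contours; each contour is a candidate LED blob.
    std::vector<std::vector<cv::Point>> contours;
    cv::findContours(mask, contours, cv::RETR_EXTERNAL, cv::CHAIN_APPROX_SIMPLE);
    return contours;
}
```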
EDITED! Am I crazy, or does PSVRTracker not need to access the PSVR's USB device at all? So the tracker does need input from the USB device/sensors?
In theory you can do multi-led tracking without using sensor data (Accelerometer + Gyro) BUT that can result in a lot more tracking instability. By using the sensor data as an input into the optical tracking model you can make a better initial guess about the pose of the headset when trying to compute a best fit projection. This is particularly important when you can only see 3 or fewer lights.
You could certainly start with this if you wanted, though. That was the intended use case of the VirtualHMD: don't actually connect to the PSVR headset, but instead do pure optical tracking and rely on the client code to do the sensor fusion. I haven't tested that code path in a while, so it's possible there are some things I broke as well.
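One concrete way that kind of sensor-data prior can feed into the optical fit (a hedged sketch of the idea, not the project's actual code) is to seed cv::solvePnP's initial guess from the IMU orientation and the previous position, so the iterative solver refines a nearby pose instead of jumping to an ambiguous one:

```cpp
// Seeding the optical pose fit from sensor fusion via solvePnP's
// useExtrinsicGuess option. Names are illustrative.
#include <opencv2/opencv.hpp>
#include <vector>

bool fitHmdPose(const std::vector<cv::Point3f>& ledModelPoints, // LED layout on the headset
                const std::vector<cv::Point2f>& ledImagePoints, // matched 2D detections
                const cv::Mat& cameraMatrix,
                const cv::Mat& distCoeffs,
                cv::Mat& rvec,   // in/out: seeded from the IMU orientation
                cv::Mat& tvec)   // in/out: seeded from the previous frame's position
{
    if (ledImagePoints.size() < 4)
        return false; // too few lights visible this frame; rely on the IMU alone

    // With a good initial guess, the iterative solver converges to the nearby
    // pose instead of a mirrored/ambiguous solution.
    return cv::solvePnP(ledModelPoints, ledImagePoints,
                        cameraMatrix, distCoeffs,
                        rvec, tvec,
                        /*useExtrinsicGuess=*/true,
                        cv::SOLVEPNP_ITERATIVE);
}
```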
Just for the record, does the current implementation use only 1 camera?
Correct. You can connect multiple cameras, but you'll get incorrect tracking data because I haven't ported over the camera pose calibration code from PSMoveService yet. It's part of the config tool that has you place a PSMoveController at 5 locations on a piece of paper and uses SolvePnP to figure out where each camera is relative to the calibration paper. This was the next feature I wanted to port over though.
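For reference, the core of that kind of calibration step is small. A sketch under assumed names (not the PSMoveService code being ported): solvePnP against the known points on the paper gives the paper-to-camera transform, and inverting it places the camera in the paper's coordinate frame, which then serves as the shared frame for all cameras.

```cpp
// Recover a camera's pose relative to the calibration paper ("mat") from known
// 2D/3D correspondences. Names and point counts are illustrative.
#include <opencv2/opencv.hpp>
#include <vector>

bool calibrateCameraPose(const std::vector<cv::Point3f>& matPoints,   // the 5 known locations on the paper
                         const std::vector<cv::Point2f>& imagePoints, // where this camera saw the controller bulb
                         const cv::Mat& cameraMatrix,
                         const cv::Mat& distCoeffs,
                         cv::Mat& cameraToMatRotation,    // out: 3x3
                         cv::Mat& cameraToMatTranslation) // out: 3x1
{
    cv::Mat rvec, tvec;
    if (!cv::solvePnP(matPoints, imagePoints, cameraMatrix, distCoeffs, rvec, tvec))
        return false;

    // solvePnP returns the mat->camera transform; invert it to express the
    // camera's position and orientation in the mat's coordinate frame.
    cv::Mat R;
    cv::Rodrigues(rvec, R);
    cameraToMatRotation    = R.t();
    cameraToMatTranslation = -(R.t() * tvec);
    return true;
}
```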
VR works via the PSVR/PSVRToolkit, but I'm going to have to develop a standalone version of the PSVRToolkit because the author is not uploading new builds, and the current Release sends wrong sensor data over the UDP server feature. I want to enable use of Sony's peripherals on Windows and roll it all into a configurator program that also provides a UI to the various INI files that involved.
If you wanted to stick with PSVRToolkit and port over the optical tracking code I've written in PSVRTracker, you are certainly welcome to. Though I'm not sure if there is a C# version of OpenCV, so that may or may not be a great path forward. I tried to make the client API of PSVRService a C99-style interface so that it would be easy to make a C# wrapper. Not sure...
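Just to illustrate why a C99-style surface helps (these declarations are made up for the example and are not the real PSVRService exports): a flat interface of plain structs and free functions is exactly the shape that C#'s [DllImport] bindings can consume directly.

```cpp
// Purely hypothetical C99-style client interface, for illustration only.
#ifdef __cplusplus
extern "C" {
#endif

typedef struct { float x, y, z; }    ExamplePSVR_Vector3f;
typedef struct { float w, x, y, z; } ExamplePSVR_Quatf;

typedef struct {
    ExamplePSVR_Vector3f position;
    ExamplePSVR_Quatf    orientation;
} ExamplePSVR_Pose;

// Hypothetical entry points; a real wrapper would mirror the actual header.
int  ExamplePSVR_Initialize(void);
int  ExamplePSVR_GetHmdPose(ExamplePSVR_Pose* out_pose);
void ExamplePSVR_Shutdown(void);

#ifdef __cplusplus
}
#endif
```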
One thing I should also stress: PSVRTracker is still very much a work in progress. While tracking kind of works for either a stereo camera (PS4 Camera) or a mono camera (Logitech C920 or PS3 Eye), there is still a bunch of work to do:
And finally, I should mention I'm pretty swamped right now trying to get the new PS4 Move controller (the one with the micro USB connection) working in PSMoveService and getting some stuff done for work. So it might be a few weeks before I can make any meaningful progress on any of the aforementioned tasks. But I'm more than happy to answer questions in the meantime about what I've written so far.
Is Zadig a necessary component? I saw libusb in Device Manager earlier today. I generally installed WinUSB (4 & 5), though I may have tried libusb once (I think it didn't work). But I did not see WinUSB in Device Manager, and so guessed that the libusb entries were WinUSB.
It really sounds like you're on top of everything. I may have been way over my head if I had tried my hand at it. I just imagined that, given an image, I could extract meaningful information from it with relatively little code. In my experience, very direct code--just doing it yourself instead of seeking out off-the-shelf solutions--almost always works out for the best, potentially saves a lot of shopping around, and produces maybe only a small source file instead of a behemoth external dependency. But what's worked in the past isn't necessarily guaranteed to always work. It's just a strategy for when I cannot find elegant solutions to software problems. It is reinforced by the fact that after I do something with mammoth external code, I always begin to have regrets after about a year, until I realize it's time to bite the bullet and implement its function from scratch. That always works out for the best, and it's much easier to sleep afterward.
With graphics, I've yet to see a problem that is actually complicated. Graphics look impressive, but they are really very simple problems that just involve little operations performed millions or billions of times; at their heart they are child's play. Compare that to some very mundane problem domains that seem like bookkeeping or UI: those are very personalized problems, so they seem outwardly simple, but they are actually very time-demanding to implement, because no one has ever found a way to reduce them. The moral of the story is that you can't judge the complexity of a problem in code by how it appears to end-users.
Here (http://www.swordofmoonlight.net/bbs2/index.php?action=dlattach;topic=286.0;attach=928;image) is a picture I took earlier today. It is a scene from a recreation-in-progress of what I consider to be the best specimen of a 3D story space in existence, from 1995. It looks simple. In fact, too simple with modern--very solid--3D graphics. I'm committed to publishing this port of the game with native PlayStation VR on Windows, so I'm in it for the long haul. 2020 is the 25th anniversary. The original is not well known, so I hope that it will be seen in a new light (if noticed at all) in 2020. Its predecessor, I believe, happens to have been the first title to share all of the qualities of modern 3D video games. It's not lightning in a bottle like part 2, but I think the 3-part series is of historical interest, and it is also prototypical of how we imagined VR games in the 90s. 2020 is a ways away, but unless OpenXR comes along and changes the landscape, I don't see myself looking at other avenues for doing VR on Windows. I only began the work of porting this game (by mining the original disc) very recently, this summer, so I hadn't looked at it in VR until tonight. I realized the other day that even if it doesn't feel quite right with modern graphics (somehow I will find a way to compensate... already someone is working on SVG textures), the PSVR's diminutive resolution is very much like the original PlayStation's. So I'm less worried about the VR, other than its side effects.
The software itself is unique. I think it will be the first competent 3D story software comparable to RPG Maker-like products. It's not limited to 1995 presentation; it can do roughly the level of Shadow of the Colossus or King's Field IV, which looks very similar to SOTC. https://www.facebook.com/moratheia/photos/?ref=page_internal is the only other project that I know of using it. Their project uses all original artwork, and they are very crafty, whether with art or music. But I worry their lifestyle will not allow them to ever finish/publish their work.
Setting aside Sony replacing the PSVR in the meantime, 2020 is a good ways off. I hope that this software/port I'm involved with is able to garner attention and be a poster child for VR and an outlet for would-be VR artists. I have faith that tracking will come online for Windows users in the meantime :)
I will definitely be around to follow your work... and lend a hand. I know that all too often developers work without anyone to try their work and provide useful feedback, without which bugs can linger for too long, unnoticed. I will also work on tweaks; I try to perfect my product's experience. If you want to demo this with VR, I am going to upload the project for back-up on my web server before too long. The controls are impeccable; unlike anything I've ever experienced. I hope to do the same for its VR element. It is a well-regarded PlayStation title, even if little known.
P.S. I think a "Sony Peripherals for Windows" project would be very useful, whether your objective here is to eventually roll this back into PSMoveService or not. I think it would be a good thing, if possible, to develop it into something that is like PSVRToolbox, but also with tracking. It might be putting all of the eggs into one basket, but I think there could easily be one Control Panel-style applet/driver framework to directly interface with Sony's peripherals. The PSVR especially continues to impress, but the PlayStation itself is an albatross. (Hopefully if it/the console model dies--as it should--Sony keeps making its peripherals!)
Update
Here (https://www.reddit.com/r/KingsField/comments/9kj984/very_early_playable_section_of_kf2_for_windows/) is a quasi-demo I've pieced together since we last spoke. Direct download: http://www.swordofmoonlight.net/holy/KING'S%20FIELD%20II.zip
You may or may not be interested in it because it uses PlayStation VR as a standalone product, and it has what I believe to be much better anti-drift characteristics than what is commonly thought to be possible.
I've mainly been occupied with importing/exporting everything into Blender and back out into textured level-geometry squares, since the data comes from a PlayStation game with a very exotic addressing scheme that requires some manual intervention; read: a lot of time spent doing clerical work.
In the meantime, I quickly developed a hybrid system that blends two side-by-side Madgwick integrators (mainly copied from existing sources): one is optimized for its anti-drift quality, and the other for its picture stability. When your head isn't moving (imperceptible movements), the driftier but more stable estimate is shown, until it drifts too far from the other's better estimate, at which point some fudging is required to reel it back into the orbit of the more chaotic/correct approximation.
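In case it's useful to anyone reading, the blending rule can be expressed in a few lines. This is only a sketch of the scheme described above, with placeholder thresholds and correction rates rather than the demo's actual values:

```cpp
// Show the smoother (but driftier) estimate while the head is effectively
// still; reel it back toward the higher-beta (drift-free but jittery) estimate
// whenever the two diverge too far.
#include <cmath>

struct Quat { float w, x, y, z; };

static float quatDot(const Quat& a, const Quat& b)
{
    return a.w*b.w + a.x*b.x + a.y*b.y + a.z*b.z;
}

// Normalized linear interpolation from 'from' toward 'target' by factor t.
static Quat nlerp(const Quat& from, const Quat& target, float t)
{
    const float sign = quatDot(from, target) < 0.0f ? -1.0f : 1.0f; // shorter arc
    Quat q {
        from.w + t * (sign * target.w - from.w),
        from.x + t * (sign * target.x - from.x),
        from.y + t * (sign * target.y - from.y),
        from.z + t * (sign * target.z - from.z)
    };
    const float n = std::sqrt(quatDot(q, q));
    q.w /= n; q.x /= n; q.y /= n; q.z /= n;
    return q;
}

Quat blendEstimates(const Quat& stable,    // low-beta integrator: smooth picture, drifts
                    const Quat& antiDrift) // high-beta integrator: chaotic picture, no drift
{
    // If the two estimates agree closely, just show the smooth one.
    const float divergence = 1.0f - std::fabs(quatDot(stable, antiDrift));
    if (divergence < 0.0005f) // placeholder tolerance
        return stable;

    // Otherwise pull the smooth estimate back toward the drift-free one.
    return nlerp(stable, antiDrift, 0.1f); // placeholder correction rate
}
```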
Other than that, I believe the technology demo is close to optimal for the PSVR in terms of the geometry of the simulated 3D space. I have some doubts about the perception of depth; however, I believe the settings are correct, except I suspect it's really not appealing to focus the binocular vision on a theoretical point infinitely far away (i.e. eyeballs straight ahead). As a result, I have lessened the separation effect until I can experiment with picking a distance to focus on. (Technically, the green caves in the demo look like a Magic Eye image, and the angular cave walls look like flat teeth, using naive separation settings: that is, separating the view volumes by the same distance as between my eyes.)
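On picking a focus distance: one standard approach (a generic sketch of "off-axis" stereo, not the demo's code) keeps the two eye frusta parallel, offsets each eye by half the separation, and shifts each eye's projection window so the images converge at a chosen distance instead of at infinity:

```cpp
// Off-axis stereo setup: per-eye view offset plus a projection-window shift
// derived from a chosen convergence distance. Values are placeholders.
struct EyeSetup {
    float viewOffsetX;   // horizontal offset applied to the eye's view transform (meters)
    float frustumShiftX; // horizontal shift applied to the projection window at the near plane
};

EyeSetup computeEye(bool leftEye,
                    float eyeSeparation,       // full separation, e.g. ~0.063f, or reduced as described
                    float convergenceDistance, // distance at which the two images should fuse
                    float nearPlane)
{
    const float side = leftEye ? -1.0f : +1.0f;

    EyeSetup eye;
    eye.viewOffsetX = side * 0.5f * eyeSeparation;
    // Shift each eye's projection window so the frusta cross at the convergence
    // plane. As convergenceDistance goes to infinity the shift goes to zero,
    // which is the "eyeballs straight ahead" case described above.
    eye.frustumShiftX = -side * 0.5f * eyeSeparation * nearPlane / convergenceDistance;
    return eye;
}
```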
P.S. I plan to dig into PSVRTracker before too long. I plan to not take on another single-minded project for at least a few weeks. If I don't get into VR in this period, I will before too long, surely before year's end.
Hey... reviving this old topic. I remembered you recommended Zadig here. And provided helpful instructions!
Do you think that's why (https://github.com/HipsterSloth/PSVRTracker/issues/19) I can't get the PSVR to appear? I am pretty sure I have these exact drivers installed by PSVRToolkit's installer. Are the ones in the drivers folder not "signed"? Why is Zadig needed? Sorry.
(EDITED: I don't know if I've ever seen the source-code for these drivers. Is it in the respective projects? Or are they just perfunctory things that tie into Lib-usb?)
PS3 Eye camera setup was a piece of cake, thanks to the IPIsoft PSEYE Driver option (indirectly) recommended here (https://github.com/HipsterSloth/PSVRTracker/wiki/Virtual-Stereo-Camera-Setup).
Although, I'm beginning to wonder what exactly your wiki means by "virtual" since it seems to mean different things in the "Virtual HMD" case than it means in the "Virtual Stereo Camera" case. One is real cameras, right? The other is not a real PSVR? Or is it? Anyway, I can't get the PSVR to appear in PSVRConfigTool. (Nor the DS4.)
I don't know if I've ever seen the source-code for these drivers. Is it in the respective projects? Or are they just perfunctory things that tie into Lib-usb?
Yeah, these are really just simple wrappers for libusb- and WinUSB-based drivers. When assigning a generic USB driver to a USB interface you can either: A) use Zadig to install a libusb or WinUSB wrapper for the device and interface, or B) use libusbK's "USB Inf Creator/Installer" wizard (https://sourceforge.net/projects/libusbk/) to create an installer that does the same thing. That's all the IPIsoft PSEYE Driver really is.
Although, I'm beginning to wonder what exactly your wiki means by "virtual" since it seems to mean different things in the "Virtual HMD" case than it means in the "Virtual Stereo Camera" case.
Virtual HMD = a tracked light that you attach to a real HMD. Another program reads this tracking LED data and feeds it to FreePIE so that people can add positional tracking to HMDs that only have rotational tracking.
Virtual Stereo Camera = faking a real stereo camera with two side-by-side mono cameras. It's not a true stereo camera because the video frames don't arrive in sync. I don't really want to support this mode anymore since it was just used as a stepping stone for true stereo camera support. Instead, I want to support true multi-mono-camera tracking, which involves a calibration step to determine where the cameras are relative to each other.
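For what it's worth, OpenCV has a ready-made routine for the "where are the cameras relative to each other" part when both cameras can see a known pattern. A rough sketch with assumed parameter names, not the planned PSVRTracker implementation:

```cpp
// Recover the rotation R and translation T from camera 1 to camera 2 by showing
// both cameras a chessboard in many poses. Intrinsics are assumed to be
// calibrated already (CALIB_FIX_INTRINSIC).
#include <opencv2/opencv.hpp>
#include <vector>

double calibrateCameraPair(const std::vector<std::vector<cv::Point3f>>& boardPoints, // chessboard corners per view
                           const std::vector<std::vector<cv::Point2f>>& camera1Corners,
                           const std::vector<std::vector<cv::Point2f>>& camera2Corners,
                           cv::Mat& cameraMatrix1, cv::Mat& distCoeffs1,
                           cv::Mat& cameraMatrix2, cv::Mat& distCoeffs2,
                           cv::Size imageSize,
                           cv::Mat& R, cv::Mat& T) // out: camera 2 relative to camera 1
{
    cv::Mat E, F; // essential and fundamental matrices (unused here)
    // Returns the RMS reprojection error of the joint calibration.
    return cv::stereoCalibrate(boardPoints, camera1Corners, camera2Corners,
                               cameraMatrix1, distCoeffs1,
                               cameraMatrix2, distCoeffs2,
                               imageSize, R, T, E, F,
                               cv::CALIB_FIX_INTRINSIC);
}
```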
Hey, I'm trying to add tracking to a standalone product. Can you tell me the lay of the land? I thought this was linked to PSMoveService, but I just now found this project (https://github.com/cboulay/PSMoveService/issues/578).
The PSVRToolbox is dead now. I'm looking at implementing position tracking myself, but it looks like you've already gone the distance, so I'm wondering if there can be a replacement for PSVRToolbox that has tracking.
PSMoveConfigTool has an HMD section, but doesn't see the PSVR. How does its discovery work? Can it be lightweight, like PSVRToolbox?
Does your code work with PS3 cameras? (Some recent commits say PSEye code is removed.) I happen to have two. But don't have a PS4 camera.
In any case, I plan to dig into your code. And maybe put it to other uses. Thanks! Hope to hear from you :)