OSVR / OSVR-General

Catch-all project for issues and information not specific to a single repo

Omnifinity ODT #4

Closed peroht closed 6 years ago

peroht commented 8 years ago

We have recently released our Omnifinity Unity API to allow direct control of a Unity application based on movement on the Omnideck 6 treadmill. Very simply put (although custom), it is built around a set of trackers for viewpoint and body positioning in a 2D plane. This is similar to what you guys have worked out on your own in the locomotion specs; I have given some comments in there recently.

Despite the fact that my HDKs are giving me trouble with black screens, I am trying to learn all about how to align and migrate this with the work done in the OSVR ecosystem. It comes down to the locomotion interface; for games that are not designed for VR (e.g. old FPS games) it is based around joystick/keyboard emulation. Gesture/pose detection would obviously be fed in as well, and we have some algorithms for detecting this (however they are IP). I will read up on the gesture/pose interface you propose.

Any help is appreciated. Generally speaking, making the Omnideck 6 support OSVR is not a hard task, and we will hopefully have this in place before the end of the year, together with you guys.

VRguy commented 8 years ago

Peter,

We'd be very interested in your comments regarding the locomotion spec. Other OSVR partners (Virtuix and Cyberith) are also looking at this. Once a group of vendors creates a standard interface, it will be easier to get content providers to write applications for it.

We should also look at an analysis plugin to convert locomotion data into joystick/keyboard. This will make it even easier to support games.
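Such an analysis plugin would, at its core, map the treadmill's locomotion data onto the value ranges a game expects from a gamepad. A minimal sketch of that mapping step, with made-up parameter values (`maxSpeed`, `deadzone` are hypothetical tuning knobs, not anything from the OSVR spec):

```cpp
#include <algorithm>
#include <cmath>
#include <utility>

// Map a 2D walking velocity (m/s, room coordinates) onto normalized
// joystick axes in [-1, 1]. A small deadzone keeps sensor noise while
// the user stands still from producing character drift.
std::pair<double, double> velocityToJoystick(double vx, double vy,
                                             double maxSpeed = 2.0,
                                             double deadzone = 0.05) {
    double x = std::clamp(vx / maxSpeed, -1.0, 1.0);
    double y = std::clamp(vy / maxSpeed, -1.0, 1.0);
    // Below the deadzone, report a centered stick.
    if (std::hypot(x, y) < deadzone)
        return {0.0, 0.0};
    return {x, y};
}
```

A real plugin would additionally emit the values through the platform's virtual-joystick or key-injection mechanism; this only shows the normalization.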

Yuval


peroht commented 8 years ago

Yes, I agree. We'd be able to adapt our API quite straightforwardly in the two-stage rocket I have planned: 1) VRPN support (almost done); 2) an OSVR build of a demo game (please let me know if you have an FPS game in mind). After step 1 is done I will look more closely at the OSVR way of handling VRPN and the interfaces you have in mind. I'd be willing to push a prototype of this before the end of the year and demo it at our partner outside London, in a city called Milton Keynes.

Cheers,
Peter Thor
Founder of Vicator www.vicator.com
CTO of Omnifinity www.omnifinity.se
Twitter: @peterthor_se, @vicator_labs, @omnifinity1
Skype: peterthor_se


peroht commented 8 years ago

Yuval, so where do you want me to start? Where are the gaps? What are the key requirements that you are lacking? What's the status of the other participants' suggestions? What's the timeline? Do you have a public use case that we can elaborate around? For example: 1) a "pre-VR" FPS game requiring keyboard/mouse/joystick emulation based on an analysis plugin, or 2) a fully fledged VR FPS where body movement naturally moves the character? There will be obvious system-layer hardware algorithms for us and the other treadmills, so the analysis plugin is a good start, if you have not already defined it. (I have had little time to look yet; just thought I'd ask about your general feeling and focus.)

Cheers,
Peter Thor
Founder of Vicator www.vicator.com
CTO of Omnifinity www.omnifinity.se
Twitter: @peterthor_se, @vicator_labs, @omnifinity1
Skype: peterthor_se


russell-taylor commented 8 years ago

One simple way to hook into VRPN would be the vrpn_AnalogFly class, which could take analog values from the treadmill and turn them into tracker values. If that works, you may only need to produce analog values and wrap with that.
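The core idea behind `vrpn_AnalogFly` is rate integration: analog channel values are treated as velocities and accumulated over time into a tracker pose. A minimal standalone illustration of that idea (this is not the actual VRPN implementation, which also handles rotation, scaling, and resets; names and the `gain` parameter are assumptions):

```cpp
// Conceptual sketch of vrpn_AnalogFly-style integration: treadmill
// analog outputs are interpreted as rates and integrated into a pose.
struct FlyPose {
    double x = 0.0, y = 0.0;  // accumulated position, meters

    // analogX/analogY: analog channel values interpreted as m/s.
    // dt: seconds since the last report. gain: hypothetical scale.
    void integrate(double analogX, double analogY, double dt,
                   double gain = 1.0) {
        x += gain * analogX * dt;
        y += gain * analogY * dt;
    }
};
```

In the real class, the mapping from analog channels to translation/rotation axes is set up via `vrpn_Tracker_AnalogFlyParam`, so a device that only produces analog values gets a tracker interface for free.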

peroht commented 8 years ago

Russell, thanks for the comment. Yes, that is a good suggestion. I'll explain our approach, short and long term, to get some more help on which path to take. I've already done what I describe below using a set of regular vrpn_Tracker objects. I guess it makes sense to abstract it as an analog joystick, since that is essentially what the treadmill is; I didn't read closely enough to notice vrpn_AnalogFly, and am still learning VRPN. Any conversion to "pre-VR first-person body" keyboard and mouse emulation is on a separate layer. I assume this is what is called the Analysis (?) layer in OSVR? (No computer access right now, and my memory might fail me.)

So, here goes. Apologies if my English/use of terms is confusing. The position of the human on the treadmill driving the character controller in a simulation is vector based. Tied to this are a number of local trackers that represent other trackables, such as the viewpoint, limbs of the body, and any tool the user is equipped with. These have local absolute values referenced to the outside tracking system. One of the trackers defines the actual character controller position and is combined with some movement algorithms of the treadmill to calculate a global world-coordinate target position. This is then simply used as a velocity vector around which the simulation's character controller does its collision detection.

I'd want to supply absolute/velocity-based values, since physical and virtual movement should correspond. Or do I have to handle normalised joystick values? Would this be fully externalised from OSVR, or how should I go about it so we benefit from the semantic syntax and configuration in OSVR? I'm a bit overwhelmed by the task and the boundaries between VRPN and OSVR, and have only so much time.

Thanks for any help,
Peter


russell-taylor commented 8 years ago

I think I'm getting a better picture of what your complete system looks like. We might want a Skype call to hash things out, but I'll take a stab at it here first.

I would describe your system as having a number of trackable objects within a physical room-space. This room-space happens to be on top of a treadmill, whose purpose is to move that room space in the world (navigate). I would anticipate the treadmill adjusting the room-to-world transformation and the other trackers behaving more like a standard VR set of trackers, calibrated and configured within OSVR to all be in the same room space.
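The split described above can be sketched as a simple transform composition: trackers report poses in room space, and the treadmill only moves the room's origin through the world. A translation-only sketch under that assumption (names are made up; a full implementation would compose orientations with quaternions as well):

```cpp
// Room-to-world composition: the treadmill navigates the room through
// the world, and each tracker's pose stays expressed in room space.
struct Vec3 { double x, y, z; };

Vec3 roomToWorld(const Vec3& roomOrigin, const Vec3& poseInRoom) {
    // World pose = where the treadmill has moved the room to,
    // plus the tracker's offset within the room.
    return { roomOrigin.x + poseInRoom.x,
             roomOrigin.y + poseInRoom.y,
             roomOrigin.z + poseInRoom.z };
}
```

The benefit of this factoring is that the head, limb, and tool trackers never need to know about navigation at all; only the room origin is updated by the treadmill.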

Taking the treadmill first, if you already have a vrpn_Tracker interface for it then it doesn't make sense to redo this in terms of a vrpn_AnalogFly. I'd just leave it as it is.

Taking the other devices, there are a couple of ways to go. One is to put them all into the same vrpn_Tracker object and have them be separate sensors. This is how several existing motion-capture systems work. If there are no buttons, analogs, or other devices on each, this is probably the simplest.

If you have other input devices on some of these, and they each have different ones, you may want to go with the approach that the Fastrak/Isense tracker uses, where you have a base tracker that handles the tracked-only objects that are always present and then have a separate device for each of the optional (and different) devices. It uses a wand and a stylus as optional additional devices, and can have more than one of each.

In any case, the OSVR configuration files will describe several things: (1) the geometric layout of the trackers in space (calibration); (2) aliases describing the purpose of each (/me/head, etc.); and (3) semantics and aliases describing the meanings of the various controller buttons or analog inputs. It provides a semantic meaning for each VRPN device.
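As a concrete illustration of point (2), the OSVR server config's `aliases` section maps semantic paths to device-specific ones. The plugin and device names below are hypothetical placeholders for whatever the Omnideck driver would actually register:

```json
{
    "aliases": {
        "/me/head": "/com_omnifinity_Omnideck/Omnideck/tracker/0",
        "/me/hands/left": "/com_omnifinity_Omnideck/Omnideck/tracker/1"
    }
}
```

Applications then ask for `/me/head` and never need to know which vendor's tracker is behind it.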

peroht commented 8 years ago

Russell, thanks for the reply and the insight it gives.

The treadmill can be driven by a single marker or by multiple markers, depending on the user scenario. I'd assume that this tracker data should be compiled into a tracker with multiple channels, and that the other devices would then be another set of trackers on different ports. Right?

Yes, let's schedule a Skype call to take this further. Do you have any availability before Christmas? Just add me on Skype and we'll set it up. Skype: peterthor_se


rpavlik commented 6 years ago

Not sure what the current status of this is - closing because it seems stale.

I'm now at Collabora, but still involved in VR, OSVR, and the Khronos XR standard, so if there's still appetite, there might still be a way to get things done. (I think the locomotion interface did actually get merged into OSVR, so all the device would need is a driver; but given that it's a new interface type, there isn't much software using it. That would be the advantage of just treating it like a tracker and parenting the user's head and hand trackers to it: transparent utility in any app.)