fabio914 / RealityMixer

Mixed Reality app for iOS
GNU General Public License v2.0

MRC network protocol questions #64

Closed. SanderVocke closed this issue 3 years ago.

SanderVocke commented 3 years ago

Hi there,

First of all: amazing job on this app!

Hope you don't mind me framing this as an "issue", but this seems like the best place to ask this question and keep the information public and easy to find. Maybe it will be useful to others as well.

I have looked at your source code and at that of the Oculus MRC plugin, and I have some questions about the protocol used to talk to the Quest in both calibration and capture mode.

To start, I was wondering how you were able to reverse engineer the calibration payloads that have to be sent to the Quest. There doesn't seem to be anything about this in the open-source Oculus software. Did you use a network sniffer for this?

I noticed that in calibration mode, an XML payload is eventually sent to the Quest containing the calibration result (camera extrinsics and intrinsics). On the other hand, in capture mode there is another type of message sent to the Quest to update the camera extrinsics (position, rotation) only. My questions about this:

- In calibration mode and capture mode, are two different protocols being used? Or does the XML payload also work in capture mode, and the CameraPosePayload also work in calibration mode?
- Is there a payload type similar to the CameraPosePayload, but for intrinsics such as the camera matrix, which can be used in capture mode?

I understand, of course, that you may not know the answers to these, in which case I may experiment a bit. I think it would be really cool if we could describe the whole network protocol surrounding MRC as accurately as we can, so that other projects could easily benefit. For example, I would really dig an application that doesn't use a real camera at all, but just lets a user on a mobile device/laptop manipulate the position and properties of the virtual camera and watch/record the resulting background stream.

Thanks in any case!

fabio914 commented 3 years ago

Did you use a network sniffer for this?

You can instead extract the apk and take a look at how it works.

In calibration mode and capture mode, are two different protocols being used? Or does the XML payload also work in capture mode, and the CameraPosePayload also work in calibration mode?

These are separate protocols. Take a look at the Oculus plugin for OBS to see how the capture protocol is implemented, and at the OVRExternalComposition.cs file from the Oculus Integration package for Unity to see how the external calibration is used by apps and how the foreground and background images are generated.
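For context, the capture stream is essentially a sequence of length-prefixed frames over TCP, where each frame carries a payload type (video data, audio data, dimensions, and so on). Below is a minimal Swift sketch of pulling one frame out of a receive buffer. The header layout assumed here (a little-endian UInt32 payload type followed by a little-endian UInt32 payload length) is a simplification for illustration, and the real stream also includes an identifier prefix, so verify the exact layout against the OBS plugin's source before relying on it.

```swift
import Foundation

// Minimal sketch of reading one capture-stream frame out of a receive buffer.
// The header layout below is an assumption for illustration; check the OBS
// plugin's frame parsing code for the authoritative format.
struct CaptureFrameHeader {
    let payloadType: UInt32   // e.g. video data, audio data, dimensions...
    let payloadLength: UInt32 // number of payload bytes that follow the header
}

enum CaptureStreamError: Error {
    case needMoreData
}

/// Tries to read a single frame from the front of `buffer`.
/// Returns the header, the payload bytes, and how many bytes were consumed.
func readCaptureFrame(from buffer: Data) throws -> (header: CaptureFrameHeader, payload: Data, consumed: Int) {
    func uint32LE(at offset: Int) -> UInt32 {
        let i = buffer.startIndex + offset
        return UInt32(buffer[i])
            | (UInt32(buffer[i + 1]) << 8)
            | (UInt32(buffer[i + 2]) << 16)
            | (UInt32(buffer[i + 3]) << 24)
    }

    let headerSize = 8
    guard buffer.count >= headerSize else { throw CaptureStreamError.needMoreData }

    let header = CaptureFrameHeader(payloadType: uint32LE(at: 0), payloadLength: uint32LE(at: 4))
    let total = headerSize + Int(header.payloadLength)
    guard buffer.count >= total else { throw CaptureStreamError.needMoreData }

    let payload = buffer.subdata(in: (buffer.startIndex + headerSize) ..< (buffer.startIndex + total))
    return (header, payload, total)
}
```

A receiver would append incoming bytes to a buffer, call this in a loop until it reports that it needs more data, and dispatch each payload by type (e.g. compressed video or audio).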

This CameraPosePayload is my own and isn't part of the capture protocol; it's there so I can have a "moving camera" while capturing. Here are some examples: Open Brush, Cubism, PecoPeco. There's an example CameraTestServer file in this repository that shows how to add this functionality to a Unity app that uses the Oculus Integration package. Keep in mind that this is very rudimentary.
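For the sending side of that "moving camera" channel, here is a rough Swift sketch of what an iOS client could do. The port, the plain 7-float layout (position xyz followed by a quaternion xyzw), and the absence of a header are placeholder assumptions; match whatever the CameraTestServer (or your own receiver) actually expects.

```swift
import Foundation
import Network
import simd

// Rough sketch of an iOS-side sender for "moving camera" pose updates.
// The port, the plain 7-float payload (position xyz + quaternion xyzw) and
// the absence of a header are placeholder assumptions for illustration only.
final class CameraPoseSender {
    private let connection: NWConnection

    init(host: String, port: UInt16) {
        connection = NWConnection(
            host: NWEndpoint.Host(host),
            port: NWEndpoint.Port(rawValue: port)!,
            using: .tcp
        )
        connection.start(queue: .main)
    }

    /// Encodes the pose as little-endian Float32s and sends it.
    func send(position: SIMD3<Float>, rotation: simd_quatf) {
        var payload = Data()
        let values: [Float] = [
            position.x, position.y, position.z,
            rotation.vector.x, rotation.vector.y, rotation.vector.z, rotation.vector.w
        ]
        for value in values {
            let bits = value.bitPattern.littleEndian
            withUnsafeBytes(of: bits) { payload.append(contentsOf: $0) }
        }
        connection.send(content: payload, completion: .contentProcessed { _ in })
    }
}
```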

This OVRExternalComposition.cs file also has some hints about a possible "moving camera" as part of the capture protocol, but I don't think this was implemented.

Is there a payload type similar to the CameraPosePayload, but for intrinsics such as camera matrix, which can be used in capture mode?

You'd need to implement this yourself, and then implement something in your Quest app to receive these updates and override the calibration at runtime. It'd be somewhat similar to how the CameraPosePayload and the CameraTestServer work, but you'd be overriding the intrinsics instead of the extrinsics.
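As a purely hypothetical illustration of what such a payload could look like (nothing like this exists in the current protocol, and the field set and encoding here are my own assumptions), the Quest app would need matching code to decode it and apply the new intrinsics:

```swift
import Foundation

// Purely hypothetical sketch of an "intrinsics update" payload, analogous to
// the pose payload above. Nothing in the existing protocol defines this; the
// Quest app would need matching code to decode it and override its calibration.
struct CameraIntrinsicsPayload {
    var imageWidth: Float
    var imageHeight: Float
    var fx: Float   // focal length x (pixels)
    var fy: Float   // focal length y (pixels)
    var cx: Float   // principal point x (pixels)
    var cy: Float   // principal point y (pixels)

    /// Serializes the fields as little-endian Float32s, in declaration order.
    func encoded() -> Data {
        var data = Data()
        for value in [imageWidth, imageHeight, fx, fy, cx, cy] {
            let bits = value.bitPattern.littleEndian
            withUnsafeBytes(of: bits) { data.append(contentsOf: $0) }
        }
        return data
    }
}
```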

I think it would be really cool if we could describe the whole network protocol surrounding MRC as accurately as we can.

This blog post from Oculus has some details about how the capture protocol works.

For example, I would really dig an application that uses this without any real camera at all, but just allowing a user on a mobile device/laptop to manipulate the position and properties of the virtual camera and watch/record the resulting background stream.

This is totally doable. You can even do it with OBS and the Oculus plugin itself (just remove the camera source from the OBS scene); however, you'd need your own mechanism to send and update the external camera position. You can take a look at @jonathanperret's mrc-client as an example if you're thinking of writing your own app instead of using OBS.
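To make the "no real camera" idea concrete, here is a toy driver that reuses the hypothetical CameraPoseSender sketched earlier: it orbits a virtual camera around the origin and streams a pose update roughly 30 times per second. The host, port, and motion are placeholders; a real app would drive the pose from UI controls instead.

```swift
import Foundation
import simd

// Toy "virtual camera" loop: no physical camera involved, just a pose that
// changes over time and gets pushed to the headset via CameraPoseSender.
// Host, port and the orbit path are placeholders.
let sender = CameraPoseSender(host: "192.168.0.2", port: 1234)
var angle: Float = 0

let timer = Timer.scheduledTimer(withTimeInterval: 1.0 / 30.0, repeats: true) { _ in
    angle += 0.01
    let position = SIMD3<Float>(2 * cos(angle), 1.5, 2 * sin(angle))
    // Rotate around the Y axis so the camera keeps facing roughly inward.
    let rotation = simd_quatf(angle: -angle, axis: SIMD3<Float>(0, 1, 0))
    sender.send(position: position, rotation: rotation)
}
timer.tolerance = 1.0 / 120.0

RunLoop.main.run()
```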

SanderVocke commented 3 years ago

Thanks a bunch for the detailed answer!