MicrosoftDocs / mixed-reality

Mixed Reality documentation

Perception Simulation for HoloLens 2 ("HTTP request failed: CSRF Token Invalid") #528

Open LimingRenPA opened 2 years ago

LimingRenPA commented 2 years ago

Background info: We are exploring how to create some automated UI tests for our HoloLens 2 app.

I have a HoloLens 2 device connected to my PC (fully updated Windows 10 Pro), and Developer Mode is enabled on the HoloLens.

The Device Portal over HTTPS works fine after downloading the HoloLens certificate (see the attached screenshot). However, with the correct HoloLens user name and password,

RestSimulationStreamSink.Create

returns the exception {"HTTP request failed: CSRF Token Invalid"}

(Please scroll down to see the screenshots.)

My questions: (1) Does Perception Simulation work with a real HoloLens 2? (2) Is Perception Simulation a good way to create automated HoloLens UI tests? (3) If yes, what did we do wrong?

Sincerely,

[screenshots]


LimingRenPA commented 2 years ago

Also, RestSimulationStreamSink.Create succeeds against the HoloLens 2 Emulator on this same machine.

pbarnettms commented 2 years ago

Hello,

To answer your questions:

  1. Yes! Perception Simulation works with both a physical HoloLens 2 and the HoloLens 2 Emulator.
  2. Yes! It's a great way to create automated tests.
  3. It isn't yet clear where the problem is. Does it work if you specify http instead of https in the connection URI in your code? Also, which IP address are you using -- the device's WiFi address or the UsbNcm address? Does it work if you specify 127.0.0.1:10080 (assuming you have the USB device support component of Visual Studio installed)? See the sketch below for the connection variants.
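
For reference, here is a minimal sketch of those connection variants (a rough example rather than the exact fix; the address, user name, and password are placeholders to replace with your own values):

// Over USB, 127.0.0.1:10080 is forwarded to the device when the Visual Studio
// USB device support component is installed; over WiFi, use the device's IP address.
var token = new System.Threading.CancellationToken();
var uri = new Uri("http://127.0.0.1:10080");          // or "https://<device-ip>" over WiFi
var sink = await RestSimulationStreamSink.Create(
    uri,
    new System.Net.NetworkCredential("<user>", "<password>"),   // Device Portal credentials (placeholders)
    true,     // normal priority
    token);
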
LimingRenPA commented 2 years ago

(1) The call works with "http://127.0.0.1:10080/" if the device is connected to my PC using a cable.

(2) I used the IP address from the HoloLens Device Portal URL, which is the same IP shown in HoloLens Settings -> Wi-Fi -> Advanced options -> IPv4 address.
(3) HTTP gives the error "HTTP request failed: Temporary Redirect".

I get the same error using either the UsbNcm or the Wi-Fi address (Device Portal -> System -> Networking): [screenshot]

pbarnettms commented 2 years ago

Thank you for the additional details. I am investigating. This is expected to work regardless of WiFi or USB. I hope to have more information in the next 1-2 days.

pbarnettms commented 2 years ago

Just a quick update: I'm continuing to investigate. It appears that when using https, the CSRF token is not being captured by the client, so it is not included in subsequent requests, which results in the error. The token cookie itself hasn't changed -- it is still provided by the server, and it continues to be captured by our native C++ code, but the managed client appears to lose it. This could be a security-related change in the .NET Framework. I'm hoping to come up with a trivial workaround but don't have one yet.

LimingRenPA commented 2 years ago

Another issue: after connecting to my REAL HoloLens 2 using "http://127.0.0.1:10080" and running some code to move the right hand, the HoloLens device stops responding to any hand gestures (even after a reboot).

My code:

var protocol = "http";
var emulatoreIpaddress = "127.0.0.1:10080";

var uri = new Uri($"{protocol}://{emulatoreIpaddress}");
sink = await RestSimulationStreamSink.Create(
    // the address of the device (forwarded over USB)
    uri,
    // device credentials (placeholders here)
    new System.Net.NetworkCredential("???", "????"),
    // normal priority
    true,
    // cancel token
    token);

var manager = PerceptionSimulationManager.CreatePerceptionSimulationManager(sink);
manager.Reset();

// Activate the right hand and move it slightly along the x-axis
manager.Human.RightHand.Activated = true;
var human = (ISimulatedHuman2)manager.Human;
var rightHand = (ISimulatedHand2)(human.RightHand);
float x = 0.05f;
float y = 0;
float z = 0;
rightHand.Move(new Vector3(x, y, z));
Thread.Sleep(2000);

and this is what I see, but I cannot enter the PIN (the HoloLens isn't responding to any hand movements).

[screenshot]

Thanks,

pbarnettms commented 2 years ago

This suggests that you left the device in Simulation mode. Your code should restore the original Control Mode upon exit. To manually change it, go to the Simulation page in Device Portal and set the Control Mode dropdown to 'Default'.
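
For example, here is a minimal sketch of that cleanup in an automated test (assuming the same uri and credentials as above; the Close call follows the shape of the docs sample, so verify the exact cleanup member on RestSimulationStreamSink in your installed assembly):

RestSimulationStreamSink sink = null;
var token = new System.Threading.CancellationToken();
try
{
    sink = await RestSimulationStreamSink.Create(uri, credentials, true, token);
    var manager = PerceptionSimulationManager.CreatePerceptionSimulationManager(sink);
    // ... drive the simulated hands/head here ...
}
finally
{
    // Tear the connection down even if the test throws, so the device
    // isn't left stranded in Simulation mode.
    if (sink != null)
    {
        await sink.Close(token);   // assumption: Close is the cleanup call used in the docs sample
    }
}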

LimingRenPA commented 2 years ago

You are right, thanks

LimingRenPA commented 2 years ago

We are seeing exceptions after a few tries (the same code, plus a finger press and release, against a real HoloLens 2 device).

(0) The device is in Default mode. (1) RestSimulationStreamSink.Create works. (2) Then any subsequent call generates an exception as in the picture below. The exception message is

{"Exception of type 'System.Exception' was thrown."}

and no inner exception.

[screenshot]

The device stays in this state (an exception is thrown whenever Perception Simulation code is used), but I can still use the device normally when it is back in Default control mode.

Let me know if you need anything.

[screenshot]

typride commented 2 years ago

@LimingRenPA any updates on your end? Were you able to resolve this issue on your own, or do you still need further assistance?

LimingRenPA commented 2 years ago

Nothing has changed on my side, thanks.

felixshing commented 1 year ago

I also faced the connection issue when trying to use Perception Simulation with my real HoloLens 2. I solved it this way:

I disabled the "SSL connection required" option in the Preferences page of the Windows Device Portal and then used an HTTP connection instead of HTTPS.

Using a USB connection also works; just change the URL to "http://127.0.0.1:10080/" as in the previous discussion. Hope this helps.

pbarnettms commented 1 year ago

This issue is addressed in the preview release: https://learn.microsoft.com/en-us/windows/mixed-reality/develop/advanced-concepts/perception-simulation-standalone

felixshing commented 1 year ago

This issue is addressed in the preview release: https://learn.microsoft.com/en-us/windows/mixed-reality/develop/advanced-concepts/perception-simulation-standalone

Yeah, I also tried the preview release and it works fine, but it looks like it cannot meet my requirement. I want to write an autotest tool for my application, which needs head movement and gaze movement in the headset. The preview release seems to only control movement via the keyboard and mouse; it can neither record the actual movement nor control movement through code. Am I right, or can the preview release actually do that and I missed it?

pbarnettms commented 1 year ago

The preview release has everything that's in the existing product and adds more - it does not remove anything. You can still control head, hands, eyes, controllers, etc. via API calls or manual input.

felixshing commented 1 year ago

The preview release has everything that's in the existing product and adds more - it does not remove anything. You can still control head, hands, eyes, controllers, etc. via API calls or manual input.

Sounds great! Can I record my physical movement while wearing the headset and play it back? I find the preview can only record keyboard- and mouse-controlled movement, but cannot record my physical movement when I wear the headset.

pbarnettms commented 1 year ago

Recording using physical headset movements is not enabled in the current preview build. Please stay tuned!

felixshing commented 1 year ago

Recording using physical headset movements is not enabled in the current preview build. Please stay tuned!

Thanks. So if I want to play back my physical movement, I should collect the sensor data first, then convert it into the Perception Simulation format to play back my movement. Is that right?

We will collect the head 6DoF pose (x, y, z, yaw, pitch, roll), gaze position (x, y, z), and gaze direction (x, y, z). Can I replay this movement using Perception Simulation?

pbarnettms commented 1 year ago

That is correct - you can capture the data yourself and create a Simulation recording that will play back. You can create the recording using the Simulation APIs - PerceptionSimulationManager.CreatePerceptionSimulationRecording to create the recording, then use the movement APIs and pass in the data from your capture. The resulting .xef file can be played back using either the user interface or PerceptionSimulationManager.LoadPerceptionSimulationRecording.
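
A rough sketch of the recording side (the path and the captured-data loop are illustrative placeholders, and the API names are the ones listed in the Perception Simulation overview -- verify the exact signatures against your installed assembly):

// Create a sink that writes a simulation recording (.xef) instead of streaming
// to a device, then drive it from your captured samples.
var recordingSink = PerceptionSimulationManager.CreatePerceptionSimulationRecording(@"C:\data\capture.xef");  // path is a placeholder
var manager = PerceptionSimulationManager.CreatePerceptionSimulationManager(recordingSink);

// 'capturedSamples' and its members are hypothetical -- substitute your own capture format.
foreach (var sample in capturedSamples)
{
    manager.Human.Move(sample.PositionDelta);               // relative body translation (Vector3)
    manager.Human.Head.Rotate(sample.HeadRotationDelta);    // relative head rotation (Rotation3)
    System.Threading.Thread.Sleep(sample.IntervalMilliseconds);  // preserve the original sample timing
}

The resulting .xef file can then be played back from the Simulation page in Device Portal or via PerceptionSimulationManager.LoadPerceptionSimulationRecording, as noted above.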

felixshing commented 1 year ago

That is correct - you can capture the data yourself and create a Simulation recording that will play back. You can create the recording using the Simulation APIs - PerceptionSimulationManager.CreatePerceptionSimulationRecording to create the recording, then use the movement APIs and pass in the data from your capture. The resulting .xef file can be played back using either the user interface or PerceptionSimulationManager.LoadPerceptionSimulationRecording.

Thanks! I am a newbie of C# and am confused about how to call the API. I am looking at this API documentation: https://learn.microsoft.com/en-us/windows/mixed-reality/develop/advanced-concepts/perception-simulation-overview

Now I want to use ISimulatedEyes to get/set gaze data. ISimulatedEyes is exposed on ISimulatedHead2, and the documentation says "Additional properties are available by casting an ISimulatedHead to ISimulatedHead2".

But I don't know how to cast ISimulatedHead to ISimulatedHead2. I am using the example code in https://learn.microsoft.com/en-us/windows/mixed-reality/develop/advanced-concepts/perception-simulation-overview, and that code uses ISimulatedHead only in this way:

IPerceptionSimulationManager manager = PerceptionSimulationManager.CreatePerceptionSimulationManager(sink);

manager.Human.Head.Rotate(new Rotation3(0.04f, 0.0f, 0.0f));

Where should I cast ISimulatedHead to ISimulatedHead2, and how do I do that?

I just tried the following code and it looks like it works, but I cannot perceive any gaze movement when wearing the headset. Could you help me find the reason?

// Cast the head to ISimulatedHead2 to reach the simulated eyes, then rotate them.
ISimulatedHead head = manager.Human.Head;
ISimulatedHead2 head2 = head as ISimulatedHead2;
ISimulatedEyes eyes = head2.Eyes;

eyes.Rotate(new Rotation3(0.04f, 0.17f, 0.88f));
Thread.Sleep(2000);
eyes.Rotate(new Rotation3(0.94f, 0.0f, 0.17f));
Thread.Sleep(2000);
eyes.Rotate(new Rotation3(0.24f, 0.3f, 0.17f));
Thread.Sleep(2000);
eyes.Rotate(new Rotation3(0.1f, 0.4f, 0.17f));
Thread.Sleep(2000);

pbarnettms commented 1 year ago

There is no visual indicator in the headset when moving the eyes unless the application you're running on the headset has visual feedback of eye movement (i.e., the system has no built-in eye gaze indicator).

Also note that if you'd prefer to use C++, the preview version of Perception Simulation now includes headers and libs for C++ and the documentation page includes a code sample in C++. All functionality available in C# is available in C++. (The C# API is basically a wrapper over the C++ API.)

felixshing commented 1 year ago

There is no visual indicator in the headset when moving the eyes unless the application you're running on the headset has visual feedback of eye movement (i.e., the system has no built-in eye gaze indicator).

Also note that if you'd prefer to use C++, the preview version of Perception Simulation now includes headers and libs for C++ and the documentation page includes a code sample in C++. All functionality available in C# is available in C++. (The C# API is basically a wrapper over the C++ API.)

Thank you very much! I would like to ask whether it is possible to control the head and eye position. I notice that the ISimulatedHead and ISimulatedEyes interfaces only let us set the rotation, not the position, but we want to control the 6DoF movement of the head and gaze. How can we do that?

pbarnettms commented 1 year ago

The eyes are attached to the head and the head to the body, so if you move the ISimulatedHuman (Manager.Human.Move()), the head and eyes move with it in the world. You cannot detach the eyes from the head or the head from the body just as with a real living human.
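
For example, a small sketch (assuming a connected manager; the values are arbitrary):

// Translate the whole simulated human; the head and eyes ride along with the body.
manager.Human.Move(new Vector3(0.0f, 0.0f, 0.5f));

// Rotations are then applied on top of that body position.
manager.Human.Head.Rotate(new Rotation3(0.0f, 0.2f, 0.0f));
((ISimulatedHead2)manager.Human.Head).Eyes.Rotate(new Rotation3(0.05f, 0.0f, 0.0f));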

pbarnettms commented 1 year ago

Note that if you are looking to modify the human characteristics, such as changing the distance between the eyes in order to change the stereo rendering parameters, you can do that via ISimulatedDevice2: (Manager.Device as ISimulatedDevice2).DisplayConfiguration.
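
A minimal sketch of that cast (the specific members of DisplayConfiguration aren't shown here -- check the installed Perception Simulation headers for the exact fields):

// Reach the display/stereo rendering configuration through ISimulatedDevice2.
var device2 = (ISimulatedDevice2)manager.Device;
var displayConfig = device2.DisplayConfiguration;
// Modify the members exposed by displayConfig here (for example, the eye separation),
// according to the API surface in your installed headers.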