OpenKinect / libfreenect2

Open source drivers for the Kinect for Windows v2 device

kinect timing #869

Open timprepscius opened 7 years ago

timprepscius commented 7 years ago

This is really a question, not an issue.

Does anyone know if the Kinect frame rate is determined internally (inside the Kinect), or is it determined somehow via interactions with the computer?

Here is what I'm seeing. It is super confusing for me, and sort of difficult to even describe.

I have two computers: one with near-zero clock skew, and one with noticeable clock skew. I measure this skew by comparing std::chrono::steady_clock against measurements taken from a time server. The steady_clock on the skewed computer doesn't align with reality (it drifts). I also measure the Kinect frame rate using the same steady_clock.

The Kinect, on both computers, is coming in at 30 fps.
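Roughly, the measurement side looks like this (just a sketch; `FrameRateMeter` and `onFrameAt` are made-up names, and in the real code the timestamp is taken inside the libfreenect2 frame listener callback):

```cpp
#include <chrono>
#include <vector>

// Sketch of the measurement (FrameRateMeter/onFrameAt are made-up names):
// timestamp each incoming frame with std::chrono::steady_clock and derive
// the observed frame rate from the span between first and last timestamp.
struct FrameRateMeter {
    using clock = std::chrono::steady_clock;
    std::vector<clock::time_point> stamps;

    // In the real code this would be called from the frame listener callback.
    void onFrame() { stamps.push_back(clock::now()); }

    // Injectable variant, handy for checking the math with synthetic stamps.
    void onFrameAt(clock::time_point t) { stamps.push_back(t); }

    double fps() const {
        if (stamps.size() < 2) return 0.0;
        std::chrono::duration<double> span = stamps.back() - stamps.front();
        return (stamps.size() - 1) / span.count();
    }
};
```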

At this point I believe there is a problem with my time synchronization code; I just need to stare at it more. But I thought I would ask whether the Kinect is somehow being told to run at 30 fps by the computer it is connected to.

Clock skew of computers (orange: good computer, blue: bad computer):

[chart: time-blue-silver-osx-orange-elitebook-osx]

Both computers show the Kinect infrared at almost exactly 30 fps.

xlz commented 7 years ago

A frame is probably triggered by the USB clock. So:

> or is determined somehow via interactions with the computer.

> clock skew

Install ntpd. It usually gives sub-msec accuracy.

timprepscius commented 7 years ago

Ok, I've been reading tons about USB packets host/device protocol.

(If anyone else is interested, here is what I'm reading: http://www.usbmadesimple.co.uk/ums_6.htm http://www.beyondlogic.org/usbnutshell/usb3.shtml http://wiki.osdev.org/Universal_Serial_Bus#Function.2FHost_Response_Circumstances https://en.wikipedia.org/wiki/USB https://en.wikipedia.org/wiki/Spread_spectrum )

Now lots more things make sense to me, especially the packet numbers from the Kinect. It seems those packet numbers are actually USB frame/microframe numbers. The USB spec declares that a microframe has an interval of 125 nanoseconds.

This is probably obvious to others, but wasn't to me.

So I have a couple questions I'm searching for, haven't found:

  1. Is there an interrupt/packet sent to the Kinect which tells it when to start the next picture/transfer?

I think not, but I wonder?

  2. Is there any way in which the USB clock frequency could be tied to the monotonic system clock of the computer (i.e. std::chrono::high_resolution_clock or a monotonic clock)?

I'm trying to figure out how the Kinect's 30 fps seems tied to the high_resolution_clock of the computer, even when that high_resolution_clock experiences large amounts of drift (10 milliseconds per 6 minutes).

So far no docs say where the clock frequency of the USB host comes from. Is it independent from the rest of the system hardware, or is it driven by the main system somehow?

I believe, but I'm not totally sure, that NTP will only change the wall clock of the system, and not the underlying monotonic clock, to which the Kinect seems to be tied. I will be investigating more thoroughly tonight.

-tim

xlz commented 7 years ago

You are headed in the right direction but treading deep water now. Also, it's 125 microseconds, not nanoseconds.
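Concretely (a sketch, nothing Kinect-specific; the 125 µs constant is from the USB 2.0 spec for high-speed microframes):

```cpp
// USB 2.0 high-speed bus time is divided into 125 µs microframes, so a
// microframe counter converts to seconds like this.
constexpr double kMicroframeSeconds = 125e-6;

inline double microframesToSeconds(long n) {
    return n * kMicroframeSeconds;
}
```

At 30 fps, one frame then spans about (1/30) / 125e-6 ≈ 267 microframes.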

To start, you need to stop referring to std::chrono, which is an opaque abstraction (I have no idea what it actually maps to), and start using clock_gettime() with CLOCK_REALTIME and CLOCK_MONOTONIC. Windows and Mac have different timing systems, so measurements across them aren't comparable.
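For example, on Linux (a sketch):

```cpp
#include <ctime>

// Sketch: read the POSIX clocks directly. CLOCK_REALTIME is the wall clock
// that NTP disciplines (it can be stepped); CLOCK_MONOTONIC counts from an
// arbitrary point and is never stepped, though on Linux NTP may still slew
// its rate slightly.
inline double readClock(clockid_t id) {
    timespec ts;
    clock_gettime(id, &ts);
    return ts.tv_sec + ts.tv_nsec / 1e9;
}
```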

I believe the USB controller usually includes a clock source of its own, but perhaps Linux's USB stack can make some adjustment to it?

floe commented 7 years ago

If you have 10 ms of drift in 6 minutes, that is about 27.8 µs of drift per second. One frame takes 33.3 ms, so the drift per second is less than 1/1000 of a frame duration, and about 1/5 of a USB microframe. I doubt that you will ever be able to consistently measure that sort of microscopic timing jitter.
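The arithmetic, spelled out:

```cpp
// 10 ms of drift over 6 minutes, versus a 30 fps frame period and a
// 125 µs USB high-speed microframe.
constexpr double driftPerSecond = 10e-3 / (6.0 * 60.0); // ≈ 27.8 µs/s
constexpr double frameDuration  = 1.0 / 30.0;           // ≈ 33.3 ms
constexpr double usbMicroframe  = 125e-6;               // 125 µs
```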

timprepscius commented 7 years ago

Yeah. It means I will have to resynchronize the system every 5 minutes or so to get the IR cameras time-synced the way I want them.

Sort of lame. But not too lame. :-/

It would be cool if I could tell the USB controller to introduce a 1-microframe lag every X microframes. I could calculate the drift over X minutes, then calculate how many frames I need to lag every Y frames, and evenly distribute the lag. This way I wouldn't need to change any system timing, just skip a frame request from time to time.

I wonder if this would even work. Anyhow, it seems like a horrible kludge from any angle.
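If I were to try it, the "evenly distribute the lag" part could be a Bresenham-style error accumulator (purely hypothetical, names made up):

```cpp
// Hypothetical sketch (names made up): absorb `drift` skipped slots evenly
// across `period` microframes using a Bresenham-style error accumulator,
// so the skips are spread uniformly instead of bunched together.
struct LagScheduler {
    long drift;    // how many microframe slots to skip
    long period;   // over this many microframes
    long acc = 0;  // running error term

    LagScheduler(long d, long n) : drift(d), period(n) {}

    // Call once per microframe; returns true when this one should be skipped.
    bool skipThisSlot() {
        acc += drift;
        if (acc >= period) {
            acc -= period;
            return true;
        }
        return false;
    }
};
```

Over any window of `period` calls it skips exactly `drift` slots, spaced as evenly as integer arithmetic allows.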