andymanic / OSRTT

OSRTT (Open Source Response time tool) is a complete LCD monitor response time testing solution made as easy and accessible as possible.
https://andymanic.github.io/OSRTTDocs/

Input Lag #3

Closed alex-i-1 closed 1 year ago

alex-i-1 commented 1 year ago

Input Lag - what is it? There is no info about it at https://andymanic.github.io/OSRTTDocs/. As I understand it, the measurement could begin when:

1. data transmission through the cable begins, or
2. data transmission through the cable ends.

And it could end at:

1. the 10% point of the visual change,
2. the 80% point, or
3. the 90% point.

Or wherever it actually begins and ends. Googling "Click to Photon Latency" didn't help either. A single photon? A bunch of photons? That's not a self-descriptive name. Is it the on-display latency minus the matrix response time? What is it? The on-display lag is actually a vague term too. And thanks for starting an open source project.

andymanic commented 1 year ago

While I will update the docs soon, "input lag" (also called "input latency", "click to photon latency", "total system latency" or "end to end latency") is nowhere near a new term in the world of monitor testing. It's something most, if not all, technical reviews cover, and it's a pretty standard and, I thought, obvious metric: you are measuring how long an input takes to be displayed. There is variance in exactly what you measure. End-to-end (click to photon, total system) latency is the time between sending an input and it being shown on screen. On-display latency is how long the monitor takes to receive a new frame, run any post processing (including overdrive), and then display it.

You mentioned you googled "click to photon latency" but didn't find anything useful. I searched it and found my own video (linked here), Linus' video (linked here), and even a full article on a remote desktop software vendor's website explaining, with graphs, what the metric covers (linked here).

If you are after specifics on my tool: the board measures the USB polling delay, then measures the time between the input being registered with the USB controller and the display starting to transition from black (RGB 0) to white (RGB 255). I also have the desktop software, which runs some custom DirectX 11 code, record the frame time when an event is registered; that can later be subtracted, along with the USB polling time, to give the on-display latency. Each of these steps is recorded and displayed, though, should you want to quote one specific metric or get an idea of the bigger picture.
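To make the subtraction above concrete, here is a minimal sketch of the decomposition, assuming the tool reports the total click-to-photon time, the USB polling delay, and the frame/render time in milliseconds. The struct and function names are illustrative only, not the actual OSRTT code:

```cpp
// Minimal sketch of the latency decomposition described above.
// Names, numbers and units (milliseconds) are illustrative, not the real OSRTT implementation.
#include <iostream>

struct LatencySample {
    double clickToPhotonMs;  // input registered at the USB controller -> first luminance change
    double usbPollingMs;     // measured USB polling delay
    double frameTimeMs;      // frame/render time recorded by the DirectX 11 test app
};

// On-display latency = end-to-end time minus the input and render portions.
double onDisplayLatency(const LatencySample& s) {
    return s.clickToPhotonMs - s.usbPollingMs - s.frameTimeMs;
}

int main() {
    LatencySample s{24.0, 1.0, 6.9};  // example numbers only
    std::cout << "On-display latency: " << onDisplayLatency(s) << " ms\n";
}
```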

Hope that clears things up.

alex-i-1 commented 1 year ago

That "www.ni-sp.com" site is what I read for first, - it is the one which doesn't describe what "click to photon" means. Simply because it doesn't say what is being counted as the end of the measurement. It describes that moment with the word "detected", but even graphs there are showing that this could mean any point during the slope, which is as long as 1-10 ms (such an error shouldn't be acceptable). Except that the point of beginning needs not to be described: it simply is a mouse click or its simulation. There are two more sites with the data about monitors: www.rtings.com and www.kitguru.net Again, they don't say what they mean by the names of their measurements. They though clearly describe one: "Rise / Fall Time" (which is being measured between 10% and 90% points, and obviously represents the matrix response time (note that this data is also useless to us, because a smaller "matrix lag" doesn't guarantee for us that the "sending through computer->monitor cable + in-monitor data processing + sending data to matrix + matrix lag" sum will be smaller)).

Looking at the videos, I see there are a few more bits of information. At the beginning of the first video, the measurement is shown as mouse click to light-sensor detection. And I agree with you there (or you agree with me, or we both coincide 😊) that this method, which has too many additional variable lags, should not be used. What could I suggest? Set up Linux on a separate computer and do the Arduino work there (as I did: I set up the Arduino IDE, though the microcontrollers I'm using aren't Arduinos). There you could hook into the open source video drivers and catch the moment when the sending of a frame's data begins. Windows really isn't suited to that kind of job; the amount of effort needed on Linux is far lower. In fact, on Windows I simply gave up on almost every kernel-level thing I wanted to do, and when I didn't give up and tried anyway, it cost me a lot of extra time for no good reason.