adafruit / Adafruit_APDS9960

Arduino drivers for APDS9960 gesture sensor

calculateLux does not reflect APDS sensor documentation #42

Open atomdog opened 1 year ago

atomdog commented 1 year ago

If my math is right (which it probably isn't, entirely), `double lux = (double(c) / 2360.0) * 468.2344;` would work better, where `c` is the clear channel 'count'. I used this as a basis for calculating lux given the 'clear' sensitivity (here)
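A minimal sketch of the proposed conversion (the function name is hypothetical, not part of the library's API; it assumes the ALS is configured for 16x gain, where the datasheet states a typical clear-channel responsivity of 2360 counts per W/m²):

```cpp
#include <stdint.h>

// Hypothetical helper: convert the raw clear-channel count into lux
// using the proposed 468.2344 lm/W factor. Only valid at 16x ALS gain,
// where the datasheet gives ~2360 counts per W/m^2.
double calculateLuxFromClear(uint16_t c) {
  double irradiance = double(c) / 2360.0;  // counts -> W/m^2
  return irradiance * 468.2344;            // proposed lm/W factor
}
```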

MrEbbinghaus commented 1 year ago

Can you explain how you get the 468.2344?

I'm not sure about this either, but I thought the value should be 679.585 😃 The clear channel's irradiance responsivity is stated as a typical 2360 counts per $\text{W}/\text{m}^2$ at 16x gain at 560 nm.

560 nm has 99.5% (= 679.585 lm/W) of the maximum 683 lm/W at 555 nm.
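Written out, that is just the photopic response at 560 nm times the peak luminous efficacy:

$$0.995 \cdot 683\ \frac{\text{lm}}{\text{W}} = 679.585\ \frac{\text{lm}}{\text{W}}$$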

atomdog commented 8 months ago

> Can you explain how you get the 468.2344?

2360 counts per 1 W/m², i.e. 1 count per ≈0.00042373 W/m².

Divide the spectrum into irradiance portions for red, green, blue, and clear: red covers 0.46666 of the total, green 0.5, blue 0.46666, and clear the entire 1.0. Multiplying those by 683 lm/W gives 318.7333, 341.5, and 318.7333.

$2 \cdot (318.7333 \cdot 0.4\overline{6}) + (341.5 \cdot 0.5) = 468.2344$. This seems to work well for me; I have a year of data and it closely follows what I would expect the lux value to be as a function of the reported RGB-C values.
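For the record, the stated result only comes out exactly with the unrounded repeating weight ($0.4\overline{6} = 7/15$), which the truncated 0.466 obscures:

$$2 \cdot \left(683 \cdot \tfrac{7}{15}\right) \cdot \tfrac{7}{15} + \left(683 \cdot \tfrac{1}{2}\right) \cdot \tfrac{1}{2} = 297.4844 + 170.75 = 468.2344$$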

MrEbbinghaus commented 8 months ago

(I feel compelled to disclaim again that I have no idea whether I'm wrong or not.)

Your formula just uses the clear sensor, so as I understand it, you don't gain any additional precision from the other sensors.

Maybe you don't even need the RGB sensors for the factor, since the response curve of the clear sensor is already close to the human (photopic) perception curve (the fine black curve). It's centred at 555 nm and scaled to 100%. [image: sensor response curves with the photopic curve overlaid]
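If that holds, the whole conversion collapses to a single constant. As a formula (the factor $k$ is a hypothetical placeholder for how much of the clear curve's area lies under the photopic curve; it is not a value from this thread):

$$\text{lux} \approx \frac{C}{2360} \cdot 683 \cdot k$$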

atomdog commented 8 months ago

[image: handwritten derivation] This is my math; forgive the cramped and strange layout. My intuition says that the RGB response is contained within the clear value, both in the range of responsivity and in the sense that red, green, and blue light are literally the constituents of clear light. It also seems I used 570 nm instead of 560 nm.

I think the difference in our values comes from my incredibly scientific methodology of eyeballing the value at which C is centered.

I believe you used the dotted (what I presume is estimated) curve and I used the solid (what I presume is actual) curve.

MrEbbinghaus commented 8 months ago

Taking only the clear sensor was my first thought.

I implemented the steps from the article you linked, and it has been running for the past three weeks. I also obtained my constants with this new "eyeballing" method. 😄

Here is the code for the ESPHome Sensors: https://gist.github.com/MrEbbinghaus/f3f764f2f3c90bfb37fa64a43fef79b7

And if you are interested in graphs, here are some results from my living room. It looks like there is a time around noon when the sun hits the sensor or the wall next to it. 🙂 [image: living-room illuminance graphs]

I don't have a reference light sensor, but I guess one goes on my shopping list. :-)

MrEbbinghaus commented 8 months ago

But I'm not sure if the eyeballing method from the article is good enough for this sensor. The sensitivity curves of the author's sensor are "slimmer" and better separated.

MrEbbinghaus commented 7 months ago

I had more time to think about it. Writing things down so I don't forget it.

This is the function to calculate lux[1]:

$$\phi_V = 683.002\ \frac{\text{lm}}{\text{W}} \cdot \int_0^\infty \overline{y}(\lambda) \cdot \phi_{e,\lambda}(\lambda) \; d\lambda$$

(Where $\overline{y}$ is the dotted/solid/dashed thin curve in the diagram above, $\lambda$ is the wavelength in nm, and $\phi_{e,\lambda}(\lambda)$ is the spectral irradiance in $\text{W}/\text{m}^2$ per nm.)

Or in other terms: $\text{Illuminance}(\text{lm}/\text{m}^2=\text{lx}) = \text{max luminous efficacy}(\text{lm}/\text{W}) \cdot \text{Irradiance}(\text{W}/\text{m}^2)$
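The integral can be approximated numerically once both curves are sampled on a common wavelength grid. A minimal sketch (the arrays and the step size are hypothetical inputs, not sensor data):

```cpp
#include <stddef.h>

// Discretized version of the photometric integral above:
// phi_V = 683.002 * sum( ybar(lambda) * E(lambda) * dLambda ).
// ybar[] samples the photopic curve, irradiance[] the spectral
// irradiance in W/m^2 per nm, on the same wavelength grid.
double illuminanceLux(const double ybar[], const double irradiance[],
                      size_t n, double stepNm) {
  double integral = 0.0;
  for (size_t i = 0; i < n; ++i) {
    integral += ybar[i] * irradiance[i] * stepNm;  // Riemann sum over lambda
  }
  return 683.002 * integral;  // lm/W times W/m^2 gives lux
}
```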

The sensors measure irradiance. (irradiance = count / 2360 for the CLEAR sensor, but only for the settings from the table in the datasheet.)

The dotted/solid/dashed thin curve above is the response curve of the human eye. If the response curve of the CLEAR sensor matched it, we would only have to multiply the CLEAR sensor's value (in $\text{W}/\text{m}^2$) by 683 $\text{lm}/\text{W}$.

But since CLEAR's sensitivity is "wider", the resulting lx value would be too high for a uniform light spectrum. (It would only be exact under pure 555 nm light.)

@atomdog This also means you can't just use a constant luminous efficacy (like your 468.2344) for the calculation, since that would only be valid for one specific spectral composition of light. The idea of using the RGB (and even IR) sensors is to tune the luminous efficacy according to the composition of the light.
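One way to write that idea down (the per-channel weights $w_R, w_G, w_B$ are placeholders still to be determined; this is the shape the formula at the end of this thread takes):

$$\text{lux} = \frac{C}{2360} \cdot 683\ \frac{\text{lm}}{\text{W}} \cdot \frac{w_R R + w_G G + w_B B}{R + G + B}$$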


What does that mean for calculating?

In practice, we only measure the area under each sensor's response curve (weighted by the incoming spectrum). The goal is the area under the "human eye response curve" (the solid/dotted/dashed curve).

So what we need to do, and what I think the author of the article does to get the respective IRtoIL factors, is to calculate what proportion of each sensor's curve falls under the "human eye curve" ($\overline{y}$).

They get 0.3 for red, 0.9 for green, and 0.06 for blue, which is close enough for red and green, but off for blue, which I would guess is more like 0.25. [image: channel response curves against the photopic curve]

This is relatively simple if the different sensors don't overlap much, but harder if they overlap or have weird shapes, like the BLUE channel of the APDS9960.


What I think could work: first, calculate the proportions of all combinations of areas (like a Venn diagram), then use those areas to "puzzle" together the "human eye curve" ($\overline{y}$).
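One concrete reading of that "puzzling" step (a least-squares sketch of the idea, not something stated in this thread; the coefficients are placeholders): approximate the eye curve as a linear combination of the channel curves,

$$\overline{y}(\lambda) \approx a\,R(\lambda) + b\,G(\lambda) + c\,B(\lambda) + d\,C(\lambda),$$

and fit $a, b, c, d$ over the wavelength grid. The same linear combination of the channel counts would then estimate the photopic integral directly.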

Example: The green curve is relatively responsive in the IR range, which we want to eliminate. We also have the IR sensor(s). We can subtract the expected proportion of IR light from the GREEN sensor.

Let's say the overlap of the two sensors is a third of IR's area. We then have to subtract a third of IR's counts from GREEN's counts.
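As a sketch in code (the function name is hypothetical, and the 1/3 overlap is the example value from this comment, not a measured constant):

```cpp
// Remove the GREEN channel's IR leakage using the separate IR reading.
float correctedGreen(float greenCounts, float irCounts) {
  const float kGreenIrOverlap = 1.0f / 3.0f;  // example overlap from the text
  float corrected = greenCounts - kGreenIrOverlap * irCounts;
  return corrected > 0.0f ? corrected : 0.0f;  // counts can't go negative
}
```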

MrEbbinghaus commented 6 months ago

I got a BH1750 illuminance sensor and compared it under sunlight: [image: APDS9960 vs. BH1750 comparison]

My values for the lux sensor are: $$\frac{C}{2360.0} \cdot 683.0 \cdot \frac{0.6\,R + 0.8\,G + 0.135\,B}{R + G + B}$$

```cpp
float sum = r + g + b;
if (sum <= 0.0f) return 0.0f;  // avoid division by zero in darkness
return (c / 2360.0) * 683.0 * (0.6 * r + 0.8 * g + 0.135 * b) / sum;
```
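As a self-contained function (the name is hypothetical; `r`, `g`, `b`, `c` are the raw channel counts, and the 2360 responsivity plus the 0.6 / 0.8 / 0.135 weights are only valid for the 16x-gain settings and tuning described in this thread):

```cpp
#include <stdint.h>

// Sketch of the formula above as a standalone function.
float estimateLux(uint16_t r, uint16_t g, uint16_t b, uint16_t c) {
  float sum = float(r) + float(g) + float(b);
  if (sum <= 0.0f) return 0.0f;  // no colour signal: report darkness
  float efficacy = 683.0f * (0.6f * r + 0.8f * g + 0.135f * b) / sum;
  return (float(c) / 2360.0f) * efficacy;  // W/m^2 times lm/W -> lux
}
```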

I might tweak them over time... Need to think about how to practically calibrate them with different coloured light.