Closed NPTP closed 4 years ago
If you zoom in on your output, some pixels in the background look a little yellowish when they should be blue. After looking at the top-left corner of the image, I wonder if you forgot to interpolate some channel at the bottom-left corner of the Bayer pattern? Just a guess...
Also, if you open your image with the reference image in some editor and compare the values of RGB, you'll get some interesting findings 😆
Ahh, I had both green pixels performing the exact same task; that's why it was pulling the wrong colours. I was wondering why one of them was going wrong and the other seemed correct. Thanks for the help.
Hi,
From my understanding, we assume a BGGR pattern in the "bayer" array. When we are on a B pixel, for example, we use only the single intensity sample from that pixel for our B value in the "rgb" array. Then we find as many neighbours of the other colours as we can (maximum 4 per colour, minimum 1; for example, the top-left blue pixel has only 1 red neighbour it can sample), average them, and use that average for the intensity of that colour in the rgb pixel.
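The neighbour-averaging scheme described above can be sketched roughly like this; note this is just my illustration, not code from the thread, and the array layout, helper names, and the use of a full 3x3 window are assumptions:

```python
import numpy as np

def bggr_channel_map(h, w):
    # Hypothetical helper: channel index per pixel for a BGGR mosaic,
    # 0 = R, 1 = G, 2 = B. The top-left pixel is Blue.
    chan = np.empty((h, w), dtype=int)
    chan[0::2, 0::2] = 2  # B sites
    chan[0::2, 1::2] = 1  # G sites (blue rows)
    chan[1::2, 0::2] = 1  # G sites (red rows)
    chan[1::2, 1::2] = 0  # R sites
    return chan

def demosaic_bggr(bayer):
    # Naive demosaic: the sampled channel keeps its own value exactly;
    # each missing channel is the average of whatever same-colour
    # neighbours exist in the 3x3 window (4 in the interior, as few
    # as 1 at a corner, matching the description above).
    h, w = bayer.shape
    chan = bggr_channel_map(h, w)
    rgb = np.zeros((h, w, 3), dtype=float)
    for y in range(h):
        for x in range(w):
            sums = [0.0, 0.0, 0.0]
            counts = [0, 0, 0]
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        c = chan[ny, nx]
                        sums[c] += bayer[ny, nx]
                        counts[c] += 1
            for c in range(3):
                rgb[y, x, c] = sums[c] / counts[c]
            # overwrite with the directly sampled value for this site
            rgb[y, x, chan[y, x]] = bayer[y, x]
    return rgb
```

One sanity check on an implementation like this: feed it a mosaic where every R site is one constant, every G site another, and every B site a third; the output should be perfectly flat in all three channels. A mosaic-looking result on that input usually means the channel map (which site is which colour) is wrong, which is exactly the kind of bug where the two green positions get mixed up with R or B.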
I'm doing this and getting the very mosaic-y result below, and I'm really not sure why after poring over my code. Looking for any insight... have I misunderstood the process? Thanks!