Beep6581 / RawTherapee

A powerful cross-platform raw photo processing program
https://rawtherapee.com
GNU General Public License v3.0

Add camconst support for foveon files #2711

Closed. Beep6581 closed this issue 7 years ago.

Beep6581 commented 8 years ago

Originally reported on Google Code with ID 2729

This patch adds camconst support for foveon files.
Ilias, could you add the camconst entries for the foveon cams please?

Ingo

Reported by heckflosse@i-weyrich.de on 2015-03-30 17:13:41


Beep6581 commented 8 years ago
Something's going off track here, gentlemen.

According to RawDigger, the SD14 is quite capable of raw values up around 9,000, so
calling the 'white level' 6,700 is a bit low, don't you think?

Has anybody looked at the SD14 metadata? (Surely they have.) Saturation levels are included
there, and much, much more.

I can put a meta-data dump up on my site if anyone's interested?

Ted

Reported by xpatUSA on 2015-04-09 01:08:12

Beep6581 commented 8 years ago
Here's an extract from one I already had:

CMbM:SaturationLevel
  Type=0 (long), Dimensions=1 (D0) (3)
  M[0]=7652 (1DE4)
  M[1]=7754 (1E4A)
  M[2]=7840 (1EA0)

In my experience these values mark the limit of good linearity; the actual values written
to the card can be considerably higher. I used X3Dump by ArvoJ to get these values.

Reported by xpatUSA on 2015-04-09 01:18:55


Beep6581 commented 8 years ago
Well, I don't know about RawDigger, but I have 2 SD14s, with quite different serial
numbers, and my sensors clip around 6700, considering safe levels. Above 6700 you are
into deeply non-linear territory! I've seen values slightly above 6900 for sure, but
I recommend 6700.

PS: I'm using output from dcraw.

PPS: Check the dpreview thread. I know of no raw converter that actually uses data
above 6700-6800 as "good data". That includes Adobe, Silkypix, Sigma's own SPP...

Reported by rnbc.r0 on 2015-04-09 01:54:35

Beep6581 commented 8 years ago
I will bow to your superior knowledge and take no further interest in this thread.

Good luck with the project.

Ted

Reported by xpatUSA on 2015-04-09 02:12:26

Beep6581 commented 8 years ago
At http://rnbc.dynip.sapo.pt/pub/x3f-examples-02/ you now have 2 sets of samples.

For SD14:

- Color target under sunlight.
- Lamp, overexposed at least 5 stops, with completely burned pixels.

For SD1 Merrill:

- Color target under tungsten lamp.
- Color target under direct sunlight.
- Color target under cloudy sky.
- Dark frame.
- Lamp, overexposed at least 5 stops, with completely burned pixels.

Reported by rnbc.r0 on 2015-04-09 02:15:36

Beep6581 commented 8 years ago
Ted, I certainly have no superior knowledge... your cameras might be different from
mine, who knows?...

Can you provide some samples? That would be great :)

Reported by rnbc.r0 on 2015-04-09 02:21:11

Beep6581 commented 8 years ago
According to "~/dcraw/dcraw.exe -4 -D -M SDIM8210.X3F" the white point for SD1M seems
to be 4050 (hard clipping in the lamp for all channels). Can you confirm this value
please? Thanks!

PS:

Comparing the PPM output by dcraw in the previous example (SDIM8210.X3F) with the output
of RawTherapee with the white point set at 4050, it seems like RawTherapee is discarding
some properly exposed highlights. You can see that in some regions of the lamp.

This is strange... isn't RawTherapee using the values dcraw outputs? For the SD14 the
values in both programs match well, but there is something strange going on for the SD1M.

PPS: Ok, I think 4050 is a good value. But please confirm :)

Reported by rnbc.r0 on 2015-04-09 03:36:53

Beep6581 commented 8 years ago
Ok, just reconfirmed, 4070 seems to be pretty good as well, for the SD1M. Oh well, forget...
:D

For the SD14 it definitely looks like 6700 is more appropriate. But please check. If you need
more samples, please ask!

Reported by rnbc.r0 on 2015-04-09 04:01:54

Beep6581 commented 8 years ago
I've looked at the black point of the SD1M. It's set at 16 in the latest version of
camconst.json, but I think that's a bit too aggressive. I would set it at 12, to truncate
less data in the normal distribution. I know this is debatable...

How do you calculate the black point for a sensor, exactly? Do you set it at the average
of the dark pixels? Or 1 sigma below? Or 2 sigmas below? Or 3 sigmas below?

The guys using sensor data for astrophotography typically set their darks at 3 sigmas
below the average, but that's because they are going to average hundreds of frames,
and all data matters. I think for general photography if you dump 10% of the data,
that's ok, so perhaps 1.5-2 sigmas below the average?

I haven't (yet) made a rigorous statistical analysis of the sensor, but 12 seems to
dump little data and keeps blacks black, so to say :)
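
As an illustrative sketch only (not RT code, and the values and names are made up), comparing
the candidate black levels from a set of dark-frame samples, i.e. mean, median, and mean
minus k sigma, could look like this in C:

/* blacklevel.c - illustrative only: compare candidate black levels
   (mean, median, mean - k*sigma) for one channel of a dark frame. */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

static int cmp_u16(const void *a, const void *b)
{
    return (int)*(const unsigned short *)a - (int)*(const unsigned short *)b;
}

static void report_black_level(unsigned short *v, size_t n, double k)
{
    double sum = 0.0, sq = 0.0;

    for (size_t i = 0; i < n; i++) {
        sum += v[i];
        sq  += (double)v[i] * v[i];
    }
    double mean  = sum / n;
    double sigma = sqrt(sq / n - mean * mean);

    qsort(v, n, sizeof *v, cmp_u16);            /* sort to take the median */
    double median = (n & 1) ? v[n / 2]
                            : 0.5 * (v[n / 2 - 1] + v[n / 2]);

    printf("mean=%.1f  median=%.1f  sigma=%.1f  mean-%.1f*sigma=%.1f\n",
           mean, median, sigma, k, mean - k * sigma);
}

int main(void)
{
    /* made-up dark-frame values; in practice read them from the dcraw output */
    unsigned short dark[] = { 10, 12, 11, 13, 12, 14, 12, 11, 13, 12 };
    report_black_level(dark, sizeof dark / sizeof dark[0], 1.5);
    return 0;
}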

Reported by rnbc.r0 on 2015-04-09 04:29:04

Beep6581 commented 8 years ago
I am on a trip, far away from my PC, so I am very limited in doing any valuable work;
not even able to use RT, dcraw etc. :(

I have to report that I was not very happy with the samples I used for profiling, as
they covered too much of the sensor's area and the color cast affected some periphery
patches. Please, for any future sample, use the central 1/3 of the frame at most (by 1/3
I mean: if the full size is 4500x3000, keep the target within 1500x1000); the rule of
thirds makes this easy :)

For the black level we use the average of a black frame .. this is how dcraw/RT are tuned
to work, and even if this could be a compromise* it's better .. at least because we
want linear data, and we need linear data to apply WB multipliers and matrix color transformations.
* Maybe we could gain something regarding noise in the very dark regions if we used
"below black" info (I think there is no gain for typical foveon data due to the high read
noise, but it could help for better stacking of multiple frames, e.g. astrophoto ..), but for this
dcraw/RT would need to be adapted, i.e. at least use a different black level for the raw
conversion vs the linear color transforms.

For white levels .. aren't x3f data 12-bit? How can we have values higher than 4095?
Are you sure that the 6700 value comes from unscaled data?
For the clipping at highlights, RT vs dcraw .. does dcraw have correct colors/WB when
not clipping? Most times, by applying WB and color transforms some channels get amplified,
and parts that are not clipped in the raw data get clipped after the color corrections .. this
is normal .. we have lived with this since the start of the digital photo era :(
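
For example, with made-up numbers: take a white level of 4050 and a hypothetical red WB
multiplier of 1.8. A raw red value of 3000 is well below the clip point, but

    3000 x 1.8 = 5400 > 4050

so that channel ends up clipped after white balance even though the raw data never was.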

Reported by iliasgiarimis on 2015-04-09 14:38:25

Beep6581 commented 8 years ago
Well, for me it has been a long day also... so nothing new for today. We are not in
a hurry though, so no problem :)

Sorry, I will try to get new samples once the weather helps! Today it was raining.
I hoped using a telephoto lens would help, and it did help, but apparently the problem
was not eliminated :(

I won't send the SD1M for shutter repairs until you are satisfied with the samples.
The shutter works well under 1/2000, so we can live with it...

borissimo86 (in the forum) will also try to get some new samples for the SD1M.

If it makes things easier, I see no problem in using the average of a black frame if
it can be overridden by the user with the "Raw Black Points" control, like now with
a preset value. I noticed this is not the case when one uses a dark frame... but with
a calculated value, I presume the "Raw Black Points" control will work ok, right? Sometimes,
for example when you average multiple frames, it helps a lot to get data below the
noise level :) And that's one thing I love in rawtherapee: It doesn't limit you artificially.
Like they say in programming, you can shoot yourself in the foot, if you know what
you are doing :)

The converter used in the SD14 is the AD9228 quad 12-bit 40/65 MSPS ADC, but dcraw, and
other X3F readers, output data from 0-6900. These are the values that come from the camera
in totally raw mode, and they have no holes or weird statistics, so the camera operates
as a 12.75-bit device somehow! Maybe double sampling? You can check for yourself that those
are the values RawTherapee is indeed dealing with. It's not imagination :)

The SD1/M outputs from 0-4096, which makes more sense. But for example, the SD10 went
as high as 13000 IIRC (I can't find the samples any more, and I never had that camera).
I don't know how they did it, but maybe multiple sampling? From the user point of view
the SD10 was 13.7 bits and the SD14 is 12.75 bits. Perhaps it's magic :)
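
For reference, those effective bit depths follow directly from the maximum values:

    log2(6900) ≈ 12.75    log2(4096) = 12    log2(13000) ≈ 13.7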

To get totally raw data:

dcraw -4 -D -M SDIM8210.X3F
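
As a sketch only (the file name and parsing below are illustrative, not from the thread):
assuming the command above writes a 16-bit binary PPM (P6) as described earlier, the
per-channel maxima, i.e. the candidate white levels, can be read off like this:

/* ppmmax.c - print the per-channel maxima of a 16-bit binary PPM (P6),
   e.g. the file written by the dcraw command above.  Assumes no comment
   lines in the header; dcraw -4 writes 16-bit samples, and PNM stores
   them big-endian. */
#include <stdio.h>

int main(int argc, char **argv)
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s file.ppm\n", argv[0]);
        return 1;
    }
    FILE *f = fopen(argv[1], "rb");
    if (!f) { perror("fopen"); return 1; }

    char magic[3] = { 0 };
    int w, h, maxval;
    if (fscanf(f, "%2s %d %d %d", magic, &w, &h, &maxval) != 4
        || magic[0] != 'P' || magic[1] != '6' || maxval < 256) {
        fprintf(stderr, "not a 16-bit P6 PPM\n");
        return 1;
    }
    fgetc(f);                                   /* the single whitespace byte after the header */

    unsigned max[3] = { 0, 0, 0 };
    for (long i = 0; i < (long)w * h; i++)
        for (int c = 0; c < 3; c++) {
            int hi = fgetc(f), lo = fgetc(f);   /* 16-bit PNM samples are big-endian */
            if (hi == EOF || lo == EOF) { fprintf(stderr, "short file\n"); return 1; }
            unsigned v = ((unsigned)hi << 8) | (unsigned)lo;
            if (v > max[c]) max[c] = v;
        }
    printf("max R=%u  G=%u  B=%u\n", max[0], max[1], max[2]);
    fclose(f);
    return 0;
}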

A while ago I also tried with Proxel X3F Tools, which uses a completely independent
code set, and the values are exactly the same, so I must assume both authors are not
dumb.

DCraw has lousy colors for all sigma cameras. RawTherapee has excellent colors already,
with the ICM profiles. And reasonable colors with the matrices. Quite frankly, I'm
not concerned with the colors: under tungsten with the ICC profile from my sample the
colors are MUCH better than from Sigma Photo Pro! Under sunlight they are as good as
SPP. Except for the color casts in the borders, but that might be solvable with flat
grey frames, so I wouldn't be concerned with that for now.

Cheers, and have a good trip!

Reported by rnbc.r0 on 2015-04-10 00:10:56

Beep6581 commented 8 years ago
I think that even dcraw -D does not export pure raw with foveon .. or Sigma manipulates
the values before writing to x3f (channel scaling), as we can see in the SD14 the WB multipliers
are somehow normalized (unlike with the Merrills, where we have the expected very strong
blue (top layer) and very weak red (bottom layer)).

I cannot do any work for the next days, but I am thinking about the periphery casts
.. and now (after some tsipuro :) :) ) I think that this is much more complicated to
solve with a flat frame .. let's try it first, but I feel we are trying to solve a non-linear
distortion (the sensor's angular response at the corners or left/right sides). I am
thinking of it not as a simple WB shift but as a color space shift which needs its own
color correction matrix ..

It's a miracle that we arrived at the same or better color than RPP by just guessing how things
work !!!

Reported by iliasgiarimis on 2015-04-10 09:45:43

Beep6581 commented 8 years ago
Hello, sorry for the delay. This kind of thinking is better done slow :)

-- The white balance multipliers:

All the old Sigma cameras were optimized for daylight, and had very lousy performance
under tungsten. I think they modified the sensor response in the blue to optimize the
tungsten response a bit, at the cost of a bit more noise under daylight. I might be
wrong... but I think this was a hardware (sensor) modification, not something done
in software.

-- Color cast:

I fully agree with your assessment of the color cast problem, as you can derive from
my initial comments on the subject in the forum. But I'm not as pessimistic in practice
:)

As a matter of fact the problem cannot be solved completely in software because the
spectral response of each channel changes with light admittance angles. The difference
cannot be solved completely with a data transformation because you don't have access
to the response curve: your response curve only has 3 samples (R,G,B). Only an imaging
spectrometer with tens of channels could solve this problem completely in software,
because the spectral sampling would be sufficiently complete for the correction to
take place afterwards. And in that case, changing gains for each channel would be enough,
which is essentially what we are doing with a flat frame: changing each channel gain
depending on the region.
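
To make that concrete, a minimal sketch (illustrative only, not RT's flat-field code) of
per-region, per-channel gain correction from a normalized flat frame:

/* Sketch only.  flat[] is a dark-subtracted exposure of an evenly lit
   neutral target, image[] the frame to correct; both are interleaved RGB
   with 'npix' pixels.  Each channel of the flat is normalized to its own
   mean, so only the spatial variation is removed, not the overall exposure. */
#include <stddef.h>

void flat_correct(float *image, const float *flat, size_t npix)
{
    double mean[3] = { 0.0, 0.0, 0.0 };

    for (size_t i = 0; i < npix; i++)
        for (int c = 0; c < 3; c++)
            mean[c] += flat[3 * i + c];
    for (int c = 0; c < 3; c++)
        mean[c] /= (double)npix;

    for (size_t i = 0; i < npix; i++)
        for (int c = 0; c < 3; c++) {
            float fv = flat[3 * i + c];
            if (fv > 0.0f)                              /* gain = channel mean / flat value */
                image[3 * i + c] *= (float)(mean[c] / fv);
        }
}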

Now, down to the ground: even Sigma's SPP suffers a bit from this problem, and they
have data for all lenses ever made for this camera. And don't even try to use a third-party
lens with SPP! I tried with a few Pentaxes: the color casts are horrible... because
the lenses have a small distance to the exit pupil, and have no corrections in software,
obviously. RawTherapee will probably work better with third party lenses than SPP does.

So, let's advance a step at a time. With many modern (telecentric) lenses the color
casts are not horrible. Also, we don't need perfection, right? Let's be optimistic
;)

-- Values in the RAW data:

As far as I know the values dcraw and x3f tools export are correct. When I did a statistical
analysis of the sensor in the SD14 I (and others...) found no traces of digital amplification.
It's perfectly possible to use a 12-bit ADC to output 14-bit samples. Philips did this
in their first audio samplers (14->16), using oversampling. Damn... you can even use
a 1-bit ADC to output the equivalent of 20 bits! Think Super Audio CD, for example.

I have dug around a bit in my archives and confirmed yesterday that the old SD10 (and SD9?)
outputs above 9900, but I have no fully saturated samples... so I can't confirm how
far it goes.

-- How to calculate Black Point?

Considering we are dealing with an already partially truncated normal distribution
(from the cameras), and we want to cut it at the mid point, the point where the theoretical
zero was without noise, to ensure linearity,  it's better to use the Median (the Quantile
1/2), not the Average (Arithmetic Mean), because the Median is not affected by prior
partial truncation, and the Average is.

If the distribution is not completely symmetrical (perhaps because it's not a Normal?...)
it might be better to use the Mode (the value that appears most often), not the Median
or the Mean, but some truncations affect the Mode, so I would start with the Median.

These considerations apply to any other sensor as well, not just Foveon. So if RawTherapee
is using the Average for this, somewhere, it should be changed. I can try to make the
code changes myself, if you point me to the place.
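
A small numerical check of that claim (illustrative only, with made-up offset and noise
figures): clip a noisy "black" signal at zero the way the raw data is clipped, then compare
the mean and median against the true offset:

/* Sketch only: true black offset 12 DN, read noise sigma 8 DN, negative
   readings clipped to 0 exactly as in the raw data.  As long as fewer than
   half of the samples are clipped, the median stays at the true offset
   while the mean is pulled upwards. */
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

double gauss(void)                        /* Box-Muller standard normal */
{
    double u1 = (rand() + 1.0) / (RAND_MAX + 2.0);
    double u2 = (rand() + 1.0) / (RAND_MAX + 2.0);
    return sqrt(-2.0 * log(u1)) * cos(6.283185307179586 * u2);
}

int cmp_dbl(const void *a, const void *b)
{
    double d = *(const double *)a - *(const double *)b;
    return (d > 0) - (d < 0);
}

int main(void)
{
    enum { N = 100000 };
    static double v[N];
    double sum = 0.0;

    for (int i = 0; i < N; i++) {
        double x = 12.0 + 8.0 * gauss();  /* offset + read noise */
        v[i] = x < 0.0 ? 0.0 : x;         /* clip negatives to 0, like the camera */
        sum += v[i];
    }
    qsort(v, N, sizeof v[0], cmp_dbl);

    printf("true offset 12.0, mean %.2f (biased up), median %.2f\n",
           sum / N, v[N / 2]);
    return 0;
}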

PS: Aren't the guys at RPP also guessing? The only ones not guessing are the camera
manufacturers, right?... And they don't support Foveon, anyway, perhaps because it's
hard :D

Cheers!

Reported by rnbc.r0 on 2015-04-10 22:41:54

Beep6581 commented 8 years ago
Ahh, I meant SPP (not RPP), answering your last paragraph at #66.

For the black level calculation I also believe that the median is better than the average,
unless we remove outliers first, or use a more robust statistic (e.g. a trimmed mean).

For white level we will need some samples of greyscale near saturation. Meanwhile it's
easy to change it in camconst.json and test it (this is the primary reason camconst.json
exists)

For the casts .. exactly as you wrote, an easy first approach is the flat field (i.e. variable
channel scaling), but I think it will not be enough .. the way the response changes
is not simply a different weighting .. a second approach is to mix color matrices/profiles
.. then we will see ..

Reported by iliasgiarimis on 2015-04-11 08:47:11

Beep6581 commented 8 years ago
For the white point I tend to go with photos of lamps, because they contain an entire gradient
that includes saturated pixels. I already posted one for the SD1M and the SD14. I'll
post some more if needed.

I know the ideal is multiple exposures of grey gradients near saturation, because that
makes it possible to calculate the sensor response curve using HDR algorithms. But we don't need
perfection...

The trimmed mean suffers from the same problems as the median. From Wikipedia:
"The truncated mean uses more information from the distribution or sample than the
median, but unless the underlying distribution is symmetric, the truncated mean of
a sample is unlikely to produce an unbiased estimator for either the mean or the median."
So we might as well go with the median... and if that fails (it won't), the mode.

Sigma's SPP is far from ideal in the shadows... quite frankly the rendering sucks in
shadow areas, and always sucked. For the SD14 Silkypix gets much better results! Unfortunately
they never supported the newer sensors. Also, SPP is notorious for bad red rendering
with the SD14: it tends to render reds as magenta! So RT is competing with something that
is far from ideal. As it is, RawTherapee is already getting reasonable or even good
results!

Reported by rnbc.r0 on 2015-04-12 00:42:15

Beep6581 commented 8 years ago
#68 @mbc

"I have digged a bit in my archives and confirmed yesterday that the old SD10 (and
SD9?) outputs above 9900, but I have no full saturated samples... so I can't confirm
how far it goes."

I have a good few highly exposed SD9, SD10 and SD14 shots.

I can offer RawDigger histograms if there is any interest?

Ted

Reported by xpatUSA on 2015-04-14 17:47:39

Beep6581 commented 8 years ago
Hi Ted,

yes, please provide histograms and, if possible, samples of black frames, totally overexposed
frames (all pixels clipped in all layers) or something close to this, and reference
shots of CC24 or SG targets. All are valuable :)

Reported by iliasgiarimis on 2015-04-14 17:52:02

Beep6581 commented 8 years ago
Hi Ilias,

I can provide a black frame for the SD9, and also an over-exposed frame, no problem.

Just received an SD10, but I already took the hot filter out for IR purposes. Should still
be able to provide frames, though.

Will provide RawDigger histograms too.

Reluctant to provide CC24 image because I don't have a perfect D65 lamp for sRGB or
D50 lamp for Lab - so I'm not sure that such images are really useful. Can however
provide X3F metadata dumps which show each camera's "cam to XYZ_flash" 3x3 matrix and
the diagonal matrix XYZ corrections for sunlight, etc.

Ted

Reported by xpatUSA on 2015-04-14 21:32:37

Beep6581 commented 8 years ago
Yes, and send the X3Fs, that would be ideal! With the samples it's easy to see how far
they go :)

Maybe RawDigger decodes the values in the X3F in some different way? Now, that would
be interesting! RawTherapee uses the values as decoded by dcraw and x3ftools.

Reported by rnbc.r0 on 2015-04-14 22:49:24

Beep6581 commented 8 years ago
Ref: @mbc #76

 #75 rnbc...@gmail.com

"Yes, and send the X3Fs, that would be ideal! With the samples it's easy to see how
far they go :)"

"Maybe RawDigger decodes the values in the X3F in some different way?" 

"Decodes"? The X3F values are written to three channels as words (16-bit). RawDigger
displays X3F raw data as three channels in 16-bit format.

"Now, that would be interesting!"

That would cause me to remove the Application from my computer!!

Do you guys have a copy of the FOVb X3F file format document?

I take it that you don't have RawDigger.

"RawTherapee uses the values as decoded by dcraw and x3ftools."

Perhaps I am misunderstanding this use of the word "de-coding".

later,

Ted

Reported by xpatUSA on 2015-04-15 00:05:21

Beep6581 commented 8 years ago
Here's some SD9 shots to look at with their meta-data and histograms:

http://kronometric.org/phot/xfer/Issue%202729/

How do you like that 20,000 or so? Surprised me too . . .

Reported by xpatUSA on 2015-04-15 00:45:07

Beep6581 commented 8 years ago
Ted, the CC24 shot we need would better be under midday sunlight; this is very close
to 5000K-5500K these days .. :)
There is no real need for 6500K. But 2-3 shots under tungsten would be useful (I mean real
tungsten or halogen, 2800-3000K, not warm fluorescent), as the Foveon's response is much
different there.
Please use the middle third of the frame and expose as accurately as you can (shoot
-1/3, 0, +1/3)

Reported by iliasgiarimis on 2015-04-15 07:36:10

Beep6581 commented 8 years ago
About the coding/decoding:

For the SD14 and SD9 (maybe some others too) the values are coded (in the X3F) as 10-bit
values per channel (perhaps also with some compression, I didn't analyze foveon_decoder()
yet).
These 10-bit values are then used as indices into a lookup table of differences (the difference
between the current point and the preceding point) to calculate the final value. This lookup
table is also stored in the X3F file.

Here's an example of the lookup table content:

-16384  -16095  -16009  -15922  -15836  -15750  -15667  -15581  -15499  -15413  -15331
 -15246  -15164  -15082  -15001  -14919  -14838  -14756 ...............
.
.
.
..............  15001   15082   15164   15246   15331   15413   15499   15581   15667
  15750   15836   15922   16009   16384

Have a look at the following dcraw code (formatted for better readability; load_flags
is zero for SD9 and SD14, short diff[1024] is the lookup table):

void CLASS foveon_sd_load_raw()
{
    struct decode *dindex;
    short diff[1024];
    unsigned bitbuf=0;
    int pred[3], row, col, bit=-1, c, i;

    read_shorts ((ushort *) diff, 1024);   /* the 1024-entry difference lookup table stored in the X3F */
    if (!load_flags)
        foveon_decoder (1024, 0);          /* SD9/SD14: build the decode tree for the compressed stream */

    for (row=0; row < height; row++) {
        memset (pred, 0, sizeof pred);     /* the three running predictors restart at 0 on every row */
        if (!bit && !load_flags && atoi(model+2) < 14)
            get4();
        for (col=bit=0; col < width; col++) {
            if (load_flags) {
                /* packed variant: three 10-bit table indices in each 32-bit word */
                bitbuf = get4();
                FORC3 pred[2-c] += diff[bitbuf >> c*10 & 0x3ff];
            } else FORC3 {
                /* compressed variant: walk the decode tree bit by bit to get the next index */
                for (dindex=first_decode; dindex->branch[0]; ) {
                    if ((bit = (bit-1) & 31) == 31)
                        for (i=0; i < 4; i++)
                            bitbuf = (bitbuf << 8) + fgetc(ifp);
                    dindex = dindex->branch[bitbuf >> bit & 1];
                }
                /* the final value is the running sum of table differences along the row */
                pred[c] += diff[dindex->leaf];
                if (pred[c] >> 16 && ~pred[c] >> 16)
                    derror();
            }
            FORC3 image[row*width+col][c] = pred[c] < 0 ? 0 : pred[c];  /* negatives clipped to 0 */
        }
    }
}

Ingo

Reported by heckflosse@i-weyrich.de on 2015-04-15 12:29:23

Beep6581 commented 8 years ago
Addition: the lookup table in #79 was from an SD14 file.
Here's one from an SD9 file:

-16384  -7882   -7851   -7820   -7789   -7758   -7727   -7696   -7665   -7634   -7603
  -7572   -7542   -7513   -7484   -7455
.....
.....
7484    7513    7542    7572    7603    7634    7665    7696    7727    7758    7789
   7820    7851    7882    16384   16384

Reported by heckflosse@i-weyrich.de on 2015-04-15 13:14:41

Beep6581 commented 8 years ago
Thanks gents,

I'll try the CC shots with the SD9. The SD10 is not currently available (hot mirror
removed). The stops on the SD9 are 1/2 EV, not 1/3. I have one true incandescent bulb
or I have some Long-life GE Halogen floods, CCT unknown - which would you prefer?

Rain forecast daily until Monday - would 5500K flash be OK?

As to the "decoding", now I understand that the ADC outputs are written as 10-bit and
I have read somewhere that lossless compression is used. Perhaps like the Nikon D50's
compressed NEF, which has 0-4095 values in and 0-683 levels out?

You all probably know that RawDigger is written by Iliah Borg & Co and, as far as I
know, the Sigma decoding is based on dcraw routines - see libraw.com

Sorry, I'm not familiar with C syntax. Last serious programming for me was in FORTH-83
and Mac 68000 Assembly many years ago. Since then, I have dabbled with JavaScript but
not to any great extent and I took all JavaScript out of my website years ago.

Ted

Reported by xpatUSA on 2015-04-15 14:38:54

Beep6581 commented 8 years ago
Just for the sake of completeness, here are the complete lookup tables from the SD14 and SD9.

Ingo

Reported by heckflosse@i-weyrich.de on 2015-04-15 15:06:29


Beep6581 commented 8 years ago
Here are the SD9 CC24 card shots. All taken with WB set to flash (by mistake), but that
makes no difference to the RAW data. I assume you have SPP where the WB can be set appropriately
before exporting?

I will try attaching this time instead of link to my website . . .

. . . oops 10MB max not enough . . please wait . . . and wait . . and wait . . . .

OK here we are:

http://kronometric.org/phot/xfer/Issue%202729/SD9/CC24/

took 35 minutes over my satellite link :-(

Ted

Reported by xpatUSA on 2015-04-15 17:31:40

Beep6581 commented 8 years ago
Thanks for the samples!

The maximum value I see in those samples is 9550 in the red channel in lamp+0.5EV.X3F
and 8900 in sun+0.5EV.X3F

I understand you have a slow link... but can you provide some samples overexposed say,
2 stops, 4 stops, and 6 stops? Sunlight only is ok, I think!

And again, many thanks for the SD9 samples, that's a RARE camera indeed, keep it well!
:)

Reported by rnbc.r0 on 2015-04-16 00:49:03

Beep6581 commented 8 years ago
Ingo wrote:

"Thanks for the samples!

The maximum value I see in those samples is 9550 in the red channel in lamp+0.5EV.X3F
and 8900 in sun+0.5EV.X3F

I understand you have a slow link... but can you provide some samples overexposed say,
2 stops, 4 stops, and 6 stops? Sunlight only is ok, I think!"

I don't understand. I already provided an over-exposed white frame. How will over-exposing
CC24 shots help?

Ted

Reported by xpatUSA on 2015-04-16 04:07:03

Beep6581 commented 8 years ago
re #85 "Ingo wrote"

I didn't write that.

Ingo

Reported by heckflosse@i-weyrich.de on 2015-04-16 10:35:50

Beep6581 commented 8 years ago
Analyzing very overexposed frames is the only way to check the sensor saturation point...

Or you can just point at the sun with a wide angle lens (about 20mm), f/11, 1/125s,
and take a picture. The sun will be saturated in the sensor and the pixels around it
will show a gradient up to saturation. Not as scientific, but works in practice :)

Reported by rnbc.r0 on 2015-04-16 12:11:58

Beep6581 commented 8 years ago
re #85 Ingo wrote

"I didn't write that".

Sorry, it was mbc. I'm still getting used to who's who here :-(

" #87 rnbc...@gmail.com

Analyzing very overexposed frames is the only way to check the sensor saturation point...

Or you can just point at the sun with a wide angle lens (about 20mm), f/11, 1/125s,
and take a picture. The sun will be saturated in the sensor and the pixels around it
will show a gradient up to saturation. Not as scientific, but works in practice :)"

Thank you for the education :-(

I already sent an over-exposed frame and a black frame, a few days ago. I've re-uploaded
them and re-named them so it is obvious what they are.

See: http://kronometric.org/phot/xfer/Issue%202729/SD9/

Please let me know if they are unsatisfactory for some reason.

Ted

Reported by xpatUSA on 2015-04-16 13:48:46

Beep6581 commented 8 years ago
Here's some SD10 frames and histograms:

http://kronometric.org/phot/xfer/Issue%202729/SD10/

No sun forecast today, so CC24 shots will be later this week.

They will be at 0 EV bracketed +/- 1/3 EV, not over-exposed.

Ted

Reported by xpatUSA on 2015-04-16 18:58:31

Beep6581 commented 8 years ago
re #88, "Sorry....I'm still getting used to who's who here :-("

Ted, no problem ;-)

Ingo

Reported by heckflosse@i-weyrich.de on 2015-04-16 20:03:00

Beep6581 commented 8 years ago
So from lightFrame.X3F it seems the SD9 sensor saturates at >16200, or 16000 to be sure
we catch the white point.

Also, the first black pixels around the border can't be used, because they get some
light. Only the ones closer to the border can be used (but perhaps not too close either?).

Now, the SD10 sensor is the same as the SD9, except for microlenses, but I'm not sure
about the electronics. Perhaps the noise is different, and the white point also might
be.

Does anyone care to send a dark frame from an SD10, and also a completely saturated white
frame?

The SD15 is another beast also, because despite being the same chip as the SD14 there
is a variable amplifier fitted, so the output changes with the ISO.

Reported by rnbc.r0 on 2015-04-17 01:06:36

Beep6581 commented 8 years ago
"Now, the SD10 sensor is the same as the SD9, except for microlenses, but I'm not sure
about the electronics. Perhaps the noise is different, and the white point also might
be.

Does anyone care to send a dark frame from an SD10, and also a completely saturated white
frame?"

See comment #89

Reported by xpatUSA on 2015-04-17 02:55:40

Beep6581 commented 8 years ago
I committed the last patch to revision d80d58896deb to get more tests.
Issue stays open.

Ingo

Reported by heckflosse@i-weyrich.de on 2015-04-17 14:29:35

Beep6581 commented 8 years ago
Re: SD10 shots

The sun came out briefly today, so here's some CC24 shots, including Incandescent:

http://kronometric.org/phot/xfer/Issue%202729/SD10/CC24/

Time was approximately 10:30 hrs, Local Solar Time. Clear sky.

Ted

Reported by xpatUSA on 2015-04-17 17:37:46

Beep6581 commented 8 years ago
I am less pessimistic about a possible solution for the color shift issue caused by
off-angle rays. I do not know if the microlens is attached to the sensor with a transparent
glue or if there is an air gap between the bottom surface of the microlens and the silicon
surface. Also, the flow glass is not entirely even, because the metal layer density differs
considerably across the sensor area. From a purely geometric standpoint, an off-angle
ray has a 1/cos(p) longer path in which to generate carriers, so a different angle
produces a color sensitivity distribution with a red shift; the color sensitivity
curves are shifted to the right in a sensitivity-over-wavelength diagram. So, without an
off-angle measurement, if a usable curve exists, an off-angle curve could be guessed from
it. The effect is non-linear over angle but linear in the color transformation, so if the
non-linearity over angle is low, a simple linear interpolation of each matrix entry will
do the job. The lens/aperture variation could also be calculated.

I am interested in supporting the Foveon effort because it will open up, for Foveon users,
applications which are not foreseen and are closed off by the SPP owners. I have 3 SD14,
2 SD15 and 2 SD1M bodies, also with different firmware, and a great number of lenses. I
have ordered 2 color charts which will arrive next week, to place in the center and at the
corner. I think the proposed grey stripe from the center to the corner does not work,
because the off-angle color shift affects all 9 entries of the matrix. So for correct
testing we would need a number of checkers along the axis from the center to the corner.
A simple grey value does not provide enough dimensionality for a least-squares test of
color accuracy after, e.g., a bilinearly interpolated color transformation matrix. The
interpolation range should depend on lens type and aperture, but this could possibly be
simplified to one number to reduce further complexity.
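
As a rough sketch of that interpolation idea (the matrices and function names are hypothetical,
not an existing RT implementation): blend a centre matrix and a corner matrix per pixel by
the normalized distance from the optical centre, then apply the blended matrix:

/* Sketch only: per-pixel blend of two 3x3 camera-to-working-space matrices,
   weighted by the normalized distance r from the image centre (0 = centre,
   1 = extreme corner).  M_centre and M_corner are hypothetical; in a real
   calibration they would come from charts shot at the centre and at a corner. */
#include <math.h>

typedef double mat3[3][3];

void blend_matrix(mat3 a, mat3 b, double t, mat3 out)
{
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++)
            out[i][j] = (1.0 - t) * a[i][j] + t * b[i][j];
}

void apply_matrix(mat3 m, const double in[3], double out[3])
{
    for (int i = 0; i < 3; i++)
        out[i] = m[i][0] * in[0] + m[i][1] * in[1] + m[i][2] * in[2];
}

/* Correct one pixel at (x, y) in a width x height frame. */
void correct_pixel(mat3 M_centre, mat3 M_corner,
                   int x, int y, int width, int height,
                   const double rgb_in[3], double rgb_out[3])
{
    double dx = (x - 0.5 * width)  / (0.5 * width);
    double dy = (y - 0.5 * height) / (0.5 * height);
    double r  = sqrt(dx * dx + dy * dy) / sqrt(2.0);   /* 0 at centre, 1 at the corner */
    mat3 m;

    blend_matrix(M_centre, M_corner, r, m);
    apply_matrix(m, rgb_in, rgb_out);
}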

I think it is better that I prepare for the test so as to minimize the number of files,
so I need your help with the test setup. At first I will be limited to sunlight, with
1 EV stepping from -14 EV to -2 EV and then 1/3 EV stepping up to +2 EV, relative to the
guessed clipping point. Do you think the firmware is important? Which type of lens range
is best? You can inspect the cams and lenses on my dpreview profile (rf-design).

Reiner

Reported by reiner@rf-design.de on 2015-04-17 19:43:51

Beep6581 commented 8 years ago
Hi again, the SD10LightFrame.ppm doesn't seem fully saturated. Can you repeat the shot
with a higher exposure? There are zones where the highest values are around 8900, but
I know the saturation point of the SD10 is above 9900.

Just photograph the sun, for example, as I said previously. With a wide angle lens
at f/11 or so, it won't harm the sensor in any way.

Reported by rnbc.r0 on 2015-04-17 19:53:08

Beep6581 commented 8 years ago
re #97, reiner: Please wait until Ilias is back. He's the best to answer your questions
:-)

Ingo

Reported by heckflosse@i-weyrich.de on 2015-04-17 20:53:37

Beep6581 commented 8 years ago
Re: rnbc #98

"Hi again, the SD10LightFrame.ppm doesn't seem fully saturated."

Hello.

Well, I did not post a file "SD10LightFrame.ppm". I posted an X3F which according to
RawDigger is fully saturated up around 10,000. Did you not see this image?

http://kronometric.org/phot/xfer/Issue%202729/SD10/SD10LightFrameRawHistogram.png

"Can you repeat the shot with higher exposure? There are zones where the higher values
are around 8900, but I know the saturation point of the SD10 is above 9900.

Just photograph the sun, for example, as I said previously. With a wide angle lens
at f/11 or so, it won't harm the sensor in any way."

The shot was taken with the lens hard up against my monitor at f/2.8 and a 2 sec exposure
time. I doubt that I can expose the sensor much harder than that.

Having done my shots for this topic, my camera is converted back to IR duties, so no
more SD10 shots from me, sorry.

Ted.

Reported by xpatUSA on 2015-04-17 22:33:46

Beep6581 commented 8 years ago
Well, you're right: unless your monitor is very dim, f/2.8 and 2 seconds should saturate...

Anyway, having the camera without the infrared filter is not a problem for this kind
of samples. We just want to know how far the sensor goes before it saturates. So it's
irrelevant if the colors are fine or... whatever :)

That means I would put the white point at around 8500 or so, just to be sure. The white
point is different in different zones... oh well, I'm puzzled :P

Reported by rnbc.r0 on 2015-04-18 01:50:13

Beep6581 commented 8 years ago
re: #102 rnbc:

"The white point is different in different zones... oh well, I'm puzzled :P"

I didn't understand that comment. Could you explain, please?

Ted

Reported by xpatUSA on 2015-04-19 20:08:20

Beep6581 commented 8 years ago
I mean the sensor in your SD10 appears to saturate at different levels in different
points.

Reported by rnbc.r0 on 2015-04-20 02:04:24

Beep6581 commented 8 years ago
Is there a calculation or number which states the difference in illumination between
sunlight falling on a surface with 100% reflectance oriented at 0 degrees to the sun,
and direct sunlight? I simply want to know, in terms of EV, what the difference is between
direct and reflected sunlight.

Reported by reiner@rf-design.de on 2015-04-20 10:53:20

Beep6581 commented 8 years ago
Re: #105

"Is there a calculation or number which state the difference in illumation of sunlight
on a surface with 100% reflectance which 0-degree orientation to the sun and the direct
sunlight? I simply want to know in terms of EV what is the difference between direct
and reflected sunlight."

In this simple case, the difference is 0 EV. The case itself is rare, however. An example,
if I understand the question correctly, would be a very large mirror close to the means
of measuring illuminance.

This may be of help:

http://kronometric.org/phot/lighting/lighting%20handbook.pdf

Ted

Reported by xpatUSA on 2015-04-20 12:38:05

Beep6581 commented 8 years ago
This patch adds .badpixels file support for foveon. The offset for the foveon coordinates
is 0, so you can just use the coordinates as reported in the RT GUI.
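
For illustration only, a hypothetical badpixels file, assuming the plain one-coordinate-pair-per-line
layout (x then y, as shown in the RT GUI); see RawPedia for the authoritative file name and
location:

1284 903
1285 903
2310 1744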

Ingo

Reported by heckflosse@i-weyrich.de on 2015-04-21 12:43:23

Beep6581 commented 8 years ago
Small correction to last patch...

Reported by heckflosse@i-weyrich.de on 2015-04-21 12:49:54


Beep6581 commented 8 years ago
Not related to foveon sensors, but as I was working on this part of the code, I added .badpixels
file support for monochrome sensors (Leica Monochrome for example).

Ingo

Reported by heckflosse@i-weyrich.de on 2015-04-21 13:26:21