ColSebastianMoran
(IRL Richard Karash)
How to judge ETTR in Camera-Scanning? Generally, I want some ETTR on cam-scan of color-neg to avoid noise in the blue channel. Typically, that's +1EV with in-camera meter or TTL flash.
Yesterday, cam-scanning a roll of Fuji color negative film, I saw the on-camera histogram blinking a warning. Red clipping! OMG! Back off the exposure, right?
No... wrong. I checked the raw data in RawDigger; there is no clipping. So I started some experiments: with electronic flash exposure, a series from -2EV to +2EV. Hmm... Despite the orange mask, the red channel is not the highest; green is highest. Then a little research: each channel of raw data gets a multiplier when it is converted to jpg for the on-camera histogram, and the R and B multipliers seem to always be higher, often much higher, than the G multiplier.
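To make the arithmetic concrete, here's a toy example in Python. The multipliers and raw maxima are made-up, typical-looking numbers (not from any particular camera), and the tone curve and 8-bit scaling are ignored; it just shows how a WB multiplier can push R past the ceiling in the display histogram while the raw data stays well below saturation:

```python
# Sketch of why the display histogram can cry wolf on red: the JPEG pipeline
# applies per-channel white-balance multipliers before building the histogram.
# All values below are assumed, typical-looking numbers, not from any
# specific camera; tone curve and 8-bit scaling are ignored.
white_level = 16000                              # raw saturation point (camera-specific)
raw_max = {"R": 11000, "G": 15000, "B": 9000}    # hypothetical per-channel raw maxima
wb_mult = {"R": 2.0, "G": 1.0, "B": 1.5}         # assumed WB multipliers, G normalized to 1.0

for ch in ("R", "G", "B"):
    scaled = raw_max[ch] * wb_mult[ch]
    status = "CLIPS in display histogram" if scaled >= white_level else "ok"
    print(f"{ch}: raw max {raw_max[ch]:>6} -> after WB = {scaled:>7.0f}  {status}")

# R: 11000 * 2.0 = 22000 > 16000, so the display shows red clipping
# even though the raw red channel never came close to saturation.
```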
Hypothesis for judging ETTR:
- Green channel is always highest in the raw histogram (in the raw data, not the on-camera display).
- In-camera conversion to jpg for the on-camera histogram uses a 1.0 multiplier for G and higher multipliers for R and B.
- Often, the on-camera histogram shows R clipping when the raw data is fine.
- Therefore, judge ETTR just by looking at the G channel; ignore R.
- And the WB set in camera doesn't matter; just look at the G channel.
Thoughts?
If you have tools for examining the raw histogram, do you agree that G is always highest? Anybody know how these multipliers work? Is the G multiplier always 1.0? Does the on-camera histogram always reflect the raw data for the G channel? Is this crazy?
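If anyone wants to poke at their own files without RawDigger, here is roughly the check I mean, sketched in Python with the rawpy library. The file name is a placeholder, and details like channel order and white-level handling vary by camera, so treat it as a starting point:

```python
# Sketch of the kind of check RawDigger does, using the rawpy library
# (assumption: rawpy is installed; the file name is a placeholder).
# Prints the camera's WB multipliers and each CFA channel's raw maximum,
# so you can see whether only the display histogram was clipping.
import numpy as np
import rawpy

with rawpy.imread("scan_0012.RAF") as raw:           # placeholder file name
    data = raw.raw_image_visible
    colors = raw.raw_colors_visible                  # per-pixel CFA color index
    desc = raw.color_desc.decode()                   # e.g. "RGBG"
    white = raw.white_level

    print(f"white level: {white}")
    print(f"camera WB multipliers ({desc}): {raw.camera_whitebalance}")

    for idx, name in enumerate(desc):
        plane = data[colors == idx]
        if plane.size == 0:
            continue
        pct_at_clip = 100.0 * np.mean(plane >= white)
        print(f"{name}{idx}: max {int(plane.max())}  "
              f"({pct_at_clip:.2f}% of pixels at saturation)")
```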
On-camera histogram for a shot at +2EV. Note that R shows wicked clipping, G is just below clipping, and the black areas in the image are blinking:
Raw data histogram for the same shot at +2EV, from RawDigger. The max value for this camera is 16,000; the G channel is right at the edge, not clipping:
Note: I am NOT suggesting +2EV; this is too close to clipping in G.
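One way to put that margin in numbers: headroom in stops is log2(white level / G-channel maximum). With the 16,000 ceiling from the RawDigger shot and a made-up G maximum just below it:

```python
# Headroom in stops between the G channel's peak and raw saturation.
# 16000 is the white level mentioned above; 14500 is a made-up G maximum
# chosen only to illustrate how thin the margin gets near +2EV.
from math import log2

white_level = 16000
g_max = 14500

headroom_stops = log2(white_level / g_max)
print(f"G-channel headroom: {headroom_stops:.2f} stops")   # about 0.14 stops -- very tight
```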