Hatch
Established
http://pro.sony.com/bbsccms/ext/BroadcastandBusiness/ccdcmosimagers.shtml
A nice and simple (though lengthy) explanation of sensor technology.
He touches on diffraction at the end.
Although it focuses on movies, it of course applies to stills as well.
semilog
curmudgeonly optimist
Meanwhile: I think one should start with the simplifying assumption of unity quantum efficiency in the detector (sensor) and consider the noise problem from there.
I'm not sure we have any disagreement at all, actually.
Do we?
I think that each of these questions can be dealt with separately, yes. But setting aside QE doesn't mean that it's not central to the overall question of sensor performance.
1. Since the shot noise component of the total device SNR decreases as the number of detected events increases, it does not really matter HOW you increase the number of events. It can be done by making a bigger photosite (larger sensor pitch, use of microlenses), or it can be done by increasing QE of the photosensitive site itself.
2. The second major noise component – read noise – is a function of charge transfer efficiency (especially in a CCD), preamp quality, ADC circuitry, etc.
3. The third major noise component, dark current, is a lot less important at exposures shorter than say 1/50 s. The best way to get rid of dark current is to cool the sensor to below ambient temperature.
In the example that I linked, all three components are dealt with. Shot noise is dealt with by using large pixels and having ~95% QE. Read noise is dealt with by the EM-CCD sensor design, which brings us to <1 e- per pixel per read. Dark current (and, in these cameras, also read noise and charge-transfer losses) is minimized by running the camera at about -80 °C.
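To make the arithmetic concrete, here is a minimal back-of-the-envelope sketch (Python) of how those three noise terms combine into a per-pixel SNR. All of the numbers are illustrative assumptions, not measured values for any particular sensor:

```python
import math

def pixel_snr(photons_per_um2, pitch_um, qe, read_noise_e, dark_e_per_s, exposure_s):
    """Per-photosite SNR, combining shot noise, dark current, and read noise."""
    signal_e = photons_per_um2 * pitch_um ** 2 * qe   # detected photoelectrons
    shot_var = signal_e                               # Poisson: variance equals mean
    dark_var = dark_e_per_s * exposure_s              # dark current is also Poissonian
    read_var = read_noise_e ** 2                      # electronics noise, added in quadrature
    return signal_e / math.sqrt(shot_var + dark_var + read_var)

# Illustrative only: small warm pixel with modest QE vs. large, cooled, ~95% QE pixel
print(pixel_snr(50, 6.0, 0.5, read_noise_e=5.0, dark_e_per_s=0.1, exposure_s=1/50))
print(pixel_snr(50, 13.0, 0.95, read_noise_e=1.0, dark_e_per_s=0.001, exposure_s=1/50))
```

A larger photosite or higher QE raises the detected signal (and hence SNR), cooling suppresses the dark term, and better electronics shrink the read term – which is exactly the division of labour described in points 1–3 above.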
The examples from low-light scientific imaging are absolutely relevant to more conventional imaging cameras because they show what can be achieved and how that's done. Smaller pixels, as you indicate, are smaller photon buckets. That's precisely why it's essential to suppress read noise and maximize QE in these cameras.
Finally, for non-quantitative purposes such as imaging, it can (under at least some circumstances) be advantageous to have more, noisier pixels, and then to use binning or more sophisticated spatial correlation functions to deal with noise algorithmically. By doing all of these things at once, it's possible, for example, to make the sensor in the two-generation-old low-end consumer Pentax K-x perform at a level that rivals the Nikon D700, with its much larger pixels [link].
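As a toy illustration of the binning point, averaging each 2x2 block of noisy small pixels roughly halves the per-pixel noise (sqrt(4) = 2) at the cost of resolution. The signal and noise figures below are made up purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
true_signal = 100.0                                   # electrons per small pixel
noisy = rng.poisson(true_signal, size=(512, 512)).astype(float)
noisy += rng.normal(0.0, 5.0, size=noisy.shape)       # add Gaussian read noise

# 2x2 binning: average each block of four neighbouring pixels
binned = noisy.reshape(256, 2, 256, 2).mean(axis=(1, 3))

print("per-pixel noise before binning:", noisy.std())
print("per-pixel noise after  binning:", binned.std())
```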
The post that I was responding to argued that bigger sensors don't benefit from back-thinning (back-side illumination). In my view, they clearly can benefit from this technology, and I explained why.
semilog
curmudgeonly optimist
http://pro.sony.com/bbsccms/ext/BroadcastandBusiness/ccdcmosimagers.shtml
A nice and simple (though lengthy) explanation of sensor technology.
He touches on diffraction at the end.
Although it focuses on movies, it of course applies to stills as well.
Really good.
For those who prefer text to video, there are some superb primers here:
CCD | CMOS
hlockwood
Well-known
I'm not sure we have any disagreement at all, actually.
Do we?
SNIP
I think we can put the issue to rest, especially since it has wandered pretty far OT.
Harry
Hatch
Established
Thanks for the links.
Interesting topic all round.
It's been a while since I was fluent in quantum mechanics, so it's a nice way to delve into my memory and see if I can drag it up.
Jim Evidon
Jim
Formula-shmormula!
The NEX 7 is using a new sensor. It is probably the same one that is coming out in the new A77. It is not out yet, notwithstanding DP Review publishing some quick-and-dirty (just a figure of speech; no deprecation intended) test shots with a pre-production model.
If one was able to predict performance based on existing hardware by applying a math formula, that would be wonderful. But one cannot.
For example, my 4/3 Lumix GF1 blows away my Nikon D300 at low to moderate ISO settings, and that is with a smaller sensor. The reason: later-generation technology.
I suggest that we all be patient and wait for the actual product to come on the market for some serious evaluation.
John Robertson
Well-known
Pre-test of the NEX 7 and A77 in this week's (UK) Amateur Photographer.
semilog
curmudgeonly optimist
If one was able to predict performance based on existing hardware by applying a math formula, that would be wonderful. But one cannot.
:bang: What, exactly, do you think engineers do for a living?
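In the photon-noise-limited regime, a first-order prediction really is just arithmetic: the light collected scales with sensor area, and shot-noise SNR with its square root. A rough sketch, using assumed areas and an assumed QE rather than measured values:

```python
import math

def relative_snr(area_mm2, qe):
    # photons collected ~ area * QE; shot-noise-limited SNR ~ sqrt(photons)
    return math.sqrt(area_mm2 * qe)

# Illustrative comparison: APS-C vs. Four Thirds at the same f-stop and exposure
aps_c = relative_snr(area_mm2=23.5 * 15.6, qe=0.5)
four_thirds = relative_snr(area_mm2=17.3 * 13.0, qe=0.5)
print("APS-C vs 4/3 shot-noise advantage: %.2fx" % (aps_c / four_thirds))
```

Second-order effects (read noise, dark current, microlens efficiency, and so on) shift the answer, but they are also things engineers model and measure rather than guess at.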
vidgamer
Established
That might be the claim, but in my real-world experience it is wrong, wrong, wrong.
I want to say that Sony made that claim... basically, that they can achieve 2x sensitivity with the small 1/2.5" sensors, but with larger sensors, they would not see that kind of increase in sensitivity.
Something else I thought of -- some have been suggesting that backlit sensors may cope better with pancake and RF lenses -- or lenses that aren't retrofocus designs. Most sensors have problems with the more oblique rays.
So, if there were no costs to it, perhaps it is indeed better, even for general photography. But I wouldn't expect it to be a silver bullet that will give APS-C cameras a 2x sensitivity boost like it does for the iPhone 4 or small P&S cameras.
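One crude geometric way to see why the gain shrinks with pixel size: front-side wiring shades a roughly fixed-width border of each photosite, which costs a tiny pixel proportionally far more light-gathering area than a big one. The sketch below ignores microlenses, and the wiring width is a made-up illustrative number:

```python
def fsi_fill_factor(pitch_um, wiring_um=0.25):
    """Fraction of a front-illuminated pixel left open after a fixed wiring border."""
    open_side = max(pitch_um - 2 * wiring_um, 0.0)
    return (open_side / pitch_um) ** 2

# Phone-, compact-, and APS-C-class pixel pitches (microns)
for pitch in (1.4, 2.0, 4.8):
    ff = fsi_fill_factor(pitch)
    print("pitch %.1f um: FSI fill factor %.2f -> potential BSI gain ~%.1fx"
          % (pitch, ff, 1 / ff))
```

With those assumptions the recoverable gain is roughly 2x for phone-sized pixels but only a modest fraction for APS-C-sized ones, which is consistent with the claim above.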
...
Essentially all of the best cameras for astronomy and optical microscopy are back-thinned, and they generally have larger photosites. But the sensors are both mechanically fragile and rather expensive. The 512 x 512 (0.25 Mpix) camera linked to above is in the $30,000 range.
I don't think that'd sell well as a general-purpose photography tool.
There's probably more to it than just being a backlit sensor.