bwcolor
Veteran
DxO is rubbish. They report the Zeiss Distagon ZE 21mm f/2.8 as one of the lowest-resolving lenses they've tested for Canon EOS (when in fact it's probably one of - if not THE - highest-resolving lenses available for EOS). The same goes for the other Zeiss lenses they've tested...
They're just plain wrong.
Good... The Zeiss 21mm f/4.5 C and 35mm f/2.8 C, along with the Zeiss 100mm f/2.8 Macro, are the lenses that I intend to use with the NEX-7, should the reviews and the EVF pan out.
bugmenot
Well-known
DxOMark reported that my favorite lens, the Zeiss 35mm f/2.8 compact, didn't shine.
There is no mention of this lens on their site.
umcelinho
Marcelo
Sam N
Well-known
Interesting to note:
-12mp m43 cameras and 18mp Canon DSLRs (7D, T2i, etc.) have basically the same pixel density.
-16mp m43 cameras (G3, GH2) have very similar pixel density to the new Sony 24mp APS-C sensor.
Anyway, what's wrong with being diffraction limited? A lens that's sharp at F8 will be no worse on a 24mp sensor than a 12 or 14 or 16 or 18mp sensor.
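To sanity-check the pixel-density claim, here is a minimal sketch in Python (the sensor widths and pixel counts below are approximate nominal figures, not taken from this thread):
Code:
# Rough pixel-pitch comparison; sensor widths (mm) and horizontal pixel
# counts are approximate, so treat the output as ballpark figures only.
sensors = {
    "m4/3 12MP (E-P3)":        (17.3, 4032),
    "Canon APS-C 18MP (7D)":   (22.3, 5184),
    "m4/3 16MP (G3)":          (17.3, 4592),
    "Sony APS-C 24MP (NEX-7)": (23.5, 6000),
}

for name, (width_mm, h_pixels) in sensors.items():
    pitch_um = width_mm / h_pixels * 1000  # pixel pitch in micrometres
    print(f"{name:25s} pitch ~ {pitch_um:.2f} µm")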
Avotius
Some guy
I would be skeptical of any info that comes from the dpreview forums - that is not a bright bunch over there.
sevo
Fokutorendaburando
Anyway, what's wrong with being diffraction limited? A lens that's sharp at F8 will be no worse on a 24mp sensor than a 12 or 14 or 16 or 18mp sensor.
It will be no worse. But it will be no better either, at which point the larger size and/or higher pixel count of the sensor is wasted.
The whole thing is an old problem in photography - to gain in quality you have to step up disproportionately rather than linearly in size, as you have to design for a lower resolution (larger CoC) to make up for escalating diffraction. Hence large (and even medium) formats are very much (rather than only slightly) larger than small format...
Conversely, going smaller in film size kept gaining resolution throughout the period when new discoveries in chemistry made proportional improvements to film resolution possible - that is what pushed 35mm into the mainstream.
In the long run it means there will be an inevitable trend towards larger sensors (and a revival of digital medium and eventually large format) in the pro market once the FF sensor pitch hits the optical limits (scheduled to happen at the next major pro DSLR release cycle) - Leica have wisely already got into that camp.
APS-C has already pretty much hit its limits, what you buy now will not be significantly outdated in terms of sensor specs any more.
Matus
Well-known
I see that I am not the only one who is not comfortable with the way the diffraction limit of 8x10 print was obtained. So let me dig a bit into that.
To be able to estimate at which point the diffraction becomes visible, I decided (based on THIS online calculator) that once the circle of confusion (CoC) reaches the size of two pixels we hit the diffraction limit (see the above link for details).
First of all, let's check at which f-stop the output from a given sensor will be diffraction limited. This is easy to follow, as the only variable here is the pixel size. As the comparison above was between the X100 and the NEX-7, let's have a look at these two:
X100:
pixel size: 5.9 µm
diffraction limited at: ~ f/8
CoC ~ 12 µm
NEX-7:
pixel size: 4.2 µm
diffraction limited at: ~ f/4
CoC ~ 8 µm
So far so good. NEX-7 has smaller pixels so it gets diffraction limited at larger apertures - no surprise here.
Now, however, we move to prints. Say we want to make an 8x10" print at 300 dpi. For that we need at least 7.2 Mpixels, and both cameras offer more than that. In other words, we need at least 7.2 effective Mpixels to get a sharp 8x10" print. So we can just compute the size of the circle of confusion for a diffraction-limited 7.2-Mpixel APS-C-sized sensor. This is about 19 µm and is reached at about f/13.
And that is the f-stop at which ANY camera with an APS-C-sized sensor of at least 7.2 Mpixels will be diffraction limited for an 8x10" print.
This limiting f-stop would only change if:
- the sensor were larger than APS-C -> the limiting f-number would be higher (you could stop down further)
- the sensor were smaller than APS-C -> the limiting f-number would be lower
- the sensor had fewer than 7.2 Mpixels -> the limiting f-number would be higher (bigger pixels tolerate more diffraction)
Of course, the f-stops computed above would shift if one considered a different definition of the diffraction limit (a different limiting CoC size).
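Here is a minimal Python sketch of the same arithmetic, assuming the simple Airy-disk rule (disk diameter ≈ 2.44 × wavelength × f-number at ~550 nm); the calculator linked above applies its own visibility criterion, so the exact f-numbers it reports differ somewhat from this back-of-the-envelope version:
Code:
# Back-of-the-envelope diffraction limit: the f-number at which the Airy
# disk diameter (~2.44 * wavelength * N) reaches a chosen circle of
# confusion (CoC).  Rule of thumb used above: CoC = 2 x pixel pitch.
WAVELENGTH_UM = 0.55  # green light, in micrometres

def limiting_f_number(coc_um):
    """f-number at which the Airy disk diameter equals the given CoC."""
    return coc_um / (2.44 * WAVELENGTH_UM)

# Per-pixel limits (pixel pitches in micrometres, as quoted above)
for camera, pitch_um in [("X100 (12 MP)", 5.9), ("NEX-7 (24 MP)", 4.2)]:
    coc = 2 * pitch_um
    print(f"{camera}: CoC ~ {coc:.0f} µm -> diffraction limited near f/{limiting_f_number(coc):.1f}")

# Print-referred limit: an 8x10" print at 300 dpi needs ~7.2 MP, which on
# an APS-C sensor corresponds to a CoC of roughly 19 µm.
print(f"8x10\" print from APS-C: CoC ~ 19 µm -> about f/{limiting_f_number(19):.0f}")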
BillBingham2
Registered User
This fancy math is making photography seem too much like school. I'm going back to film!
This conversation reminds me of one I was in the middle of at the Apple World Wide Developers Conference back in the late '80s. The head of chip design and my boss were talking about pushing electrons around a circuit board for chips faster than 1 MHz and the potential for them flying off. Well, they figured something out. Theories are just that, theories, things you can break past.
B2
kshapero
South Florida Man
Here is my simple math: I preordered the NEX-7. When it comes in, and if for some reason I don't like it, I have 14 days (or 30) to send it back for a 100% refund. So 14 or 30 = 100.
Limited brain capacity requires me to stick to simple math.
sevo
Fokutorendaburando
Theories are just that, theories, things you can break past.
No, in a scientific sense they aren't. If you can break past a theory, you proved it wrong (or rather, proved its limitations). What's more, theories merely explain the "how" behind a demonstrable phenomenon - the latter exists regardless of how many conflicting theories attempt to explain it, no matter whether any of the theories is right, and (unless we enter the realm of radical subjectivism) it already existed before there was any theory.
No matter how we might redefine the laws of physics and optics, a different "how" will not alter the easily observable phenomenon that photographs do not get any better defined if we stop down beyond the diffraction limit...
Matus
Well-known
C'mon guys - my numbers are simple 
But surely anybody is most welcome to ignore the cumbersome math and just make a few prints at different f-stops and see. That will give you a much better idea than a bunch of numbers (in this case), though the 'numbers' may serve as a reasonable starting point for that kind of test.
I just find it hard to ignore my scientific background sometimes. Bear with me, please. Hey, I already skipped using LaTeX on photo forums.
ColSebastianMoran
( IRL Richard Karash )
Hey, everyone, this isn't so hard. For anyone who has read maths at university...
1. More pixels give bigger prints, but only if the lens can put real data, not blur, into the pixels. Even for the highest quality lenses, diffraction reduces resolution when stopped down too far. What is too far? When is diffraction more limiting than the pixels in your sensor?
2. At high pixel density, this can be at a surprisingly low numbered f-stop.
3. For my Lumix LX5, for example, we have lots of pixels in a small sensor. When the lens is stopped down beyond f/4.6, diffraction limits the resolving power of the camera system. I don't use f/11 on this camera.
4. When we put old glass on the new digital bodies (I do it... It's very enjoyable) watch your f-stop. If you are safe from diffraction problems in 35mm at f/11 or f/16, it might be a much lower numbered f-stop with APS, u4/3, or other small sensors.
5. With smaller sensors, there are fewer f-stops not affected by diffraction.
6. For example, my little Canon A560IS P&S basically shoots everything at f/4 or f/5.6. That's for good reason; image quality is reduced by diffraction if stopped down beyond f/4. In practical terms, I don't have much choice of f-stops with this camera.
7. For 12MPx or 14MPx APS sensors, diffraction hurts beyond f/11.
These considerations are real and affect our images. Photography has always included technical considerations. I like that.
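One rough way to put point 4 into numbers - a sketch assuming the print-referred rule of thumb that the diffraction-safe f-number scales down with the crop factor (the crop factors and the f/16 baseline below are illustrative assumptions):
Code:
# If f/16 is considered safe from diffraction on 35mm film for a given
# print size, scale it down by the crop factor to get a comparable limit
# on smaller formats (same framing, same print size).
crop_factors = {
    "35mm full frame": 1.0,
    "APS-C": 1.5,
    "Micro 4/3": 2.0,
    'small compact (e.g. 1/1.63")': 4.7,
}

SAFE_ON_FILM = 16.0  # f-number taken as diffraction-safe on 35mm film

for fmt, crop in crop_factors.items():
    print(f"{fmt:30s} roughly f/{SAFE_ON_FILM / crop:.1f}")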
willie_901
Veteran
Don't Feel Bad
JS and John,
Don't worry. No one understands diffraction. A lot of people know how it works. No one knows why it's that way.
The observation of diffraction is the basis of the quantum-mechanical mystery of wave-particle duality. The camera sensor counts photons as if they are particles, and at the same time diffraction can occur, which can only be explained by the wave nature of light. Every time someone takes a digital photograph with diffraction present, they are duplicating an experiment that forced people to acknowledge that Newtonian physics alone cannot describe the measurable universe.
Field
Well-known
For those not technically inclined, in reference to my earlier statement -
Technically, you cannot limit the size of a print with diffraction. If the on-board computer were smart enough to shrink the picture to compensate, it would decrease resolution, but cameras currently do not do this... What happens is that photos simply lose their sharpness as you stop down farther and farther into the zone of increasing diffraction. At some point your depth of field can even appear to decrease, depending on the complexity of the background - it can also get all kaleidoscopic on you. You can still print large, but up close it will look weird, particularly around the edges, and obviously poor.
ColSebastianMoran
( IRL Richard Karash )
Hmm... Anyone have a pointer to sample images that show the impact of diffraction? Especially for small-sensor cameras?
willie_901
Veteran
Gravity Is A Theory Too
.... Theories are just that, theories, things you can break past.
B2
People have been trying very hard to break past quantum mechanics (QM) theory since 1910 or so. For instance, Einstein hated the teleological implications of QM and he couldn't beat it. The first person who can make measurements that violate QM will win a Nobel Prize and become one of the most celebrated physicists in history. The probability that a camera manufacturer is going to overcome QM diffraction limits is immeasurably small.
bfffer
Established
I own a NEX-5 and am quite disappointed.
1. Focus is slow
2. Focus is inaccurate
3. Too much shutter vibration
4. Shutter is too loud
If they fix these for the NEX-7, it will be gold.
semilog
curmudgeonly optimist
APS-C has already pretty much hit its limits, what you buy now will not be significantly outdated in terms of sensor specs any more.
I respectfully disagree. APS-C (and 4/3) are gradually approaching real physical limits but have not yet hit those limits.
First, current sensors are not close to oversampling a lens that is diffraction-limited at f/2.
Second, the calculations that we're all talking about are for a monochrome sensor! Remember that the specified pixel pitches for real digital cameras are not for monochrome sensors – they are for Bayer RGB arrays.
On a Bayer array, the real pitch for red and blue light is 2x the specified pitch, and for green light it's roughly 1.4x (the square root of 2) the specified pitch. The "pixels" generated by a RAW converter are demosaiced interpolations!
In addition, the specified pitch is only correct along the horizontal or vertical axes. The pitch on a rectangular array along the diagonal is, again, a factor of roughly 1.4 (the square root of 2) bigger than the specified pitch.
Thus the specified pitch of a real, practical RGB color sensor considerably overstates the actual spatial resolution of the sensor. Real resolution is (depending on the color of the incident light, the axial tilt, the presence of antialiasing filters, and other factors) always much worse than the pixel pitch would suggest.
Third, there can be significant technical advantages to oversampling, not least of which is the fact that you no longer need to correct for aliasing with optical or digital antialiasing filters. There's also the fact that Nyquist sampling doesn't work when the signal is noisy. Spatial averaging of an oversampled signal can, depending on the noise costs, be a good way to compensate for this.
These considerations all argue for the technical merits of pixel arrays considerably denser than the ones currently available.
Moreover, with sufficient computational power, it is possible to exceed the Abbe limit if you know the lens's point spread function and can computationally deconvolve. This has been a standard technique in optical microscopy for well over a decade (and in astronomy before that), and it can provide roughly a factor of 2 increase in effective linear resolution. I'd wager that this is already being done in the iPhone 4's (stunningly good for its size) camera – which has a pixel pitch of ~2 µm (500 px/mm)! Don't think for a moment that Apple and Sony don't know what they're doing with that sensor.
(Note: a FF DSLR with 2 µm sensor pitch would be 12,000 x 18,000 = 216 Mpix. And an APS-C camera with that pitch would be roughly 92 Mpix!)
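Both bits of arithmetic above, sketched in Python (the sensor dimensions are nominal assumptions, and real demosaicing is of course more involved than a fixed per-channel factor):
Code:
import math

# Effective per-channel sampling pitch on a Bayer array, for a
# hypothetical 2 µm photosite pitch as in the iPhone 4 example.
pitch_um = 2.0
print(f"red/blue pitch : {2 * pitch_um:.1f} µm (every other photosite)")
print(f"green pitch    : {math.sqrt(2) * pitch_um:.1f} µm (diagonal lattice)")

# Pixel counts at a 2 µm pitch, using nominal sensor dimensions in mm.
for name, (w_mm, h_mm) in {"full frame": (36.0, 24.0), "APS-C": (23.5, 15.6)}.items():
    cols = w_mm * 1000 / pitch_um
    rows = h_mm * 1000 / pitch_um
    print(f"{name:10s}: {cols:.0f} x {rows:.0f} ~ {cols * rows / 1e6:.0f} Mpix")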
Finally, there's a long way to go (with respect to the theoretical limits) with APS-C, in terms of sensitivity, dynamic range, and color space. To take just one example: none of the available APS-C sensors is backside-illuminated (as the iPhone 4's camera is). That change alone can give you about a full stop of real sensitivity with no increase in read noise.
So, no, I don't think that we're done with APS-C or 4/3 or even 35mm sensor development. Not really even close.
Dr Gaspar
Established
Could someone translate this into human language? From what I got, depending on the sensor, you could be getting less resolution from a larger f-stop number?