Nigel Meaby
Well-known
I believe you when you say they have wonderful detail, and thanks for posting them, but they both look pretty flat and lifeless to me. Has anyone seen any examples posted yet that have wonderful shadow detail, sumptuous blacks and plenty of highlight detail?
The Sobol photos on the Leica website are over-processed and way too contrasty for my liking, and the other examples I've seen have been flat and grey.
I'm not saying this camera can't produce but I'd really like to see some good examples!
BobYIL
Well-known
For Dante and others who would like more info on aspects of photon counting, diffraction and the Airy disk, as well as the effects of the OLPF on apparent resolution, I suggest a very recent article by Allan Angus; I think you'll find it quite insightful.
http://stonerosephotos.com/blog/2012/04/resolution-and-diffraction-limits-in-dslrs/
Audii-Dudii
Established
FYI, there is a comparison between the Phase One P45+ and Achromatic+ medium-format digital backs (which share the same basic sensor) at the end of this article: http://www.luminous-landscape.com/reviews/cameras/achromatic.shtml
While it's clear there is a resolution difference between the output of the two backs, it's equally clear the resolution difference is not a factor of two...
135format
Established
But the interpolation from the Bayer filter is for the pixel colour and not the luminosity; presumably, you're seeing more or less the same content as without the filter, sans colour
Both colour and luminosity are being interpolated, I believe. You are seeing less with a Bayer filter. The new sensor will be technically more accurate, there is no denying that, but its tonality appears to be off and that is its drawback. Every image needs processing corrections.
You could argue that Leica have played it smart by not trying to make the output look like film, so that users can pick a film type in SEF. You could also argue that their wealthy client base would much rather have a film-looking image straight from the camera, which is a destructive process, i.e. it will reduce the resolving power of the camera's output. I think RAW should be unprocessed, and there should be an option to select some film types in camera, even if it does reduce the perceived resolving power of the output.
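To make concrete what I mean by both colour and luminosity being interpolated, here is a minimal sketch of bilinear Bayer demosaicing (a generic textbook RGGB layout and kernels, not what Leica's or any particular raw converter's pipeline actually does):

```python
import numpy as np
from scipy.ndimage import convolve

def bayer_mosaic(rgb):
    """Sample a full-colour image through an RGGB Bayer pattern: one value per pixel."""
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w))
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]   # R sites
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]   # G sites
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]   # G sites
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]   # B sites
    return mosaic

def demosaic_bilinear(mosaic):
    """Rebuild R, G, B planes by averaging neighbouring samples. Every output pixel
    is a weighted mix of its neighbours, so luminance detail is interpolated too."""
    h, w = mosaic.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1
    g_mask = 1 - r_mask - b_mask
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0   # kernel for the sparse R/B planes
    k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0   # kernel for the denser G plane
    planes = [convolve(mosaic * m, k, mode='mirror')
              for m, k in [(r_mask, k_rb), (g_mask, k_g), (b_mask, k_rb)]]
    return np.dstack(planes)

# A monochrome sensor keeps one real measurement per pixel; the Bayer path above
# throws two of the three colour samples away at capture and estimates them back.
```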
135format
Established
This one shows (remind me why I never upload photographs to this forum, they must be so small as to be practically worthless....) the highlight situation. I should have exposed more for the highlights and pulled up the shadows which have a wealth of detail and no noise to speak of (@ ISO 5000!!!)
View attachment 90790
Jaap, since you have an M9 and an M Monochrom, could you do the comparative test I suggested: take the same image from both, convert each to a film type (Tri-X or HP5) in SEF, and compare the resulting images?
jaapv
RFF Sponsoring Member.
No, I don’t have one. I was just playing around with a pre-production one before the official presentation. I snitched it off a beta tester. I considered making off with it, but he could run faster than I can....very unsportingly he wanted it back.
timor
Well-known
Thank you for the explanation and the link. So it will be based on the intrinsic spectral sensitivity of the sensor, any overlying coverglass, and the lens configuration and coatings. Nearly all scientific CCDs and CMOS devices are monochrome; depending on the application, they may be optimized to deliver different spectral sensitivity curves. Here are some examples.
jaapv
RFF Sponsoring Member.
Both colour and luminosity are being interpolated, I believe. You are seeing less with a Bayer filter. The new sensor will be technically more accurate, there is no denying that, but its tonality appears to be off and that is its drawback. Every image needs processing corrections.
You could argue that Leica have played it smart by not trying to make the output look like film, so that users can pick a film type in SEF. You could also argue that their wealthy client base would much rather have a film-looking image straight from the camera, which is a destructive process, i.e. it will reduce the resolving power of the camera's output. I think RAW should be unprocessed, and there should be an option to select some film types in camera, even if it does reduce the perceived resolving power of the output.
All files you see are pre-production ones. Leica is fully aware of the tonality discrepancy. Basically it is too sensitive to green. Not that that is a bad thing, as green is where lenses are at their optimum.
Anyway, there will be several firmware updates precisely because of this before the camera is released for sale. The aim is to get the response similar to panchromatic film. I doubt it will be 100%, but it will be close enough not to matter.
Another thing is that you can get out your old box of yellow, red, etc. filters and start being a photographer again.
The other firmware correction will be to tweak the exposure metering a bit, to get the camera to pull away from highlights a bit more.
And a few other minor gremlins.
So, not to worry.
jaapv
RFF Sponsoring Member.
Thank you for adding scientific insight to my layman’s explanation. Your figures correspond to the ones quoted to me by Leica and various experts I spoke to.
Having no idea of the extent of this "significantly higher resolution", I will try to comment in very simple terms:
Anything standing in the path of the light reaching the "wells" on the sensor, even transparent elements, causes a loss of resolution as well as of sensitivity (quantum efficiency) for the sensor as a whole. An IR filter, a Color Filter Array, an antialiasing filter, a wave plate or any optical glass means a certain loss of resolution and a certain drop in sensitivity (for most of us, the number of photons collected in the "well" in a given time).
The red, green and blue color filters cause up to a 30% drop in the intensity of the light passing through them, which corresponds to up to four stops less ISO sensitivity in conventional terms. (Trick here: green sits around the middle of the visible spectrum, so our eyes see more sharply with green than they do with the "side" colors like red and blue. That's why Dr. Bayer employed two greens with one blue and one red in his pattern; never mind luminance or chrominance sensitivities.)
Assuming the usual demosaicing algorithms, eliminating the antialiasing filter is generally reckoned to improve the resolving power of a sensor by around 10%. That is the increase in actual resolving power; increases in contrast, crispness, acuity, etc. are different subjects. As you note, the use of the AA filter can be likened to closing a lens down further once diffraction starts to appear.
Eliminating the CFA, besides raising the sensitivity considerably, also lets the resolving power of the sensor approach its "native" value. What we are doing here is really nothing but trying to retrieve the original resolving power and the original sensitivity by removing the losses caused by what we placed on the sensor to get color pictures and to stay away from moiré.
How far can the elimination of the CFA help a sensor's resolution? Usually around 25%, depending on the algorithm used. To get an idea, first check the sample comparisons between the D800 and D800E; they are about 10% apart due to the elimination of the AA filter (a difference some Nikon users admit they either cannot see or can make up with the post-processing sharpening slider alone). With crude calculations one can state that the resolution of the M9M could be equivalent to that of a 28 MP color sensor.
The following chart illustrates a typical example of the same sensor's resolving power with and without the CFA.
[Chart: the same sensor's resolution with and without the CFA (image not preserved)]
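A quick back-of-the-envelope check of that arithmetic (resolution gains are linear, so pixel counts scale with the square; the 25% figure is the estimate above, not a measured value):

```python
mp = 18.0                   # M Monochrom / M9 pixel count, in megapixels
linear_gain = 1.25          # ~25% linear resolution gain attributed above to dropping the CFA
print(mp * linear_gain**2)  # 28.125 -> the "equivalent of a 28 MP colour sensor"

# For contrast: a true factor-of-two resolution gain would require 4x the pixels.
print(mp * 2**2)            # 72.0
```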
135format
Established
Colour doesn't exist. It is a human perception, an interpolation from electromagnetic waves. Sensors just measure electromagnetic waves, and Bayer filters filter wavebands, not colour. We only assume the sensors are seeing colour because the filters are made of materials that we perceive as red, blue or green. In reality they are measuring electromagnetic waves of wavelengths that we convert to colour in our visual perceptual system. So no sensor sees in colour. They see luminosity, which is an accumulation of all the wavelengths that hit them.
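If it helps, here is what "an accumulation of all the wavelengths" looks like as a calculation; the numbers are made up for illustration and are not any real sensor's QE curve:

```python
import numpy as np

# A monochrome pixel just integrates incoming spectral power weighted by the
# sensor's quantum efficiency: one number out, no notion of colour anywhere.
wavelengths_nm = np.arange(400, 701, 50)                          # 400..700 nm, 50 nm bins
scene_spd = np.array([0.2, 0.5, 0.9, 1.0, 0.8, 0.5, 0.3])         # spectral power, arbitrary units
sensor_qe = np.array([0.30, 0.50, 0.60, 0.60, 0.55, 0.45, 0.30])  # made-up QE curve

signal = np.sum(scene_spd * sensor_qe) * 50.0   # crude rectangle-rule integral over wavelength
print(signal)                                   # the single luminosity value the pixel records
```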
nemjo
Avatar Challenge
re resolving power
Hi all,
regarding the announced Leica MM, I often read about its immense resolving power.
But hey, this type of camera was born to be handheld!
Is it realistic that, without a sturdy (and heavy) tripod and cable release, the benefit of that power can be seen in the pictures?
Without pixel-peeping?
Just a question.
nemjo
jaapv
RFF Sponsoring Member.
Answer: just take the best images you can, enjoy the huge dynamic range and clean greytone transitions and don’t worry about resolution nonsense.
When you were taking slides, would you have walked up to the screen to look for gnat’s whiskers? I thought not.
semilog
curmudgeonly optimist
Falk Lumo has just posted a reasonably careful comparison of D800 and D800E resolution.
semilog
curmudgeonly optimist
Hi all,
regarding the announced Leica MM, I often read about its immense resolving power.
But hey, this type of camera was born to be handheld!
Is it realistic that, without a sturdy (and heavy) tripod and cable release, the benefit of that power can be seen in the pictures?
Without pixel-peeping?
Just a question.
nemjo
The high sensitivity means higher shutter speeds. Under field conditions, shutter speed will make a bigger difference in handheld sharpness than the intrinsic differences in sensor resolution under discussion in this thread.
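A rough sketch of why that matters (the ISO and focal-length figures are purely illustrative, not M Monochrom specifications):

```python
import math

focal_length_mm = 50
handheld_limit_s = 1.0 / focal_length_mm      # classic 1/focal-length hand-holding rule: 1/50 s

base_iso, usable_iso = 640, 5000              # purely illustrative figures
stops_gained = math.log2(usable_iso / base_iso)

print(round(stops_gained, 1))                          # ~3.0 stops of headroom
print(round(handheld_limit_s / 2 ** stops_gained, 5))  # 1/50 s -> roughly 1/390 s at the same aperture
```

Three stops of extra usable sensitivity dwarfs a 10-25% difference in sensor resolution once camera shake enters the picture.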
Gabriel M.A.
My Red Dot Glows For You
Colour doesn't exist. It is a human perception, an interpolation from electromagnetic waves.
Ooh, Philosophy! I posit that speech doesn't exist: it is a human construct and an interpolation of vibrations that travel through matter.
This conversation is not taking place
135format
Established
Ooh, Philosophy! I posit that speech doesn't exist: it is a human construct and an interpolation of vibrations that travel through matter.
This conversation is not taking place
Nice to know you are talking to your computer. Pity I can't hear it.
The point was that all sensors are monochrome, or rather they all just measure luminance. We have to use filters to create channels of narrower wavelength bands so that we can convert them back to wavelength bands on output. The process of doing that is destructive and loses detail.
Take the channelling out of the process and the resolving power increases, but we cannot convert back to the same wavelengths the sensor received.
Gabriel M.A.
My Red Dot Glows For You
I think tonality will be an interesting question. I read that Leica suggests adding a light yellow filter to mimic the spectral sensitivity of panchromatic films (which are all different anyway). This suggests the sensor's sensitivity increases towards the blue end of the spectrum, and that might account for some of the images seen so far.
Very interesting. But just one pseudo-question: are you suggesting (that they're suggesting) that the sensor has a higher sensitivity towards the blue end of the [visible] spectrum? A yellow filter absorbs some blue but lets most everything else in the visible spectrum pass. A light yellow filter has a filter factor of about 1.5 (roughly half a stop), and a medium yellow filter about 2 (one stop).
In any case, it is usually the least light-hungry of the colored filters, so I'm guessing they're trying to simplify the issue rather than get into technical details that most people these days don't care about or even know exist.
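For reference, filter factors are exposure multipliers, so they convert to stops logarithmically. A quick sketch with typical published factors (actual values vary by filter and emulsion):

```python
import math

# stops of compensation = log2(filter factor)
for name, factor in [("light yellow", 1.5), ("medium yellow", 2.0),
                     ("orange", 4.0), ("red", 8.0)]:
    print(f"{name}: factor {factor} -> {math.log2(factor):.1f} stops")
```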
One of my concerns is that by requiring the use of Silver Efex, the decisions on look are still left until after exposure, which may be important and either positive or negative depending on the photographer's outlook. It seems possible, though, that if you want more post-exposure control, you might be better off with an M9 or D800E and doing the conversion from the colour data.
This is the one thing that is actually confusing to me: does the sensor have a de facto "absolutely panchromatic" (doubtful) response which can reliably (within reason) be expected to respond to colored filters as film would in the "film world"? If not, one seems to be "locked in", as if the camera were perpetually loaded with Ilford Pan-F. Which makes the case for using a color sensor instead, if you're into that sort of thing. Otherwise, I doubt any of this would be more relevant than clicking "Desaturate" in Photoshop -- or simply changing from RGB or CMYK mode to Greyscale.
As far as the resolution "issue"...I think a lot is being lost in translation between the engineering department and the marketing department. Marketing is not usually thought of as the "source of facts".
Tim Gray
Well-known
Gabriel: It's not really philosophy. EM radiation of certain wavelengths exists, and certain pure wavelengths are usually equated with certain colors, as are combinations of wavelengths. Many of the products we use have 'colors' which do NOT correspond to a single pure wavelength like a laser's, but to a spectral distribution. Color IS perception, and we all perceive differently. A lot of this can be quantified, and color science is a well-developed and interesting field, but it is perception-based. Most of the science is built around the 'average' viewer.
It's fine for you and the rest of us to ignore that, but people who make paint and digital imaging sensors need to worry about it, because things like metamerism exist and are important. And when I buy a gray chair, it had better be the same gray under the lighting in my living room as it was in the showroom.
Ignoring that color is a perception and demanding that all colors are spectral colors is where those stupid 'purple is not a color' arguments come from.
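For anyone who wants to see metamerism in miniature, here is a toy numerical example: two physically different spectra that a three-channel observer cannot tell apart. The "response curves" are arbitrary numbers, not real CIE colour-matching functions or any sensor's data.

```python
import numpy as np

# Three made-up channel responses sampled at five wavelength bands.
M = np.array([
    [0.0, 0.0, 0.2, 0.8, 1.0],   # long-wavelength ("red") channel
    [0.1, 0.6, 1.0, 0.5, 0.1],   # mid-wavelength ("green") channel
    [1.0, 0.7, 0.2, 0.0, 0.0],   # short-wavelength ("blue") channel
])

s1 = np.array([0.5, 0.4, 0.6, 0.3, 0.2])   # one spectral power distribution

# Anything in the null space of M can be added without changing the 3-channel response.
null_vec = np.linalg.svd(M)[2][-1]          # 3 equations, 5 unknowns -> non-trivial null space
s2 = s1 + 0.1 * null_vec                    # a physically different spectrum...

print(np.allclose(M @ s1, M @ s2))          # True: the two spectra are metamers for this observer
print(np.allclose(s1, s2))                  # False: they are not the same light
```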
Tim Gray
Well-known
This is the one thing that is actually confusing to me: does the sensor have a de facto "absolutely panchromatic" (doubtful) response which can reliably (within reason) be expected to respond to colored filters as film would in the "film world"? If not, one seems to be "locked in", as if the camera were perpetually loaded with Ilford Pan-F.
I don't know what spectral sensitivity they'll end up with, but once it's decided upon, you are stuck with it. And it's going to be roughly panchromatic (because not doing that would be silly for this kind of camera). If you want something different, you'll have to use color filters. So yes, the camera will always be loaded with Ilford Pan-F, or in this case, Truesense/Leica M-M.
You can look at the test images and see they don't look like trad B&W shot through a red or blue filter, or even a green one. So you can tell it's reasonably panchromatic already.
However, most of the B&W films I use, which are all panchromatic, have different spectral sensitivities. Tri-X is different from Pan-F is different from T-Max (as you state). They all respond reliably to different color filters. Sure a yellow filter on Tri-X is slightly different from a yellow filter on T-Max, just like those two films are different without filters. But I still have a good idea how a yellow filter will affect each of the films, and in use, it all works out just fine. The M-M should work pretty much the same. I expect the only real difference it will have is possibly greater UV or IR sensitivity than your standard film (say, Tri-X or HP5+).
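To illustrate, here is roughly what a yellow filter does to a flat-ish "panchromatic" response. The curves are toy numbers, not Truesense/Leica data or any film's actual spectral sensitivity:

```python
import numpy as np

wavelengths_nm = np.arange(400, 701, 50)
sensor_qe     = np.array([0.45, 0.55, 0.60, 0.60, 0.55, 0.45, 0.35])  # flat-ish, "panchromatic"
yellow_filter = np.array([0.05, 0.30, 0.80, 0.95, 0.95, 0.95, 0.95])  # absorbs blue, passes the rest

effective = sensor_qe * yellow_filter                     # response of sensor + filter combined
loss_stops = -np.log2(effective.sum() / sensor_qe.sum())  # overall light loss in stops

print(np.round(effective, 2))   # blue end suppressed, mid and long wavelengths nearly untouched
print(round(loss_stops, 2))     # ~0.5 stop, in line with a light/medium yellow filter factor
```

The exact shape of the effective curve will differ between films and the M-M, but the direction of the change from any given filter is the same, which is why filter experience carries over.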
Gabriel M.A.
My Red Dot Glows For You
Nice to know you are talking to your computer. Pity I can't hear it.
Legally, speech is any form of expression, including language. At least in most Western models.
The point was that all sensors are monochrome, or rather they all just measure luminance. We have to use filters to create channels of narrower wavelength bands so that we can convert them back to wavelength bands on output. The process of doing that is destructive and loses detail.
Take the channelling out of the process and the resolving power increases, but we cannot convert back to the same wavelengths the sensor received.
Yes, color is expensive. It's a good thing that there are engineers and scientists who deal with this sort of thing. I'd hate to see marketing defining what "color" is; I'd rather have them find ways of translating and transmitting this information, not paraphrasing it.