Blog: The Great Megapixel Swindle

Coincidentally, I was waiting around in Sam's Club a couple of weeks ago while having a roll of BW400CN developed and came across the very same camera that the reviewer in this article is bashing. It was on their clearance shelf (last one, but not their display model) for less than half price. Completely on impulse, I purchased it and have found it is truly a poor excuse for a camera. It is so bad that I'm not sure it is fair to use it as a representative sample of current point-and-shoot cameras. It's not as bad as the first Casio digital camera I bought in 1999, but I'm sure it produces worse images than any digital camera I've owned since, including those in the sub-4MP range, with the possible exception of the "camera" in my cell phone. I could go on and on about its shortcomings, but suffice it to say that I will never use it for anything serious. On the positive side, it is really, really small.
 
If you look at the crop shown in the article you see that it is a background part filled with bokeh. It is not an F22 shot. The ship is in focus but the background tree is not. Mainly a BS argument: do the same shot with MF, and will the tree be clear, or a Matisse-like blur?
Also, now that 10-12MP cameras have matched 35mm quality, they are going for ISO to make digital match film at all levels. Simply because you consider film loving as your religion does not mean that digital has not improved greatly since its introduction. I think many people who continually slam everything digital have not really tried digital. (My opinion, maybe not fact in all cases.)

Steve


OMG! Did you call me a "film lover"?!?

Obviously it's easier for you to write your post without considering what I wrote. That would actually take some reading comprehension skills on your part, and wouldn't allow you to consider yourself superior to me simply because of how you take your pictures.

If you are trying to argue that I am not rational because I look at the various pros and cons and decided the cons far outweigh the pros, that's your option. But it just makes you look like a rabid defender of how you spend your money.

Nobody would ever argue that "110 film is better than medium format these days."

Strangely, many people make that claim about digital systems with a sensor the same size. And we all should know the benefits of sensor size or film size are evident to even the most casual observer. Issues like grain and color fringing generally reduce as the sensor size increases because you are enlarging it less for the same size print. Characteristics of the medium and aberrations of the lens are less apparent when you aren't enlarging as much.
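The enlargement argument is easy to put rough numbers on. Here's a quick back-of-envelope sketch in Python; the frame dimensions are nominal approximations and the format list is just illustrative:

```python
# Linear enlargement needed to make the same 4x6 inch print from
# several capture formats. Frame dimensions are nominal (mm).
formats_mm = {
    "110 film":          (13, 17),
    '1/2.5" sensor':     (4.3, 5.8),  # typical compact digital
    "35mm film":         (24, 36),
    "6x6 medium format": (56, 56),
}

PRINT_LONG_EDGE_MM = 152  # the 6-inch edge of a 4x6 print

for name, (short_edge, long_edge) in formats_mm.items():
    print(f"{name:18s} ~{PRINT_LONG_EDGE_MM / long_edge:4.1f}x enlargement")
```

A tiny sensor has to be blown up roughly six times as much as a 35mm frame for the same print, which is why grain, noise, and lens aberrations show so much sooner.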

You can't change physics. I don't care how much money you have invested in the idea that you can. It's just not rational to think you can.

A complaint many have against religions is they aren't "rational." So the blind adoration for all things digital has more in common with religion than a rational rejection of an inferior imaging system with inferior viewfinders, inferior lenses and inferior output.

How many times does someone post on a forum, "I bought a brand X DSLR kit and my pictures look like crap. Colors look dull and lifeless and it isn't very sharp"? And how often is the response, "Shoot in RAW and post-process each and every shot. You HAVE to do this or you are just wasting the money you spent on the kit"?

Yes, there are many things you can do in a darkroom to enhance the look of a film shot. But in general, if the picture was exposed and processed and printed by a fully automatic system, your film shots will look great at 4x6. It's a mature technology. Not so much with digital cameras.

Consumer digital cameras are not a mature technology. Of course there have been improvements. But then there was a huge amount of room for improvement even ten years ago. Today more and more people are finding that even with all the improvement, very little real progress is being made. And they are losing faith.

But yes, I'm a crazy old coot for suggesting that modern digital cameras suck compared to even modest consumer film kits. Let's ignore the fact that as a computer programmer and technician, I'm fully aware of the differences between digital and analog systems. And perhaps have the perspective to see that digital is not the best way to do everything. It has advantages, but every advantage comes with a drawback.

I'm not a member of your digital camera religion. So sue me. But you look like the worst kind of fool when you try to make me sound irrational because I don't share your blind faith. I see it the other way.
 

I quite carefully considered what you wrote. And I did not resort to personal insults while addressing your concerns.
I wonder how well you have weighed the pros and cons. If you consider an old film P&S superior to a modern DSLR, then I have concerns about your impartiality. While I have owned good film P&Ss, they do not match a top-of-the-line DSLR. Yes, I am saying top of the line, since we are comparing modern film to modern digital.
You are correct that the film / sensor size matters. I use a full-frame digital and medium format film. A digital P&S with a small sensor is now rivaling 35mm film. Soon they will be the same. You are right about grain also, it should not be in film or digital (also called noise). If I do not see it through the viewfinder then I should not see it in the image later.
Yes, many times people complain about the digital output. You do have to 'develop' the pictures, as some people here develop film. You don't have to do each one (why do the bad shots?), but yes, you have to invest some time. Or you could shoot JPEG and learn to get proper results, like going to the drugstore and getting 4x6s.
I think digital is more mature than you are giving it credit for. High ISO performance is a great sign of that maturity. Larger sensors in smaller cameras, better software and better white balance are also signs of maturity.

I do not consider myself superior to you because I use a mix of film and digital. I see the difference between the two in my own shots and in others' here. I do not have a blind adoration of all things digital. I do not always choose digital, and I do not make sweeping statements about viewfinders, lenses or output simply because they are film or digital.
Being a programmer and tech has nothing to do with the subject and is just a red herring in this discussion. I do not belong to a blind faith in digital as you seem to be about film. Maybe it is time for you to take another look at things?

Steve
 
I bought a Canon G10 14.6mp when they were introduced, on advice from a pro-photographer friend who had taken it to Thailand (he leads photo tours), and I looked at his images and went right out and spent the dough. What I failed to realize was that I was looking at his images on his laptop - gee, did they ever look good!

However, when I tried making 11 by 14 prints I became seriously unimpressed. I had erroneously assumed you could shoot at ISO 400. Not the case; ISO 80-100 seemed to be the limit. So I sold the camera and never looked back. My first DSLR, the Pentax istDS, had something like 6.1 megapixels and produced much better prints at the above size.
 
swj617 -
This is what you wrote: "Simply because you consider film loving as your religion"

Don't ask me to believe you weren't dismissing my opinions as the product of some irrational belief system. You were very clear on stating your assumptions.

You are right about grain also, it should not be in film or digital (also called noise). If I do not see it through the viewfinder then I should not see it in the image later.

This is quite a strange philosophy. If I am shooting black and white, I do not expect the finder to show me the world in black and white.

I use rangefinders. I do not expect my finder to show me what the film will capture. I use the finder for focus and composition, not as a preview of what is being captured.

I shoot film. Grain in film is not noise, it is the very structure that forms the image. It is analogous to pixels. Of course you are going to see it to varying degrees.

Digital noise, however, is an artifact of the technology. It isn't a result of capturing an image, it is the result of capturing an image using an imperfect electronic system. It's always present but at lower ISO settings the signal generated by light hitting a sensor is far higher than the level of noise.

There is a fundamental difference between high ISO film and a high ISO setting on a camera. The film is actually more sensitive to very small differences in levels of light, while the sensor's sensitivity to very small differences in levels of light does not change. It needs to amplify the signal to a great degree to make those differences visible. And in low levels of light, the difference between light hitting the sensor and the base noise level is almost nil. That's why high ISO digital captures are full of noise. You'd have to either greatly decrease the base noise level or greatly increase the sensitivity of the sensor to very small differences in received light *without* increasing the base noise level to improve high ISO performance.

When I look at sample high ISO shots from the new crop of digital cameras, I'm seeing an increase in the amplification without a corresponding decrease in base noise levels or any change in the sensor's ability to distinguish between very low levels of light. Yes, I can now set the ISO to an insanely high number, but all I've done is turn the amplifier up higher, greatly increasing the amount of noise I see in the image. Yeah, there's an image, but it has more visible noise than at a lower ISO setting. You didn't have that ISO option before because the amount of noise generated was beyond tolerance, making that setting a non-feature.
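For what it's worth, the amplification point can be sketched with a toy noise model in Python. All the numbers here are illustrative, not measurements from any real sensor: raising ISO means metering for less light, the read-noise floor stays put, and the amplifier gain cancels out of the signal-to-noise ratio entirely.

```python
import math

READ_NOISE_E = 5.0       # read-noise floor in electrons (illustrative)
BASE_SIGNAL_E = 20000.0  # electrons collected at base ISO (illustrative)

def snr_at_iso(iso, base_iso=100):
    """SNR of a metered exposure at a given ISO setting.

    Higher ISO -> shorter exposure -> fewer photoelectrons.
    Shot noise is sqrt(signal); read noise is fixed. The amplifier
    scales signal and noise alike, so gain drops out of the ratio.
    """
    signal = BASE_SIGNAL_E * base_iso / iso
    noise = math.sqrt(signal + READ_NOISE_E ** 2)
    return signal / noise

for iso in (100, 400, 1600, 6400):
    print(f"ISO {iso:5d}: SNR ~ {snr_at_iso(iso):6.1f}")
```

Under this model, turning the ISO dial only redistributes a shrinking signal over a fixed noise floor, which is the point being argued above.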

***

You say "A digital P&S with a small sensor is now rivaling 35mm film." No it's not. It "rivals" 110 film.

You can put a billion pixels in a square centimeter but it won't increase the resolution of the lens. People don't determine the resolution of a film system by counting the grains of silver, why would they measure the resolution of a digital system by counting sensor pixels?
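A rough diffraction calculation makes this concrete. This Python sketch assumes an ideal aberration-free lens (real lenses resolve even less), and the pixel pitch and aperture are just representative numbers:

```python
# Rough comparison: diffraction-limited lens resolution vs. the
# sensor's Nyquist limit. Idealized: a perfect lens cuts off at
# about 1/(wavelength * f-number) cycles per mm; real lenses with
# aberrations resolve less than this.
WAVELENGTH_MM = 0.00055  # ~550 nm green light

def lens_cutoff_lp_mm(f_number):
    return 1.0 / (WAVELENGTH_MM * f_number)

def sensor_nyquist_lp_mm(pixel_pitch_um):
    pitch_mm = pixel_pitch_um / 1000.0
    return 1.0 / (2.0 * pitch_mm)

# A compact camera with ~1.7 micron pixels, stopped down to f/8:
print(f"sensor Nyquist limit: {sensor_nyquist_lp_mm(1.7):.0f} lp/mm")
print(f"ideal lens at f/8:    {lens_cutoff_lp_mm(8):.0f} lp/mm")
# The lens runs out of resolution before the pixels do, so adding
# more (smaller) pixels adds file size, not detail.
```

Once the sensor's Nyquist limit exceeds what the lens can deliver, extra megapixels are oversampling blur.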

There is a difference between the number of sensor sites and resolution of the system.

And you *cannot* gauge the resolution of 35mm film by scanning a negative. If you did that, you'd only be comparing the resolution of your scanner to that of your digital camera.

If you really wanted to compare actual system resolution in real world application, you need to make a high quality print of the same size from each system, digital print and film wet print, and compare the prints. You could scan both prints on the same scanner at the same settings for internet comparison's sake, if you wished. But you can't take a first-generation digital image and a second-generation scan of a film negative and use those for valid comparison.

My experience has a lot to do with this subject. I am informed on technical topics. I have worked with digital images and am very familiar with the difference between a diagonal line drawn in pencil and a line that looks diagonal drawn on a screen. This has been a huge topic for decades as computer scientists and professionals strive to improve the mimicking of reality in digital form. And the differences between reality and digital mimicry are too obvious for anyone with knowledge of the systems to pretend there are none. There certainly is no motivation to do so.

I never said there have been no significant advances in digital imaging. But if you actually understood how digital imaging systems actually work, you wouldn't try to suggest they bypass physics.

The idea that my knowledge and experience is irrelevant is interesting. Why don't you just say you believe what you believe and facts are irrelevant?
 
-40oz

I just wanna say that I completely agree with your argument. I also work in the technology field, and it's amazing how few people understand that even technology has limits. I believe that APS-C-sized digital CMOS sensors have probably been close to their theoretical capabilities for quite some time, and now some of the newer DSLRs are implementing on-sensor noise reduction because they just can't improve the raw sensor capabilities any more.
 
If you look at the crop shown in the article you see that it is a background part filled with bokeh. It is not an F22 shot. The ship is in focus but the background tree is not.

Good point, sjw617. I'm surprised it took 25 posts to come to it. In defense of the others, I didn't catch it either. Not that the argument is meritless, by a long shot, but still... it's not fair to blow up an out-of-focus portion of a narrow depth-of-field photo and call it crummy.
 
40oz,
Yes, I said that because your arguments lead me to believe that you do not consider both sides of an issue. Your mind seems set on film always being best, and I will never change it.
While shooting B&W, do you see grain in the subject you are shooting? If not, then I think you should not see it in the resulting image. Grain is not the result of capturing an image but of processing (developing) it. Of course you will not see B&W through the finder, but you do not see grain either. I called digital 'grain' noise, but I guess film grain could be called noise also, since it is distortion.
The newer sensors have increased sensitivity and have made high ISO much better. Higher ISO on film increases grain just as it increases noise on a sensor - basically the same issue.
I think you should take another look at modern sensors. Rivaling 110 film? Blew past that at about 5 or 6MP. And I know the amount of MP alone does not mean much by itself. It's the quality of the pixels.

It is interesting that you bring up scanners. If you scan, you are converting to digital (I know it's obvious), and that will 'limit' your quality to the resolution of the scanner (I do not know what that is). Really a subject for a good thread. I would be interested to see what people who scan have to say.

I still think being a programmer is not relevant to this discussion, and I am not ignoring how a sensor works or claiming that sensors can bypass the laws of physics. I do not believe that facts can be ignored, but I think you should revisit your research on the subject. Even if you did your research only a couple of years ago, things have changed greatly since then.

Steve
 
Did you even read my post before replying?

You are right. Physics is trumped by wishful thinking. You don't need to check my facts, just substitute your opinions for reality whenever there is a conflict and you'll go far.
 
Strangely, many people make that claim about digital systems with a sensor the same size. And we all should know the benefits of sensor size or film size are evident to even the most casual observer. Issues like grain and color fringing generally reduce as the sensor size increases because you are enlarging it less for the same size print. Characteristics of the medium and aberrations of the lens are less apparent when you aren't enlarging as much.

You can't change physics. I don't care how much money you have invested in the idea that you can. It's just not rational to think you can.

I never said there have been no significant advances in digital imaging. But if you actually understood how digital imaging systems actually work, you wouldn't try to suggest they bypass physics.

Maybe I'm a bit thick, but your physics references are lost on me. What are you talking about? Sensor vs. pixel size? Luminosity? Both? Neither? Something else entirely?
 
I don't know: 6MP DSLR shots look like crap when printed large, and I see it.
 
Rivaling 110 film? Blew past that at about 5 or 6MP.

A. What it seems to me you might be overlooking is the importance of the actual physical size of the film plane. The physical size of the plane the lens focuses the image on has a great deal to do with the quality of the image rendered. Most objective folks will tell you that a properly exposed image from an average MF camera will look a lot better than one from the highest-end 35mm film camera, because "small format" has a substantially smaller "rendering area" - by far - than a medium format or large format frame. Imagine a digital camera with a sensor that's 2-1/4 inches by 2-1/4 inches. Small format provides a "good enough" image and is far more portable than larger format cameras; the size of the film plane is what drives the size of the rest of the camera. 35mm concedes a fair amount of image quality relative to medium format for the sake of portability.

Additionally, the larger the film plane, the more stops you can play with regarding selective focus, and the smoother the transition between in-focus and out-of-focus areas, so medium format is often distinguished by a beautiful dimensional illusion along the Z axis. Digital P&S cameras are not capable of selective focus due to the tiny size of their sensors. The most incredible images of all are large format prints, but a large format camera is obviously impractical to walk around with. This is why "full frame" is the digital holy grail, and I might see your point more if your argument was that "full frame" digitals are the equal of 35mm. An APS-C sensor is quite a bit smaller than a 35mm frame, and the difference in quality is noticeable, no matter how many megapixels the sensor contains or how good they are. It has more to do with the physical size of the area the image is rendered on.
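The selective-focus point can be put in rough numbers with crop factors. This Python sketch uses nominal frame dimensions, and the depth-of-field scaling is only the usual first-order approximation:

```python
import math

# Nominal frame/sensor sizes in mm. The ratio of diagonals to the
# 35mm frame is the "crop factor"; to a first approximation, at
# matched framing and f-stop, depth of field grows (and background
# blur shrinks) in proportion to it.
FRAME_35MM = (24, 36)

def crop_factor(width_mm, height_mm):
    return math.hypot(*FRAME_35MM) / math.hypot(width_mm, height_mm)

for name, dims in {
    "6x6 medium format": (56, 56),
    "35mm":              (24, 36),
    "APS-C":             (15.6, 23.7),
    '1/2.5" compact':    (4.3, 5.8),
}.items():
    cf = crop_factor(*dims)
    print(f"{name:18s} crop factor ~{cf:4.2f} "
          f"(its f/2.8 renders DoF roughly like f/{2.8 * cf:.1f} on 35mm)")
```

A compact's f/2.8 behaving like roughly f/17 on 35mm is why tiny-sensor cameras can't do selective focus, and why medium format's sub-1 crop factor gives that dimensional look.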

B. Your position also assumes that "all noise is created equal". While this is certainly subjective, most would say that film grain is not nearly as obtrusive as digital noise. You may not agree with that, and that's valid, but yours would be a minority opinion.
 
... I will further add that, millimeter for millimeter, sensors might have evolved to the point where they have equal rendering capability to analog film (though they still blow out highlight detail... won't get into that). However, they simply cost too much to produce at the size of a 35mm frame for the consumer market. Thus, to keep the cost of a digital camera in line with what the average consumer can afford, smaller sensors are used, and the concession is image quality - and that includes all consumer DSLRs that are not full frame. There is a reason medium format sensors exist, and that is image quality; and there is a reason cameras with medium format sensors, like the Hassy H3D-39II, will set you back $22,000. http://www.amazon.com/Hasselblad-H3D-39II-Medium-Digital-Display/dp/B000Z0JZ74

The dirty little secret, to me, is that digital camera makers created a "diversion" and successfully swindled consumers into thinking that more megapixels = better image quality. In reality, it is more square inches of sensor = better image quality, just like in the film world. This diversion was created to hide the fact that digital cameras are really like 110 cameras in terms of image quality, because sensors larger than that are too expensive. And "110 quality" is the best they will ever be able to do, because the physical dimensions of the sensor are what they are...
 
Did you even read my post before replying?

You are right. Physics is trumped by wishful thinking. You don't need to check my facts, just substitute your opinions for reality whenever there is a conflict and you'll go far.
40oz,
Yes, I have read your reply and answered each point you made. I have had your posts open in another window as I answered you.
I am tired of twice being accused of not reading your posts.
I am tired of being called a fool.
I am tired of being called ignorant.
I am tired of being accused of trying to change physics.
I am tired of being attacked and am not going to put up with your cr@p any longer.
 
Ignoring the Olympus used in the article: I had a D1 and a 14n, and while megapixels aren't everything they're cracked up to be, there IS a gain. Files from 2000-era cameras carry significantly less detail than files from the average 2010 camera.


No no no no no. You have it all wrong: nuance and logic have no place in the Great Intertoobes' Digital "Debate".

You gotta start calling out names, yell, and if at all possible, sneak the word "film" or "soul" in there. 😉
 
Being a programmer and tech has nothing to do with the subject and is just a red herring in this discussion. I do not belong to a blind faith in digital as you seem to be about film. Maybe it is time for you to take another look at things?
Steve

This is a great point. In high-end light microscopy (where I work), solid state sensors and photomultiplier tubes totally supplanted silver several years ago, based not on convenience but on technical merits. As a profession we all realized quite a while ago that CCD sensors allow us to detect and measure a wide variety of phenomena far, far better than film ever will. And the astronomers realized it before the microscopists did.

The same is now happening (some would say, has already happened) in high-end electron microscopy, where resolution, rather than pure quantum efficiency and SNR, is the dominant criterion.

If you want to argue on technical merits, film is either gone, or will be soon.

I use film for qualitative (aesthetic) and process reasons. I like film cameras. I like developing film. I like film tonality and (within reason) grain. These motives are defensible; but arguing that film is "better" than digital sensors is a fool's errand.
 
Um, they did.

http://www.dpreview.com/news/0209/02092304kodakdcs14n.asp

Photokina 2002, they introduced the Kodak DCS Pro 14n, based on a Nikon body. Full-frame, 14 MP, no blur filter. The price wasn't even that bad (for the time frame).

In 2004, they upped the ante:

http://en.wikipedia.org/wiki/Kodak_DCS_Pro_SLR/n

I'd say Kodak was in there swinging. The market wasn't ready.

Sorry for barging in here late, Bill. What I want to know is: in light of all this, why, exactly, has Rochester been repeatedly slagged for "missing" the digital boat altogether? I agree they've made some boneheaded moves in specific ways, but knuckle-draggers they weren't, and I say this as someone who prefers film for his own output.

And, speaking of the film-digital stuff: my mainline digital rig is an aging Olympus C-8080, whose "mere" 8mp continue to please my needs where digital output is concerned. I also appreciate the alacrity of my pocketable Casio EX-850, even though it's limited to JPEG output (also 8mp). Perhaps there's a metric in all this that's being missed here?


- Barrett
 
When I look at sample high ISO shots from the new crop of digital cameras, I'm seeing an increase in the amplification without a corresponding decrease in base noise levels or any change in the sensor's ability to distinguish between very low levels of light.

Incorrect.

Base noise levels have been going down steadily as read circuitry has improved. Of course, read circuitry cannot eliminate shot noise. However, quantum efficiency has also been steadily increasing, with 2010 seeing the first wave of consumer devices with back-illuminated sensors. Improved quantum efficiency means more photoelectrons, and hence lower shot noise.
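The quantum-efficiency point follows directly from Poisson statistics: shot-noise SNR grows as the square root of the detected electron count. A minimal Python sketch, with photon count and QE values that are purely illustrative:

```python
import math

def shot_noise_snr(incident_photons, quantum_efficiency):
    """SNR from photon shot noise alone: sqrt of detected electrons."""
    electrons = incident_photons * quantum_efficiency
    return math.sqrt(electrons)

PHOTONS = 1000  # per pixel per exposure, purely illustrative

# e.g. older vs. back-illuminated designs (QE values illustrative):
for qe in (0.25, 0.50):
    print(f"QE {qe:.0%}: shot-noise SNR ~ {shot_noise_snr(PHOTONS, qe):.1f}")
```

Doubling QE buys about a 1.4x improvement in shot-noise SNR for the same light, independent of any read-circuitry gains.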

SNR is steadily improving, for both CCD and CMOS devices. Many manufacturers use this improvement to increase pixel count, which is in a lot of cases a questionable, marketing-driven decision. However, it's not even clear that more, higher noise pixels are always worse. See here for a serious real-world discussion of this point.
 