Roger Hicks
Veteran
This has interested me for a long time, in terms of pure information held in an image. This is the only useful definition of 'image quality' because it is widely agreed that the 'look' of film and digital -- and of different films, for that matter -- can be very different.
Before you can make any comparison, you have to assume that the camera is firmly bolted down to a substantial tripod and that the lens is perfectly focused. Otherwise, all comparisons are meaningless.
Second, you have to assume the sharpest possible lenses, and slow, sharp film. Let's assume the best Leica lenses and Kodak Ektar 100.
Third, you have to consider your subject matter. The best definition I ever heard for a truly demanding subject was 'a portrait of Art Garfunkel, with every hair sharp and no jaggies' (I understand he has less hair now but the argument holds good).
Fourth, you have to allow that film is a random array, while digital is regular.
Fifth, not all information is meaningful information -- except that there's scope for a BIG argument here about whether grain is 'meaningful' or not. Some say it isn't. I'd say it is, because it's part of what contributes to the 'look' of film. Thus, a 5400 dpi scan of even a slightly soft 35mm shot contains more information in the form of grain than an 1800 dpi scan.
With all this in mind, the broadest consensus I have found among all the industry experts to whom I have spoken, which also mirrors my own experience, is that 35mm has the potential to equate to at least 18 megapixels, and conceivably even twice that with the right subject. Even so, for most purposes with the camera hand held, 12 megapixels is about equivalent to 35mm, and for many purposes 6 megapixels is astonishingly good (as borne out by putting top-quality Zeiss lenses on my Nikon D70 as compared with the kit zoom).
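For anyone who wants to sanity-check the scan figures above, here is the raw pixel arithmetic as a quick Python sketch. This is pure geometry (frame size times scanner resolution), not a claim about how much of that information is real detail rather than grain; the frame dimensions are the standard 24 x 36 mm:

```python
# Pixel count of a full-frame 35mm scan at a given scanner resolution.
# Note this is an upper bound on information, not resolved detail.
FRAME_MM = (24, 36)   # standard 35mm frame, in millimetres
MM_PER_INCH = 25.4

def scan_megapixels(dpi):
    """Megapixels produced by scanning a 24x36mm frame at `dpi`."""
    width_px = FRAME_MM[0] / MM_PER_INCH * dpi
    height_px = FRAME_MM[1] / MM_PER_INCH * dpi
    return width_px * height_px / 1e6

for dpi in (1800, 5400):
    print(f"{dpi} dpi scan of 35mm: {scan_megapixels(dpi):.1f} MP")
```

An 1800 dpi scan comes out around 4.3 MP and a 5400 dpi scan around 39 MP, which shows why the high-resolution scan picks up the grain structure: the file is far bigger than the 18-or-so megapixels of genuine subject information, and the surplus is mostly grain.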
Cheers,
R.