Film: The Real Raw

As an argument for shooting JPG instead of RAW, not very good. As an argument for shooting film, A-OK!
 
I agree that film holds more data than digital RAW files.

I disagree that once stored on film, a scanned-and-processed image can be brought up to whatever the current standard is and still surpass the current (digital) standard.

The limitations on film are not the same as the limitations on RAW digital files, but they exist. The proof of that is the very example he used, the Wizard of Oz movie. Original prints contain all the data that was captured, true. And that's the rub - all the data that COULD be captured using the technology of the time. That means the lens, the film media, and the duping capability that existed then. You can't go back and make that old lens better, you can't make the old film render colors more accurately. You can digitally process it to approximate those things - oops, that's what he hates about RAW already, so I guess not.

Film is great. Digital is great. Film will one day go away; the war is over and film lost. In the meantime, use what you like and enjoy it. I use both, I enjoy both, and really that is what matters to me.

Mr. Rockwell's opinion notwithstanding.
 
Film is great. Digital is great. Film will one day go away, the war is over and film lost. In the meantime, use what you like and enjoy it. I use both, and I enjoy both and really that is what matters to me.

Mr. Rockwell's opinion notwithstanding.

Before we go overboard I just want to add a disclaimer. I didn't post this with the intention of bringing up a debate that's been performed here ad nauseam. Just thought it was an interesting read. I've never seen a specific RAW and film comparison before. It's usually just a generic digital vs. film.
 
Before we go overboard I just want to add a disclaimer. I didn't post this with the intention of bringing up a debate that's been performed here ad nauseam. Just thought it was an interesting read. I've never seen a specific RAW and film comparison before. It's usually just a generic digital vs. film.

Fair enough. I have always felt that a properly-scanned frame of 35mm film holds more data than some of the best dSLR cameras out there, even now.

However...

1) It's getting close. I think the very best FF dSLR cameras can probably meet or beat a good 4400 dpi scanned 35mm neg now. If not, then soon.

2) The more I scan, the more I see the limitations in my negs and slides, due to the quality of the film itself, the quality (or lack of it) of my processing, the lens I used, and so on. I doubt that doubling the dpi of my scanner would pull more USEFUL data out of my 35mm films at this point. More just becomes more, with no real use for it.
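For a rough sense of the numbers in point 1, here's a back-of-the-envelope calculation. It assumes a full 36 x 24 mm frame and that every scanned pixel carries real detail, which in practice it doesn't (grain and lens limits push the usable figure lower):

```python
# Back-of-the-envelope: nominal pixel count of a 35mm frame scanned at 4400 dpi.
# Assumes the full 36 x 24 mm frame and that scanner dpi translates directly
# into usable pixels -- an optimistic upper bound, not a measured result.

MM_PER_INCH = 25.4

def scan_megapixels(width_mm, height_mm, dpi):
    """Nominal megapixels from scanning a frame of the given size at dpi."""
    w_px = width_mm / MM_PER_INCH * dpi
    h_px = height_mm / MM_PER_INCH * dpi
    return w_px * h_px / 1e6

print(f"{scan_megapixels(36, 24, 4400):.1f} MP")  # prints "25.9 MP"
```

So a 4400 dpi scan tops out around 26 nominal megapixels, which is indeed the territory the best FF dSLRs are reaching.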

Rockwell's point is well-taken at the surface level - film is RAW-plus. Yes. But film can only lock away the data it is given, not unlike RAW files. Both are limited by the lens, the processing (chemical or digital), and the quality of the recording media itself (film base or sensor). Film better? For the nonce. Forever? No. Old films perpetually better than the latest RAW? No.

And his barbs tossed at how RAW files are assembled are just specious nonsense. It hardly matters whether bread is hand-tossed or spun by a machine if you're only interested in how it tastes. In other words, if the end product is what matters, I don't particularly care that a computer had aught to do with it, now, do I? If he's saying that the mere involvement of a computer makes it bad by nature, then he should have written that in a book, because he just used the devil to publicize his point.
 
Technically your argument proves film > digital. Everything in your statement is true for both formats, and eliminating all the common factors from the equation you still end up with film being able to be scanned with current technology while digital images are stuck where you captured them.

No, not true. First, film degrades. Even the most stable. Second, it can only record as well as it can record - so old films were not up to the standards of new films - even new scanners can't put in quality that was never there. Third, the limitations of lenses and other incidentals used to record the images have a negative impact on the quality of the image laid down on the films of yesteryear.

And finally, even if film > digital today, it won't be true forever. Digital continues to improve; film does too, but less so. The slopes of those curves crossed some time ago.

I don't really like Rockwell, and I think this article is along the lines of "write enough rubbish and eventually you'll get something right"...so it pains me deeply to agree with him, but I do.

I do like him just fine. But in this case, he's full of it.
 
(cropped for clarity)...

Fair enough. I have always felt that a properly-scanned frame of 35mm film holds more data than some of the best dSLR cameras out there, even now.

Agreed on all counts. Even today's APS-C sensors are superior to 35mm film in many ways, especially for enlargements.
But back to the original point I think this particular piece hit home for me when he brought up the Wizard of OZ point.
I just got a Sony PS3 for Christmas to play Blu-ray movies on (and play Guitar Hero! 😀), and I can definitely see a quality improvement on SOME discs that were re-scanned for high def. The funny thing was, I was a hold-out for a long time while many of my fellow movie buffs were already on the bandwagon. Where they were seeing a lot of detail, I was seeing just a lot of pixel noise. But the great equalizer is the TV, and my 1080i CRT has a much nicer picture than a mid-range LCD.
But now I'm seeing a real quality difference of HD over SD from the same source. I also got a remastered DVD of It's A Wonderful Life, and the quality is totally stunning - even though it's only SD. That's partly because I've become so used to the crappy print that's been shown on TV for years; now I'm seeing and hearing things that I didn't know were there before.
Also, High Def Digest, a prominent HD review site, says that the new Blu-ray edition of Casablanca is absolutely one of the highest-quality Blu-ray movies you can get today. So I think there may be something to Ken's point, within reasonable limitations.
 
(cropped for clarity)...

Film is great. Digital is great. Film will one day go away, the war is over and film lost. In the meantime, use what you like and enjoy it.

Well, I disagree. There will always be film around. And no, I am not going to get into a long diatribe pissing match like some other threads. Just accept that this is my opinion. 😉

Now, like you said, Bill, go out and shoot!
 
So I think that there may be something to Ken's point, within reasonable limitations.

Yes, I agree. The problem is that he clearly extrapolates that out forever, and it doesn't go quite that far. The next iteration of whatever's next in the way of DVD-type movies may show even more detail from "The Wizard of Oz," or it may not - there may not be more to be extracted.

And it must also be mentioned: we do not know what level of computerized wizardry was used to 'enhance' the original print of the movie when it was 'remastered'. Extrapolation is still just another word for 'a very good guess' used to replace data that was probably there originally but just didn't get recorded.
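To make that "very good guess" concrete: upscaling or remastering invents sample values between the ones that were actually recorded. This is a deliberately toy 1-D linear interpolation, nothing like a real remastering pipeline, but the principle is the same:

```python
# Illustrative sketch: doubling the sample count of a 1-D signal by
# interpolating between recorded values. The in-between samples are
# guesses -- plausible, but not data that was ever captured.

def upscale_2x(samples):
    """Double the sample count by averaging each adjacent pair."""
    out = []
    for a, b in zip(samples, samples[1:]):
        out.append(a)
        out.append((a + b) / 2)  # the invented, in-between value
    out.append(samples[-1])
    return out

print(upscale_2x([0, 10, 10, 0]))  # [0, 5.0, 10, 10.0, 10, 5.0, 0]
```

Every second value in the output was manufactured, which is exactly why a remastered print can look sharper without containing any new information.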
 
I think the article is an interesting read. It takes a perspective I haven't really thought much about.

He really comes down hard on RAW files, though. I think there's a pretty demonstrable advantage to RAW, if only in exposure latitude and colour balance.
I really don't find any difficulty in using RAW files either. For me, dumping RAWs into Aperture is a lot easier than screwing around with my film scans in Photoshop.

Ultimately, though, I don't think it makes enough of a difference to worry about. Especially with stuff like the S2 around the corner, the limits are so sky-high that I can't possibly bring myself to care about the limitations of digital, nor do I care to get more detail out of my negatives 20 years from now.
 
Digital seems to excel at sharpness, while film gives you a longer tonal scale because you can play with the toe and shoulder regions of the H&D curve. This is true in both B&W and color. The linearity of digital capture precludes this.
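The toe/shoulder point can be sketched numerically. Here a tanh-based S-curve stands in for a film characteristic curve (it is an illustrative stand-in, not a measured H&D curve for any real emulsion) next to a linear sensor that clips hard at saturation:

```python
# Toy comparison: film-like S-curve (gentle toe, rolled-off shoulder)
# vs. a linear sensor response that clips abruptly at full well.
import math

def film_response(exposure):
    """Illustrative S-curve: shadows and highlights compress gradually."""
    return 0.5 * (1 + math.tanh(3 * (exposure - 0.5)))

def sensor_response(exposure):
    """Linear response that clips hard at saturation."""
    return min(exposure, 1.0)

for ev in (0.2, 0.5, 0.9, 1.2, 1.5):
    print(f"exposure {ev:.1f}: film {film_response(ev):.2f}, "
          f"sensor {sensor_response(ev):.2f}")
```

Past an exposure of 1.0 the sensor pins at 1.00 and all highlight detail is gone, while the film curve keeps creeping upward, which is the graceful highlight roll-off being described above.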

Too many of us spend way too much time chasing ultimate sharpness rather than trying to produce good, interesting, and informative photographs. Robert Frank shot his book The Americans with the lenses and films of sixty years ago. His B&W photos often show motion blur and some exhibit missed focus, but they're still powerful images. British photographer David Hamilton's photographs of young girls have a soft, dreamy look to them, and supposedly he even took sandpaper to the front element of some of his lenses. (Caution ~ some of Hamilton's photographs show nudity that some might find offensive.)

Neither Frank's nor Hamilton's photographs would be improved had they been shot digitally.
 
(cropped for clarity)...

Film is great. Digital is great. Film will one day go away, the war is over and film lost. In the meantime, use what you like and enjoy it. I use both, and I enjoy both and really that is what matters to me.

Mr. Rockwell's opinion notwithstanding.

I was nodding when reading this, up to the above paragraph.
There is still a lot to be done, but I think there is a chance for film photography to survive in a small but stable niche (compared to digital).
 
And that's the rub - all the data that COULD be captured using the technology of the time. That means the lens, the film media, and the duping capability that existed then. You can't go back and make that old lens better, you can't make the old film render colors more accurately.

I don't think comparing what one could do with a seventy-year-old movie shot on the earliest color film to what one can do with a modern digital sensor is a valid comparison. A look back at a digital file ten years old leaves a lot to be desired - that Mavica can't hold a candle to those seventy-year-old films and lenses.
 
I don't think comparing what one could do with a seventy-year-old movie shot on the earliest color film to what one can do with a modern digital sensor is a valid comparison. A look back at a digital file ten years old leaves a lot to be desired - that Mavica can't hold a candle to those seventy-year-old films and lenses.

I did not make the comparison - Ken Rockwell did. He said that those old films would ALWAYS be better than the latest RAW file, even in, what, 2099 or whatever. Words to that effect.

I refuted him because he made the claim, I didn't invent it to shoot it down.
 
I was nodding when reading this, up to the above paragraph.
There are still a lot to be done, but I think there is a chance for film photography to survive in a small but stable niche (compared to digital).

Fair enough, fair enough - let's just say we're not all in perfect harmony on this one... I didn't mean to crap on the thread.
 
Rockwell is, as usual, making grand statements that conflict with everything else he says. Cameras don't matter, yet he obsesses about them. Film is the greatest thing ever, yet if you look at his gallery, you find only large format film or digital. The Mamiya 7 is his favorite camera, but his gallery contains no shots taken with that camera. Or the 5D is his favorite, or maybe it's his Nikon D40, unless it's his D3. I don't know; it's impossible to keep track.
 
His site, and his essays especially, bespeak someone with maturity issues. You hit the nail right on the head, katgut. Film is superior, yet let's obsess over every little niggling detail of the latest DSLRs. Your talent and training as a photographer are far more important than which equipment you use, yet 'professionals' only use Macs. Indeed, just about any essay by him that includes an amateur-vs.-professional rant is bound to reveal how fragile his ego really is.

My biggest problem with one of his latest essays -- the shoot-out between the new Nikon DX3 and the Canon MarkOneBillion -- is how he refuses to compare RAW to RAW, because that would introduce all sorts of computer processing variables. Because, you know, a .jpg file is just like a film emulsion; of COURSE manufacturers don't implement custom .jpg-making algorithms in their cameras. The fact that he's too scared of RAW to work with it only hurts his arguments; maybe Canon just writes crappy .jpg files, and RAW shooters would never notice a quality difference? But he won't even allow for the possibility. And yet we are to take his essays as definitive. Sometimes I'm left wondering if he truly understands much of the technology he rants about; my spouse is a system architect for chips for DVRs, set-top boxes, etc., and says most of Rockwell's claims about digital image processing are specious at best.

I respect Rockwell's lens reviews, generally because they're usually backed up, in the end, by other sources and my own experience. I think the rest of his site is risible. I refused to read it on principle for a while, then I got over my grumpiness and learned how fun it is to watch Rockwell make an idiot of himself.
 