vicmortelmans
Filmscan 3600
I don't like machines that (pretend to) think for me. I've disabled the 'auto exposure' and 'auto gamma' on the scanner, but some things are still not clear.
The scanner settings let me choose the scanning pixel depth (8 or 16 bit) and the scanning mode (normal/quality). Strangely, neither pixel depth nor scanning mode gives me a different output image. I don't seem to get real grayscale output either: it always generates 24-bit RGB images with slight differences between the three channels.
What I would like is real 16-bit grayscale output, but is that even common in consumer-level film scanners?
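As a workaround for the forced 24-bit RGB output, averaging the three nearly identical channels and stretching the result into the 16-bit range gets you at least a proper single-channel file. This is a minimal sketch with NumPy, entirely my own addition (the scanner software does nothing like this, and the 257 scaling factor is just the exact 8-to-16-bit expansion, since 255 × 257 = 65535):

```python
import numpy as np

def rgb_to_gray16(rgb):
    """Average near-identical RGB channels, then expand the
    8-bit range (0-255) to the full 16-bit range (0-65535)."""
    gray8 = rgb.astype(np.float64).mean(axis=2)
    return np.round(gray8 * 257).astype(np.uint16)  # 255 * 257 = 65535

# Tiny synthetic "scan": 2x2 RGB pixels with slight channel differences,
# like the scanner produces.
rgb = np.array([[[100, 101, 99], [0, 0, 0]],
                [[255, 255, 255], [128, 127, 129]]], dtype=np.uint8)
gray = rgb_to_gray16(rgb)
```

Of course this only repackages the 8 bits of real information per pixel; it doesn't recover the extra precision a true 16-bit scan would have.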
The scanning mode is a complete mystery. The manual says: "quality" mode may take somewhat longer and will give better... quality (yeah, right).
Another setting you make is selecting the film vendor and type. I wonder what this affects. Is it only because (talking b&w) different vendors have different base density and color cast? Or is the actual image also processed differently?
Still, I'm not fully satisfied with my understanding of what happens. I'd like to know how negative densities relate to pixel values (the scanner says 'measuring minimum density' when starting a scan, so it definitely knows about them!).
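For what it's worth, a plausible rough model of that relationship can be written down from first principles: density D is the negative log of transmittance (T = 10^-D), the scanner presumably subtracts the film-base density it finds when 'measuring minimum density', and the result is gamma-encoded into an integer pixel value. The d_min and gamma values below are assumptions for illustration, not anything from the Filmscan 3600 documentation:

```python
import math

def density_to_pixel(d, d_min=0.3, gamma=2.2, bits=8):
    """Rough model: density -> transmittance T = 10**-D, normalized
    against an assumed film-base density d_min, then gamma-encoded
    into an integer pixel value.  d_min and gamma are guesses."""
    t = 10.0 ** -(d - d_min)       # transmittance relative to film base
    t = min(max(t, 0.0), 1.0)      # clip to the representable range
    full = (1 << bits) - 1         # 255 for 8 bit, 65535 for 16 bit
    return round(full * t ** (1.0 / gamma))
```

Under this model the film base maps to full white on the negative scan, and each extra unit of density costs a factor of 10 in transmittance before gamma encoding compresses it. Whether the scanner actually works this way is exactly what the manual refuses to say.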
When you buy a roll of film or a developing agent, it usually comes with a sheet of paper with some graphs on it, giving the 'film characteristics': min and max densities, the influence of temperature and development time on gamma, etc. No such thing when you buy a (far more expensive) scanner! The only thing you get is a manual that says little more than "trust me, I know what I'm doing and you wouldn't understand anyway".
Groeten,
Vic