can someone explain filters to me?

Of course, if you scan your photos it is possible to do ANY filter (except a polarizer) by extracting colour channels and mixing them, provided your slides have enough detail in the highlights and are properly exposed or even slightly underexposed.
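A minimal sketch of what that channel extraction and mixing might look like, assuming the scan is available as an RGB array. The pixel values and mixing weights here are invented purely for illustration, not taken from any real workflow:

```python
import numpy as np

# Hypothetical 2x2 RGB "scan": a reddish pixel, a bluish pixel,
# a neutral grey, and a yellowish pixel.
rgb = np.array([
    [[200, 40, 30], [30, 40, 200]],
    [[120, 120, 120], [240, 220, 60]],
], dtype=float)

def simulate_filter(rgb, weights):
    """Approximate a B&W contrast filter by a weighted channel mix.

    weights: (r, g, b) mixing coefficients, normalised to sum to 1.
    This only works for filters whose passband roughly matches a
    blend of the scanner's own channel responses.
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return rgb @ w  # contracts the last (colour) axis

# A rough "red filter" look: weight the red channel heavily,
# darkening blue areas while keeping red objects light.
mono_red = simulate_filter(rgb, (0.8, 0.15, 0.05))

# A rough "yellow filter": red plus green, suppressing blue.
mono_yellow = simulate_filter(rgb, (0.5, 0.45, 0.05))
```

Under the red mix, the reddish pixel renders much lighter than the bluish one, which is the usual reason for using a red filter in B&W work.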
 
jaapv said:
Of course, if you scan your photos it is possible to do ANY filter (except a polarizer) by extracting colour channels and mixing them, provided your slides have enough detail in the highlights and are properly exposed or even slightly underexposed.
I absolutely refuse to get into an argument about this, so I will state the fact but will not respond further.

You cannot, by that method, obtain the effect of a red filter whose cutoff wavelength is longer than that of the red channel. This means that you're stuck with old fashioned optical filters for deep reds. Whoever started that myth about doing any colored filter with the channels did not understand the mixing properties of colored light.

Think about it. Most infrared filters have a cutoff wavelength within the visible reds. If it were possible to do any colored filter with the channels, then it would also be possible to do an infrared filter, which would avoid a lot of expense and hassle for IR photographers. However, you can't do it for the reason stated in the above paragraph. It sure would be nice, but it doesn't work that way.

Richard
 
This is a great thread for reference, but it doesn't address a question of mine.

Are filters needed for low-light or indoor B&W shots? Part of the reason that I'm getting an RF camera is for its low-light capabilities. Obviously a filter will add contrast, but it will also take a stop or two of precious light...
 
I read somewhere that the function of filters in B&W photography is not only to increase contrast but also to correct for weird effects that UV light creates with the film - it can cause the sky to be blown out, among other issues.

I was thinking that you would go without the filter at night and indoors, but I just wanted to be sure...
 
SuitePhoto said:
I read somewhere that the function of filters in B&W photography is not only to increase contrast but also to correct for weird effects that UV light creates with the film - it can cause the sky to be blown out, among other issues.
Not much UV can get through modern lens coatings. Blown skies are usually caused by the film being too sensitive to blue light, hence the use of yellow, orange, or red filters to darken the sky by reducing the amount of blue light reaching the film.

Richard
 
Here's the secret to filters: you always keep one on the lens, so when you drop it, the filter takes the heat. A lot easier to replace a $40 filter than whatever you paid for the lens.
Paul
 
Crikey, where did you find this ancient history, Philipp? You could also mention the lens cement, which is the main absorber of UV, aptly named "Absorban" by Leica.

Rereading the thread I found the interesting post by Richard, which I should have answered.
Of course you cannot do an IR filter in Photoshop: most if not all films, except specialised ones, are not even sensitive to IR, and nearly all digital cameras are filtered to cut IR off. So the data are simply missing. I was talking about red, yellow, and green.
 
richard_l said:
I absolutely refuse to get into an argument about this, so I will state the fact but will not respond further.

You cannot, by that method, obtain the effect of a red filter whose cutoff wavelength is longer than that of the red channel. This means that you're stuck with old fashioned optical filters for deep reds. Whoever started that myth about doing any colored filter with the channels did not understand the mixing properties of colored light.

Think about it. Most infrared filters have a cutoff wavelength within the visible reds. If it were possible to do any colored filter with the channels, then it would also be possible to do an infrared filter, which would avoid a lot of expense and hassle for IR photographers. However, you can't do it for the reason stated in the above paragraph. It sure would be nice, but it doesn't work that way.

Richard

Well, I am not looking for a fight, but I need to comment. No, you cannot extract an infrared photograph from the color channels, for a good reason - but not because the spectral response of a CCD is unable to detect infrared; it can. The problem is that the signal from infrared is too weak and overpowered by the visible red, and you cannot separate the infrared from the red as they occupy the same channel. This is also true for infrared film that is shot unfiltered - it looks very normal because the film is more sensitive to other wavelengths; the filter is needed to suppress those wavelengths.

Filters can give more control over which wavelengths are recorded, but since a CCD has a tri-color filter, you certainly can imitate color filters in monochrome images by manipulating the color channels. Which is better is up to the person.

This is maybe what you meant, but I did not understand what your post was about.
 
jaapv said:
Of course, if you scan your photo's it is possible to do ANY filter -except polarizer- by extracting colour channels and mixing them -provided your slides have enough detail in the highlights and are properly exposed or even slightly underexposed.
I'm happy to argue this one....

That's not correct.

Imagine two objects - one reflecting equal amounts of (just) red and green light, and the other reflecting only a very narrow spectrum halfway between the red and green parts of the spectrum - yellow.

Your eye will see both objects as yellow, since the light reflected by both stimulates both the red and green cones in your eye. (The first object because the red light stimulates the red cones and the green light the green cones; the second object because the narrow-wavelength yellow light excites both red and green cones equally.)

For identical reasons, a digital camera sensor will pick up the light from both in both the red and green channels. Hence both are 'yellow'.

Now place a narrow bandwidth yellow filter over the lens (or in front of your eye.) This will filter out (for the sake of argument) all the red and green light from the first object, but none of the yellow light from the second. The first object will go black, to your eyes (and to the camera) while the second will stay light (can't really say "white" because it's behind a yellow filter, so everything's yellow.)

There's no amount of postprocessing that will match the action of the filter in changing the relative brightnesses of the two objects.

And that's because recording the scene three times simultaneously from behind a red, green and blue filter (like your eye, or a camera) is emphatically not the same as recording the full spectrum of each part of the image.

So much for the theory. You photographic experts will have to come up with real-life situations in which this might be an issue.
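The thought experiment above can be run numerically. Everything in this sketch is hypothetical - the triangular channel sensitivities, the two object spectra, and the narrowband yellow filter are all invented for illustration:

```python
import numpy as np

# Wavelength grid in nm, 1 nm steps.
wl = np.arange(500, 701, 1.0)

def triangle(center, half_width):
    """Crude triangular spectral curve, peak 1 at `center`."""
    return np.maximum(1 - np.abs(wl - center) / half_width, 0.0)

# Invented broad sensor channel sensitivities.
green = triangle(540, 50)
red = triangle(620, 50)

# Object A reflects two narrow bands: one green, one red.
obj_a = triangle(540, 5) + triangle(620, 5)
# Object B reflects a single narrow band halfway between: spectral yellow.
obj_b = triangle(580, 5)

def record(spectrum, filt=None):
    """(red, green) channel responses; crude sum on the 1 nm grid."""
    s = spectrum if filt is None else spectrum * filt
    return float(np.sum(s * red)), float(np.sum(s * green))

# Unfiltered: both objects excite red and green, so both read as yellow.
a_plain, b_plain = record(obj_a), record(obj_b)

# Behind a narrowband yellow filter passing only around 580 nm:
yellow_filter = triangle(580, 10)
a_filt, b_filt = record(obj_a, yellow_filter), record(obj_b, yellow_filter)
# Object A goes black (its bands miss the filter's passband entirely),
# while object B still registers in both channels.
```

No mixing of `a_plain`'s two channel values can reproduce `a_filt`, because the information about where in the spectrum object A's reflectance sits was discarded at capture time.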
 
It sort-of does.

But it misses this point: two objects can have totally different spectral responses, even though they appear under a certain light to be the same colour. That being the case, a camera will record them as the same colour, and no amount of post-processing will separate them. However, change the light - or, equivalently, change the filter - and because of the different pigments or dyes in the objects, they will no longer appear to match. And suddenly the camera will record them differently too.

For exactly the same reason, it's possible to have two fabrics that match well in daylight, but put them under a fluorescent tube and they differ markedly in colour.

Finding a value for the red, the green and the blue in an image doesn't tell you anything about how that same scene appears under different filters.
 
I'm not arguing this one, as I am strongly in favor of using real filters anyway. Whatever the merits of the above arguments (including my own) I have come to the insight that one loses dynamic range in the colors electronically filtered in any case.
 
Finder said:
...The problem is that the signal from infrared is too weak and overpowered by the visible red, and you cannot separate the infrared from the red as they occupy the same channel...
Precisely.

Richard
 
ESG said:
It sort-of does.

But it misses this point: two objects can have totally different spectral responses, even though they appear under a certain light to be the same colour. That being the case, a camera will record them as the same colour, and no amount of post-processing will separate them. However, change the light - or, equivalently, change the filter - and because of the different pigments or dyes in the objects, they will no longer appear to match. And suddenly the camera will record them differently too.

For exactly the same reason, it's possible to have two fabrics that match well in daylight, but put them under a fluorescent tube and they differ markedly in colour.

Finding a value for the red, the green and the blue in an image doesn't tell you anything about how that same scene appears under different filters.

I don't understand. What has metamerism got to do with this problem? The spectral response of the film can also cause a match or mismatch, simply because it is not the same as the eye's.

Color contrast filters used in black and white photography don't isolate single wavelengths. The idea is to suppress fairly large areas of the spectrum of the image. If I use a filter that covers the same spectral response as one of the color channels, I can imitate it electronically. No big deal. For the most part, I can make changes to image contrast electronically or with filters. Whether they are "exactly" the same does not matter - you probably cannot tell looking at an image. If I get the result I want, why does the process matter?

Now, some filters cannot be reproduced - infrared, hydrogen-alpha, etc. But those are very specialized filters. As far as metamerism is concerned, that has nothing to do with this: you would first have to prove the film has the exact same response as the eye, and CCDs don't either, not to mention that the filters would make a noticeable difference, as they are most likely not narrowband and would encompass the spectral response of the object. But even then, metamerism is not an issue with monochromatic images. (BTW, film and the eye do not have the same spectral response.)

Naturally, the electronic way could be argued "better" as the entire color information is preserved and the image can be adjusted at home using all color channels. This is something you are not going to be able to do with B&W film. The choice is up to the photographer.

Now, if this is just a film vs. digital argument, it is weak. Actually, it can't be supported. However you want to control tonal contrast, either method is fine and one is not "superior" to the other. All this comes down to is personal preference.

- From a film photographer who uses filters and prints in a real darkroom.
 
ESG said:
I'm happy to argue this one....

That's not correct.

I don't know enough about the subject to convince you of this - I'll happily leave that to others, like Finder - but I can say that if I look at the red channel in B&W, it looks a b****y lot like a red filter in front of my lens. I'll use that in B&W conversion, whether it is theoretically correct or not 🙄
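For what it's worth, that red-channel observation is easy to demonstrate with a toy image. The pixel values below are invented for illustration:

```python
import numpy as np

# Hypothetical RGB image: top row is blue "sky", bottom row is red "brick".
img = np.array([
    [[60, 90, 200], [70, 100, 210]],
    [[180, 70, 50], [170, 60, 40]],
], dtype=float)

# Take just the red channel as the B&W conversion.
mono = img[..., 0]

# The blue sky comes out dark and the red brick light:
# the familiar look of shooting B&W through a red filter.
```

Within the limits discussed earlier in the thread (the passband can only be a blend of the sensor's own channel responses), this is a perfectly usable conversion.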
 
Finder said:
Color contrast filters used in black and white photography don't isolate single wavelengths. The idea is to suppress fairly large areas of the spectrum of the image. If I use a filter that covers the same spectral response as one of the color channels, I can imitate it electronically.
Oh, absolutely. But that's not quite the same as doing ANY filter. I've got a book of lighting filters in front of me. Each one has a printed spectral response curve, and they're all slightly different. Fiddling about with the levels of red, green and blue isn't going to match the same subtle differences between colours.
No big deal. For the most part, I can make changes to image contrast electronically or with filters. Whether they are "exactly" the same does not matter - you probably cannot tell looking at an image. If I get the result I want, why does the process matter?
Doesn't matter at all, of course.
finder said:
Now, if this is just a film vs. digital argument, it is weak.
LOL, No, definitely not one of those arguments!

I just wanted to point out to someone who asked about filters that adjusting three channels - red, green and blue - after taking an image - isn't equivalent to applying a filter to the light reaching the camera before recording that image. And the easiest thought experiment you can do to see why not is with a narrowband filter and metameric objects (thanks for the pointer - wasn't aware of that usage of the word.)

If you can get the results you want with post-processing (I can too) then all well and good. But there might be some things you still want to consider using a filter for. Am I wrong?
 