AA Filters Really Aren't Good.
jlw said:
This whole mystique of the absence of an anti-aliasing filter is something that leaves me a bit dubious
There is no reason to be dubious.
Information Theory (see "Probability Theory: The Logic of Science" by E. T. Jaynes and G. Larry Bretthorst) tells us that whenever we modify, alter, or filter data, the altered data cannot yield estimates for the parameters of interest that best represent the true, but unknown, parameter values. This means competent post-acquisition data manipulation is always better than competent pre-acquisition data manipulation. By the way, the parameters of interest here are the frequencies and amplitudes (photon counts) of the light falling on a specific point of the digital sensor.
Simply put, Information Theory says: whenever possible, avoid distorting the data before you collect it. This is not a subjective issue. Photographers know this intuitively; we purchase the best lenses we can afford to minimize distortion before the image is recorded. Likewise, our empirical experience tells us that raw image processing is superior to JPEG image processing.
Sometimes pre-acquisition data modification is unavoidable, but that doesn't make it any less destructive: the best physical anti-aliasing filter known to mankind still degrades the data.
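To see why an anti-aliasing filter exists at all, here is a minimal sketch (my own illustration, not from the original post, with made-up frequencies) of what happens without one: a spatial frequency above the Nyquist limit folds down and masquerades as a lower, false frequency that no amount of post-processing can distinguish from a real one.

```python
import numpy as np

# Sampling rate and a test frequency chosen ABOVE the Nyquist limit
# (fs / 2 = 50), purely for illustration.
fs = 100.0          # samples per unit length
f_true = 70.0       # true signal frequency, above Nyquist
n = np.arange(256)

# Sample the true high-frequency signal.
samples = np.sin(2 * np.pi * f_true * n / fs)

# The samples are indistinguishable from a signal at the folded
# "alias" frequency fs - f_true = 30:
f_alias = fs - f_true
aliased = np.sin(2 * np.pi * f_alias * n / fs)

# For integer n: sin(2*pi*70*n/100) = sin(2*pi*n - 2*pi*30*n/100)
#              = -sin(2*pi*30*n/100), so they match up to sign.
print(np.allclose(samples, -aliased))  # True
```

Once the sensor has recorded those samples, the information needed to tell the 70-cycle signal from the 30-cycle one is simply gone, which is why the filtering must happen before acquisition if it happens at all.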
[Buckets of stuff deleted]
jlw said:
In other words, Sean seems to feel that "real" sharpness (Leica) is much better than software-generated "fake" sharpness (Epson)... but that software-induced "fake" low noise (Leica) is just as good as "real" low noise (Epson)! Huh?
There is no contradiction.
Information Theory says real sharpness is always better than software generated sharpness.
The symmetry jlw supposes between physical solutions and software solutions does not exist.
Software cannot create noise. The noise simply is.
Software can (and does) create artifacts in the form of a residual signal.
In signal analysis, noise is the random error in a parameter estimate. Empirical noise is almost random; when we say noise, we mean those errors in the parameter estimate that are random. ISO noise is similar to the noise we hear from audio amplifiers: when the gain increases, the thermal noise increases.
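The amplifier analogy can be sketched numerically (my own toy numbers, not a sensor model): the gain multiplies the signal and the sensor's random noise alike, so pushing the "ISO" up raises the absolute noise level in direct proportion.

```python
import numpy as np

rng = np.random.default_rng(0)

# A constant "photon count" plus random sensor noise, illustrative values.
signal = np.full(10_000, 100.0)
read_noise = rng.normal(0.0, 2.0, 10_000)   # random noise, std = 2

low_gain = 1.0 * (signal + read_noise)      # base "ISO"
high_gain = 4.0 * (signal + read_noise)     # gain pushed up 4x

# The noise standard deviation scales exactly with the gain.
print(low_gain.std(), high_gain.std())
```

The point is that the randomness itself was already there at acquisition; the gain only rescales it, which is why software can neither create nor abolish it.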
Aliased signals are not noise. Alias signals can be huge even when the noise level is low. Aliased signals are artifacts, and alias artifacts are not random. Non-random phenomena can be modeled, and the aliasing of spatial frequencies is well understood. Post-acquisition, we may be able to model the aliasing artifacts down to the level of the noise and then subtract them from the image. But no model is perfect, and the post-acquisition calculations used to model the artifacts probably rely on approximations. So when a post-processing algorithm models the aliasing artifacts and subtracts them from the data, the result is the signal we want, plus a residual, plus the noise.

The "software-induced fake noise" is actually a software-generated residual. The noise is always just the noise. The post-processing residual is fake in the sense that it has nothing to do with the parameters of interest. The pre-processing residual is real in the sense that it is a measure of the anti-aliasing filter's efficiency. Yet there is no reason why the amplitude of the post-processing alias-model residual can't be similar to the amplitude of the residual from imperfect pre-acquisition filtering. And their similar amplitudes do not contradict Information Theory.
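The model-and-subtract decomposition above can be made concrete with a toy one-dimensional example (frequencies and amplitudes are invented for illustration): the artifact model is deliberately a little wrong, so subtracting it leaves exactly signal + residual + noise.

```python
import numpy as np

rng = np.random.default_rng(1)
n = np.arange(512)

wanted = np.sin(2 * np.pi * 0.05 * n)        # the signal we want
alias = 0.5 * np.sin(2 * np.pi * 0.21 * n)   # non-random alias artifact
noise = rng.normal(0.0, 0.02, n.size)        # the random noise

data = wanted + alias + noise                # what the sensor recorded

# An imperfect post-acquisition model of the artifact: the amplitude
# estimate is slightly off, standing in for a real model's approximations.
alias_model = 0.48 * np.sin(2 * np.pi * 0.21 * n)

corrected = data - alias_model

# What remains besides signal and noise is the residual the model missed.
residual = corrected - wanted - noise
print(residual.std())   # small but nonzero
```

The residual is small compared with the original artifact but never zero, which is the sense in which post-processing can approach, without contradicting, what a (likewise imperfect) physical filter does before acquisition.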
Earlier I stated the parameters of interest are the frequencies and amplitudes of the photons captured by the sensor. It turns out light also has a phase.
Will a high quality lens produce less phase distortion at the sensor than a low quality lens?
Does a physical anti-aliasing filter distort the phase?
Do Bayer-type digital sensors capture any information about the phase?
Do Foveon digital sensors capture any information about the phase?
Does a film emulsion capture any information about the phase?
In fact, is there any useful information in the phase?