A Question of Optics, Light, and the Inverse Square Law

HankOsaurus

Hello Forum.

I need your help in trying to understand something about light that puzzles me.

I have observed that the inverse square law makes sense and works in any case where a light source illuminates a subject plane. Whether that source is the sun, a lamp, a flash, or even a lens, I can see that increasing the distance from the light source to the recording surface reduces the strength of the light in accordance with the inverse square law.

I can see how a point source of light has diverging rays that are more widely spaced the farther they travel from the source. Again, I have no trouble seeing how this agrees with the inverse square law.
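To make that concrete, here is a tiny Python check (the intensity value is arbitrary; only the ratios matter):

```python
# Inverse square law for a point source: illuminance E = I / d^2.
# Doubling the distance quarters the light.

def illuminance_point(I, d):
    """Illuminance at distance d from a point source of intensity I."""
    return I / d**2

for d in (1.0, 2.0, 4.0):
    print(d, illuminance_point(100.0, d))
# 1.0 -> 100.0, 2.0 -> 25.0, 4.0 -> 6.25
```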

Now, here's the puzzling thing I need help in understanding:

When we meter a subject plane and decide on its correct exposure, it matters little whether we are close or far when we take the picture: the subject appears equally bright from either distance (atmospheric conditions aside).

I can imagine that the exposure for a picture of the moon's surface, taken from the moon, would be about the same as if it were taken from the earth, atmospherics aside. In other words, if the correct exposure on the moon's surface were determined to be f/11 at 1/500th of a second, it would not matter whether the picture were taken from the moon's surface, from sixty miles up in lunar orbit, or from earth orbit. The proper exposure for the surface of the moon would still be f/11 at 1/500th.

How is it that the inverse square law continues to work even though the subject-to-camera distance varies so widely? Surely the rays from every point in the subject plane are diverging. Why is the subject plane not dimmer with distance? Why, if we photograph a gray card from a foot away or from 500 feet away, is its exposure the same?

I know some of you folks must know this aspect of physics and optics, but it is confusing to me. Your help would be appreciated more than a little.

:bang:
 
Finder

Because the moon is not a point source; it is an extended source.

So suppose we photograph the moon from two distances (we have a spaceship). When the distance to the moon doubles, the image size halves. As we double the distance, the light reaching the lens falls off by a factor of four, but that light is focused into one quarter of the image area. So the image does not appear to get darker: each unit of area at the image plane has the same intensity of light projected onto it.
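Here is a quick numerical sketch of that cancellation in Python (the flux, aperture, and focal-length values are made up; only the trend matters):

```python
# Extended source: image-plane irradiance is independent of distance.
# The flux collected by the lens falls off as 1/d^2, but the image
# area of the subject also shrinks as 1/d^2, so flux per unit image
# area stays constant.

def image_irradiance(subject_flux, aperture_area, focal_length, distance):
    """Irradiance at the image plane from an extended subject.

    Uses the thin-lens approximation with the subject far away,
    so magnification ~ focal_length / distance.
    """
    collected = subject_flux * aperture_area / distance**2  # inverse square law
    magnification = focal_length / distance                 # image shrinks as 1/d
    image_area_ratio = magnification**2                     # area shrinks as 1/d^2
    return collected / image_area_ratio                     # the d^2 factors cancel

for d in (1.0, 2.0, 500.0):
    print(d, image_irradiance(subject_flux=1.0, aperture_area=0.01,
                              focal_length=0.05, distance=d))
# Prints the same irradiance at every distance -- which is why a gray
# card meters the same at 1 foot or at 500 feet.
```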

This does not work for point sources, because the image of a point source has no dimensions. Stars that are farther away do get darker.
 
Illumination:

For a point source: E = I/d^2 (inverse square law)

For a linear source: E = I/d

For an extended source: E is invariant with d

where E is the illuminance produced by the source, I is its intensity, and d is the distance.
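The same summary as a small Python helper (arbitrary but consistent units; "linear" means a long line source such as a fluorescent tube):

```python
def illuminance(I, d, source="point"):
    """Illuminance E at distance d from a source of intensity I.

    point:    E = I / d^2  (inverse square law)
    linear:   E = I / d    (long line source)
    extended: E = I        (invariant with distance)
    """
    if source == "point":
        return I / d**2
    if source == "linear":
        return I / d
    if source == "extended":
        return I
    raise ValueError(f"unknown source type: {source}")

for kind in ("point", "linear", "extended"):
    print(kind, illuminance(100.0, 2.0, kind))
```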
 
HankOsaurus

Thanks, Finder.

So suppose we photograph the moon from two distances (we have a spaceship). When the distance to the moon doubles, the image size halves. As we double the distance, the light reaching the lens falls off by a factor of four, but that light is focused into one quarter of the image area. So the image does not appear to get darker: each unit of area at the image plane has the same intensity of light projected onto it.

That put it very clearly to me. I really appreciate it.

:)
 