what ƒ-stop = "reality" ??

reub2000 said:
To compare the exposure wouldn't you have to define a shutter speed? But the eyes don't have a shutter speed, instead continuously capturing video. So a comparison would be hard, wouldn't it?


Well, we get persistence of vision in a cinema at between 14 and 18 frames per second, so it would be safe to suppose the “shutter speed”/system refresh rate is a little slower than that, don't you think?
 
Sparrow said:
Well, we get persistence of vision in a cinema at between 14 and 18 frames per second, so it would be safe to suppose the “shutter speed”/system refresh rate is a little slower than that, don't you think?

The vertebrate and mammalian visual system is not so simple as a camera or computer, and does not have a shutter or synchronization system for the rod and cone detector cells. The rod and cone detector cells do individually have a fundamental firing/neurotransmitter release frequency, and are turned OFF by photons. So their normal state is to say "I see black, I see black, I see black..." to the bipolar cell it synapses with. When a photon is absorbed the receptor cell turns off and the bipolar cell says "we have light." In aggregate, signals from the rods and cones are processed in the eye's retina by bipolar and horizontal cells that help with edge detection and contrast enhancement processing (like high acutance developers with edge effects or unsharp mask in Photoshop). More complex processing occurs in the visual cortex at the back of your mammalian head, such as shape recognition, horizontal vs. vertical orientation detection, movement detection, focus control etc.

Funnily enough the bipolar and horizontal cells are in front of the rods and cones in vertebrates, which does cause some light scattering and loss of photons, so evolution didn't give us the best design for low light, but the vertebrate retina layer structure might offer some protection for the rods and cones against overly bright light. Squid and octopus have a more practical placement of photosensitive cells at the front layer of the retina, which is good, because sea water absorbs a lot of light and the cephalopods have a big survival advantage by having good low-light sensitivity eyes.
 
SDK said:
The vertebrate and mammalian visual system is not so simple as a camera or computer, and does not have a shutter or synchronization system for the rod and cone detector cells. The rod and cone detector cells do individually have a fundamental firing/neurotransmitter release frequency, and are turned OFF by photons. So their normal state is to say "I see black, I see black, I see black..." to the bipolar cell it synapses with. When a photon is absorbed the receptor cell turns off and the bipolar cell says "we have light." In aggregate, signals from the rods and cones are processed in the eye's retina by bipolar and horizontal cells that help with edge detection and contrast enhancement processing (like high acutance developers with edge effects or unsharp mask in Photoshop). More complex processing occurs in the visual cortex at the back of your mammalian head, such as shape recognition, horizontal vs. vertical orientation detection, movement detection, focus control etc.

Funnily enough the bipolar and horizontal cells are in front of the rods and cones in vertebrates, which does cause some light scattering and loss of photons, so evolution didn't give us the best design for low light, but the vertebrate retina layer structure might offer some protection for the rods and cones against overly bright light. Squid and octopus have a more practical placement of photosensitive cells at the front layer of the retina, which is good, because sea water absorbs a lot of light and the cephalopods have a big survival advantage by having good low-light sensitivity eyes.

To add to your biological explanation, here are a couple of examples of why a shutter speed tends to be undefinable for the eyes.

In a theatre, due to the motion blur inherent in the movie and the low lighting, your eyes tend to see the flashing images as having no gaps; that is how your brain attempts to interpret the information.

How fast a shutter speed can you 'see'? Well, if you look out of the corner of your eye at a monitor set to 60 Hz (it rescans the screen every 1/60th of a second), you will notice the flashing. I can notice flashing at 72 Hz, but my eyes may vary from yours.

Shutter speed, though, does have some bearing on the situation. The eye is more like a movie camera than a still camera: you are constantly taking 'photos', and the latent images of these 'photos' persist in the image-processing centres of your brain. That latent image allows seamless movement from one 'frame' to the next. So we do have a 'shutter speed' of sorts, although it would be difficult to quantify.
 
Crasis, I agree. I'm not sure, but I imagine an analog NTSC video camera might be an even better analogy for vertebrate vision. In the NTSC standard, half the scan lines (every other row) are captured in one 1/60th of a second, the other half in the next 1/60th, and the alternating fields of interlaced lines are presented as moving images. In the eye, the cells are not synchronized (at least I have not found any references suggesting they are), so little packets of asynchronous information are constantly processed in the eye, and a simplified signal is sent to the visual cortex, where it is put back together as a continuous analog moving image. Your brain then extrapolates time from visual and sonic cues. Perception of rapid events is possible, but some visual cues persist for large fractions of a second, allowing TV and movie technology to simulate continuous motion in a relatively believable way.
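For anyone curious, the field timing described above can be sketched in a few lines of Python. This is a toy timing model only; the 60 Hz field rate and two-fields-per-frame numbers come from the NTSC description above, and everything else is illustrative:

```python
# Toy sketch of NTSC-style interlaced field timing (timings only,
# not real signal handling).
FIELD_RATE_HZ = 60       # one field (every other scan line) per 1/60 s
FIELDS_PER_FRAME = 2     # odd lines first, then even lines

field_period = 1.0 / FIELD_RATE_HZ               # seconds between fields
frame_period = FIELDS_PER_FRAME * field_period   # one complete interlaced frame

# Capture times of the first few fields, alternating odd/even scan lines
for n in range(4):
    parity = "odd" if n % 2 == 0 else "even"
    print(f"t = {n * field_period:.4f} s: capture {parity} scan lines")

print(f"full frame every {frame_period:.4f} s, i.e. {1 / frame_period:.0f} frames/s")
```

The point of the analogy: each field is a partial picture taken at a slightly different moment, yet the result looks like continuous motion, much as the eye's asynchronous packets do.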
 
reub2000 said:
To compare the exposure wouldn't you have to define a shutter speed? But the eyes don't have a shutter speed, instead continuously capturing video. So a comparison would be hard, wouldn't it?
Even continuous video cameras have a shutter speed. It might not make sense physically to say that the human eye has a shutter speed, but it might make more sense from a psychological perspective. If we say there is no shutter speed, then we can't analyze it at all. There would be no comparison to have between a camera and the human eye. I would think of it more as a useful illusion, not as a total misconception.
 
Sparrow said:
Well, we get persistence of vision in a cinema at between 14 and 18 frames per second, so it would be safe to suppose the “shutter speed”/system refresh rate is a little slower than that, don't you think?
It only appears to work because the movie has motion blur.
 
SDK said:
The vertebrate and mammalian visual system is not so simple as a camera or computer, and does not have a shutter or synchronization system for the rod and cone detector cells. The rod and cone detector cells do individually have a fundamental firing/neurotransmitter release frequency, and are turned OFF by photons. So their normal state is to say "I see black, I see black, I see black..." to the bipolar cell it synapses with. When a photon is absorbed the receptor cell turns off and the bipolar cell says "we have light." In aggregate, signals from the rods and cones are processed in the eye's retina by bipolar and horizontal cells that help with edge detection and contrast enhancement processing (like high acutance developers with edge effects or unsharp mask in Photoshop). More complex processing occurs in the visual cortex at the back of your mammalian head, such as shape recognition, horizontal vs. vertical orientation detection, movement detection, focus control etc.
It would seem, though, that our refresh/shutter speed is totally dependent on the neuron transmission speed within the body. So our mind gets a fresh image with every batch transmission of neurons, right? Or would you say this is a total wash?

Like I said, it might not make sense to literally call it a frame rate or shutter speed, but the idea is still useful for thinking about the eye and making comparisons.
 
Neither the human eye nor the camera represents reality accurately. Take perspective, for example. In reality objects do not get smaller because they are farther away. And don't give me any 'yeah, but...' nonsense. 🙂

Richard
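Both the eye and a camera do render distant objects smaller, and the geometry behind that is just similar triangles. A minimal Python sketch of an idealized pinhole projection, assuming a hypothetical 50 mm focal length and a 1.8 m tall subject (both numbers are illustrative, not from anyone's post):

```python
def projected_height(object_height_m, distance_m, focal_length_m=0.05):
    """Height of an object's image on the film/sensor plane of an
    idealized pinhole camera, by similar triangles."""
    return focal_length_m * object_height_m / distance_m

# A 1.8 m person imaged with a hypothetical 50 mm lens:
for d in (5, 10, 20):
    h = projected_height(1.8, d)
    print(f"at {d:2d} m -> image height {h * 1000:.1f} mm")
```

Doubling the distance halves the image height (18 mm at 5 m, 9 mm at 10 m, 4.5 mm at 20 m), which is the "smaller with distance" effect both the photograph and the retinal image share, whatever one decides that says about "reality".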
 
Yeah, Richard, but if that were true, early Japanese art would look realistic, with things at all distances given the same scale. Does it look realistic to you?
 
Dracotype said:
It would seem, though, that our refresh/shutter speed is totally dependent on the neuron transmission speed within the body. So our mind gets a fresh image with every batch transmission of neurons, right? Or would you say this is a total wash?

Like I said, it might not make sense to literally call it a frame rate or shutter speed, but the idea is still useful for thinking about the eye and making comparisons.

It's more like you get little pieces of images continuously and asynchronously, which are then put together into a continuous "movie" by your visual cortex. There is no internal clock syncing the nerves as in a computer, and no 60 Hz reference as in an analog video camera. However, there is a limit to visual perception of rapid events that is based on the fundamental speed of the neurons and synapses involved. Humans can't easily perceive the time between two events if it's shorter than about 80 milliseconds, which is 1/12.5 of a second. That's not to say you won't notice some flicker at 12.5 Hz, and even faster as Crasis noted with computer screens, because some neurons are faster and some are slower, and some are firing all the time as the screen goes from black to image. Crisp edges enhance the ability to notice the flicker in films shot with high shutter speeds but still shown at 24 frames per second (like the swords in the fight scenes in the movie Gladiator, though I think Ridley Scott and Co. enhanced the effect by duplicating some frames and cutting others).

A very nice, concise web page on human vision is at http://www.du.edu/~jcalvert/optics/colour.htm.
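The arithmetic in that post is easy to check. A quick Python sketch, taking the post's roughly 80 ms figure as given (it is that poster's estimate, not a measured constant):

```python
# The ~80 ms perception window is the estimate from the post above.
PERCEPTION_WINDOW_MS = 80.0

# Expressed as an equivalent rate: 1 / 0.080 s = 12.5 "frames" per second
equivalent_rate_hz = 1000.0 / PERCEPTION_WINDOW_MS
print(f"~{equivalent_rate_hz:.1f} Hz equivalent rate")

# Compare display technologies mentioned in the thread
for name, rate_hz in [("film projection", 24), ("monitor at 60 Hz", 60), ("monitor at 72 Hz", 72)]:
    period_ms = 1000.0 / rate_hz
    relation = "shorter" if period_ms < PERCEPTION_WINDOW_MS else "longer"
    print(f"{name}: {period_ms:.1f} ms per cycle ({relation} than the 80 ms window)")
```

All three cycle times come in well under 80 ms, which fits the post's point: the flicker people still notice at those rates is down to the faster neurons and the hard black-to-image edges, not the basic event-timing limit.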

 