Eye Dimensions, Data and Statistics
Here are a set of dimensions, statistics and numbers about the eye.
Approximate dimensions in the eye include:
Given that the rods receive only black-and-white signals, and that colour perception comes only from the cones, which are mostly in the fovea, we only see colour in our central area of focus. The brain very kindly fills in the illusion of colour across the rest of our field of vision. A similar effect happens with the even more densely packed cones in the foveola, where we see high-resolution colour.
About half the nerve fibres in the optic nerve carry foveal information. The rest cover the whole of the remainder of the retina.
Rods detect luminosity, not colour: they are 'black and white' sensors. Cones detect colour.
There are about 120 million rods, which are about 100 times more sensitive to light than cones (which makes them useful for night vision and explains why things at night seem to lose their colour).
There are about 7 million cones.
The angle range that the eye can see approximately:
With both eyes working together, we can see approximately:
Cameras are measured in megapixels (Mpx). The eye actually works in angle, which is why we cannot see detail at a distance; it can resolve to about 1/100th of a degree. A picture at arm's length in front of us that filled our field of view would need about 576Mpx.
File size would depend on colour depth; the eye does significantly better than cameras here. Assuming 16 bits per RGB channel (48 bits, or 6 bytes, per pixel), this means about 3,456MB per image. We resolve time to about 1/10th of a second, which seems to mean around 34GB of data per second! In reality we cope with far less: our eyes actually capture far less, and the brain fools us by filling in the gaps. The fovea uses about 7Mpx, with the whole of the rest of the eye needing only 1Mpx.
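As a rough check on the arithmetic, here is the common version of this estimate in Python. The 576Mpx figure corresponds to a 120° by 120° field at about 0.3 arc-minutes per pixel, a finer value than the 1/100th of a degree quoted above; all of these numbers are illustrative assumptions.

```python
# All values here are illustrative assumptions, not measurements.
fov_deg = 120          # assumed field of view per axis, in degrees
res_arcmin = 0.3       # assumed resolution per pixel, in arc-minutes

pixels_per_axis = fov_deg * 60 / res_arcmin     # 24,000 pixels per axis
megapixels = pixels_per_axis ** 2 / 1e6         # ~576 Mpx

bytes_per_pixel = 6    # 16 bits x 3 channels = 48 bits = 6 bytes
image_mb = megapixels * bytes_per_pixel         # ~3,456 MB per image

frames_per_second = 10                          # ~1/10th second time resolution
gb_per_second = image_mb * frames_per_second / 1000

print(round(megapixels), round(image_mb), round(gb_per_second, 1))  # 576 3456 34.6
```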
The data rate is the speed at which information is passed. There are two data rates for the eye: the arrival rate from the world and the transmission rate to the brain.
Data arrives at about 100MB per second. If this is translated into 8-bit sRGB, where each of red, green and blue takes 8 bits, then it represents about 30 megapixels of information.
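The pixel conversion can be sketched as follows, reading the arrival figure as megabytes per second, which is the reading under which the megapixel arithmetic works out:

```python
# Assumption: the ~100 arrival figure is 100 megabytes per second.
bytes_per_second = 100e6
bits_per_pixel = 24     # 8-bit sRGB: 8 bits each for red, green and blue
pixels_per_second = bytes_per_second * 8 / bits_per_pixel
megapixels = pixels_per_second / 1e6
print(round(megapixels, 1))   # ~33.3, i.e. roughly 30 megapixels
```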
The retina can transmit about 10 million bits per second to the brain. The effective total amount of information is more than this, as the eye uses forms of compression and allows the brain to fill in detail. Neurons are capable of firing once per millisecond, but this speed is not achieved, simply because of the energy it would require. The brain is 2% of our body mass but uses 20% of the body's energy. Increasing visual information would be nice, but it is simply too expensive.
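These two rates give a rough sense of the compression implied, under the same assumed figures (roughly 100MB/s arriving, 10 million bits/s transmitted):

```python
# Assumed figures: ~100 MB/s arriving at the eye,
# ~10 million bits/s transmitted down the optic nerve.
incoming_bits = 100e6 * 8
outgoing_bits = 10e6
compression_ratio = incoming_bits / outgoing_bits
print(compression_ratio)   # 80.0, i.e. roughly 80:1
```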
The blind spot is at the back of the eye, where the optic nerve fibres gather to be fed back to the brain. There are no vision cells there and we cannot see in that area, though the brain kindly fills in the gap in our perceived field of vision.
When looking towards the horizon, it is:
Size-wise, it is:
The eye has a static contrast ratio of about 100 to 1. This equates to about 6.5 f-stops on a camera.
The eye can detect a luminance range of one hundred trillion (about 46.5 f-stops). This ranges from one millionth of a candela per square meter to one hundred million candelas per square meter.
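Since each f-stop is a doubling of light, a ratio R corresponds to log2(R) stops. A quick check shows the hundred-trillion range does give about 46.5 stops, while 6.5 stops corresponds to a contrast ratio of only about 91:1:

```python
import math

# Each f-stop is a doubling of light, so stops = log2(contrast ratio).
dynamic_ratio = 1e14                      # one hundred trillion
dynamic_stops = math.log2(dynamic_ratio)  # ~46.5

# Working backwards, 6.5 stops corresponds to a ratio of only ~91:1:
static_ratio = 2 ** 6.5
print(round(dynamic_stops, 1), round(static_ratio))   # 46.5 91
```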
The dimmest light that can be detected across a wide field of view is one millionth of a candela per square meter.
The hyperfocal distance of a camera lens is the shortest focus distance at which the horizon is also in focus. For the eye, this is about 6m. In other words, if you look at something 6m away, everything out to the horizon is in focus, but if you look at things closer, the horizon will not be in focus. When you look really close, much of what lies beyond your focus point is out of focus. This is because the eyes are relaxed at 6m, but increasingly converge as you focus closer and closer.
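For comparison, the standard photographic hyperfocal formula is H = f²/(N·c) + f. A sketch with illustrative eye-like values (all of them assumptions, not measurements) lands in the region of the 6m figure:

```python
# Hyperfocal distance: H = f^2 / (N * c) + f.
# The eye-like values below are illustrative assumptions, not measurements.
f_mm = 17.0     # assumed focal length of the eye
N = 3.4         # assumed f-number for a ~5 mm pupil (17 / 5)
c_mm = 0.015    # assumed acceptable circle of confusion on the retina

H_m = (f_mm ** 2 / (N * c_mm) + f_mm) / 1000
print(round(H_m, 1))   # ~5.7 m, in the region of the ~6 m quoted
```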
The brain's process of seeing is highly energy-intensive, taking around 40% of the brain's at-rest use of calories. This is partly because something like 60% of the brain is involved in interpreting visual signals. Having said this, much of this part of the brain is also used for other functions; the percentage of the brain that is dedicated to full-time vision is more like 20%.
Overall, when something happens, the eye has a response time of about 100 milliseconds: our brain recognizes what we see a tenth of a second after it happened. This would seem to be a problem for tasks such as catching balls and synchronizing conversation, but our brains helpfully iron all this out so it seems as if we see things instantaneously, as they happen.
Regarding the speed at which people can see things, we can interpret visual cues in as little as 13 milliseconds. Our best reaction time for tasks such as catching balls and pressing buttons, however, is about 250 milliseconds (a quarter of a second).
Game designers often design for an ideal latency of 50ms; 100ms is considered noticeable and 300ms unplayable.
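The thresholds above can be summarized as a small sketch; the band names here are this example's own labels, not standard industry terms:

```python
def perceived_latency(latency_ms: float) -> str:
    """Classify input latency using the game-design thresholds quoted
    above; the band names are this sketch's own, not standard terms."""
    if latency_ms <= 50:
        return "ideal"
    elif latency_ms < 100:
        return "acceptable"
    elif latency_ms < 300:
        return "noticeable"
    else:
        return "unplayable"

print(perceived_latency(40), perceived_latency(120), perceived_latency(350))
```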
When interacting with computers, any response time of less than a second seems pretty much instantaneous. Anything more than two seconds is likely to start losing the user's attention.
These figures give some useful data about the abilities and limitations of the eye. They are of interest when considering photography and the perception of visual images.
Note also that the figures given come from a range of sources, some of which could be challenged. However, they do at least give a useful indication of the amazing information processing that the eye carries out in real time.