Sight Is More than Having Eyeballs
The brain works integrally with the eyes to make vision meaningful and responsive.
A computer is useless without software. In the same way, the “hardware” of the eyes does not see anything. It’s the integration of the eyes with the brain and with programmed processes that allows us to make sense of the visual world—including reading this article right now. Here are a few of the fascinating ways this interaction works, as revealed in new scientific discoveries.
Circuit in the Eye Relies on Built-in Delay to See Small Moving Objects (Science Daily). A programmed delay keeps us from getting overwhelmed with detail when moving our heads. “When we move our head, the whole visual world moves across our eyes,” the article says. “Yet we can still make out a bee buzzing by or a hawk flying overhead, thanks to unique cells in the eye called object motion sensors.” The programmed delay ensures that “information from the central field of view and from the periphery arrive at the object motion sensor at the same time.” All this involves specific cells, proteins, and genes at the molecular level, and it is why you can tell a slowly flying hawk apart from the clouds drifting behind it.
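The comparison described here can be pictured with a toy sketch. The Python below is only an illustration under assumed latencies, not the circuit the paper reports (the real system works through cells, proteins, and genes, not lists): delaying the faster peripheral signal lets the object motion sensor compare center and surround at the same instant, staying silent when the whole scene moves (a head turn) and firing when a small object moves against the background.

```python
# Toy illustration of delay-line alignment in an "object motion sensor".
# All latencies and signals are hypothetical, chosen only to show the idea.

CENTER_LATENCY = 3     # assumed processing delay (time steps) for the central signal
PERIPHERY_LATENCY = 1  # the peripheral signal would arrive sooner...
BUILT_IN_DELAY = CENTER_LATENCY - PERIPHERY_LATENCY  # ...so it is delayed to match

def object_motion_sensor(center_motion, periphery_motion):
    """Fire only when central motion differs from the delay-aligned peripheral motion."""
    # Pad each stream so that samples line up by arrival time at the sensor.
    center_arrivals = [None] * CENTER_LATENCY + list(center_motion)
    periphery_arrivals = [None] * (PERIPHERY_LATENCY + BUILT_IN_DELAY) + list(periphery_motion)

    spikes = []
    for c, p in zip(center_arrivals, periphery_arrivals):
        if c is None or p is None:
            spikes.append(False)          # nothing to compare yet
        else:
            # Matching motion = whole scene moving (head turn): stay silent.
            # Mismatched motion = object moving against the background: fire.
            spikes.append(c != p)
    return spikes

# Head turn: center and periphery see the same motion, so the sensor stays quiet.
print(object_motion_sensor([1, 1, 1, 1], [1, 1, 1, 1]))
# A "bee" crosses the center while the background is still, so the sensor fires.
print(object_motion_sensor([1, 1, 0, 0], [0, 0, 0, 0]))
```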
Neuroscience: Tiny Eye Movements Link Vision and Attention (Current Biology). You try to hold a camera steady, but your eyes are not like cameras. “A new study shows that the tiny eye movements we make while holding our gaze on a point of interest are associated with brief, attention-like changes in the sensitivity of visual neurons,” Adam P. Morris writes. These movements are, in fact, essential to vision, even though we are unaware of them.
Textbooks sometimes use the analogy of a camera to teach students about human vision. Although the analogy has value, it encourages the false notion that our brain constructs our visual experience from still images of the outside world. The brain’s cameras — the eyes — are never truly stationary, even when we feel that our gaze is locked on a point in the visual scene. As a result, the input to the brain is a jerky, drifting, and disjointed image stream. How does the brain make sense of this input? A study by Chen et al. published recently in Current Biology suggests that a class of tiny eye movements known as ‘microsaccades’ are closely linked with mechanisms that prioritize how visual information is processed over space and time. Recording from single neurons in alert macaque monkeys, the authors show that neurons in the frontal eye fields and superior colliculus become especially sensitive to visual input just before the onset of these tiny eye movements (Figures 1A,B). Moreover, this enhancement is spatially specific — albeit coarsely — such that the region of the visual field that is prioritized depends on the direction of the eye movement (Figures 1C,D). These changes in visual sensitivity resemble those seen in experiments that manipulate visual attention. This suggests that, even at very fine temporal and spatial scales, sensory and oculomotor systems act in concert to coordinate visual processing.
Morris has no explanation for how this arose, except to say in passing that it makes for a nice “evolutionary strategy for vision in primates” (see “sophoxymoronia” in the Darwin Dictionary).
How the brain can stop action on a dime (Science Daily). Don’t take for granted your ability to slam on the brakes when you see a red light. Rapid response is critical for everyday function. Neuroscientists at Johns Hopkins identified the part of the brain that makes this possible: the basal forebrain, a region mostly known for regulating sleep. Counterintuitive as that sounds, the researchers pinpointed the area during experiments with rats offered rewards in response to flashing lights. “Understanding how these cells are involved in this form of self-control expands our knowledge of the normal brain circuits involved in everyday decision-making,” one of the researchers explained, “and will be absolutely critical to developing future treatments and therapies for diseases and disorders with impaired reactive inhibition as a symptom.”
Shifts of Gamma Phase across Primary Visual Cortical Sites Reflect Dynamic Stimulus-Modulated Information Transfer (PLoS Biology). This technical paper relates “complex and flexible behavior” to the individual neurons that fire in the visual cortex. “By recording neural activity and measuring information flow between multiple locations in visual cortex during the presentation of Hollywood movies,” they say, “we found that the arrangement of the phase of gamma oscillations at different locations indicated the presence of waves propagating along the cortical tissue.” As they watched the watchers, they found that “the propagation of gamma oscillations may reconfigure dynamically the directional flow of cortical information during sensory processing.”
We’ve all got a blind spot, but it can be shrunk (Current Biology). Perhaps you’ve done the finger experiments that reveal your blind spot. Since the spot where the optic nerve exits the retina contains no rods or cones, a small portion of the visual field goes undetected there. You would be aware of that hole in your vision if your brain did not “fill in” the spot with details matching what the surrounding retina detects. For what it’s worth, Australian researchers found that you can reduce the size of the blind spot by about 10% with training, but it only works with one eye at a time. What’s worthwhile, Science Daily notes, is that similar training might help those with macular degeneration partially compensate for lost vision.
Attentive tracking of sound sources (Current Biology). Just as the brain can make the eyes focus on something interesting in the visual field, it can tune the ears to pay closer attention to a sound in a noisy environment. We’ve all experienced the “cocktail party problem,” trying to focus on a friend’s voice in a noisy room. Researchers found that even when nothing about a sound is kept constant (timbre, pitch, or semantics), participants were able to “track sound sources through feature space with a movable focus of attention.”
Update 9/23/15: Like a foreman, brain region keeps us on task (Science Daily). “If you sometimes feel like you have a little foreman in your head who keeps you on track while you work step-by-step through a sequence of tasks, you aren’t far off,” this article begins. “In new research, Brown University scientists report evidence that a particular part of the brain is responsible for exactly that function.” This everyday task of internally monitoring our inputs is routed through a network called the “rostrolateral prefrontal cortex (RLPFC), an area of neurons situated in the front of your brain.” It’s the first step in cognitive control of our actions. “The health consequences are big” when this area is disrupted.
“The seeing eye, and the hearing ear, the Lord has made them both” (Proverbs 20:12).
Do you see why studying creation in detail promotes worship of the Creator who designed these things? That’s why we report them. They’re not just sterile facts. They’re realities that call for a response. The right response is exhibited in Psalm 104, Psalm 111:1-4, and Psalm 139. The wrong response is to be unthankful in spite of the clear evidence for design (Romans 1:18-21).
Jeremiah condemned “foolish and senseless people who have eyes, but see not, who have ears, but hear not” (Jeremiah 5:21, cited by Jesus in Mark 8:18). Eyes were created for seeing (and understanding); ears were made for hearing (and perceiving) the truth about God (Isaiah 6), because their design points to Him.
You can respond rightly today. Come to the light (John 3:16-21). Repent of your unthankfulness. Open your eyes, and glorify God for your body, and with your body. Then the light will fill your whole being with joy (Matthew 6:22-23).
Heaven above is softer blue,
Earth around is sweeter green;
Something lives in every hue
Christless eyes have never seen:
Birds with gladder songs o’erflow,
Flow’rs with deeper beauties shine,
Since I know, as now I know,
I am His, and He is mine.
—George Wade Robinson