December 17, 2012 | David F. Coppedge

Eye Retina Is Analog-to-Digital Converter

Before you see, your retina has done multiple digital transformations on the incoming signal.

A new study by ophthalmologists at the University of Tübingen, Germany, has confirmed that the eye goes digital.  Bipolar cells in the retina, long thought to pass only continuous analog signals on to the retinal ganglion cells, have been shown to generate action potentials, or spikes, that represent on-or-off conditions (the basis of digital encoding).  The paper in Current Biology is technical; the interesting part is in the university’s press release, entitled “The end of a dogma: Bipolar cells generate action potentials.”  The article explains the advantages of digital processing:

Action potentials allow for much faster and temporally more precise signal transmission than graded potentials, thus offering advantages in certain situations.

Even more amazing is how the eye massages its digitized information for the brain.  The description sounds like a computer system:

The retina in our eyes is not just a sheet of light sensors that – like a camera chip – faithfully transmits patterns of light to the brain. Rather, it performs complex computations, extracting several features from the visual stimuli, e.g., whether the light intensity at a certain place increases or decreases, in which direction a light source moves or whether there is an edge in the image. To transmit this information reliably across the optic nerve – acting as a kind of a cable – to the brain, the retina reformats it into a succession of stereotypic action potentials – it “digitizes” it. Classical textbook knowledge holds that this digital code – similar to the one employed by computers – is applied only in the retina’s ganglion cells, which send the information to the brain. Almost all other cells in the retina were believed to employ graded, analogue signals. But the Tübingen scientists could now show that, in mammals, already the bipolar cells, which are situated right after the photoreceptors within the retinal network, are able to work in a “digital mode” as well.
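The “digitizing” the press release describes – turning a smoothly varying graded potential into all-or-none spikes – can be sketched as a simple threshold-crossing spike generator. This is only a cartoon of the idea, not a model of the actual bipolar-cell mechanism; the signal values and threshold are made up for illustration:

```python
def spike_train(graded_signal, threshold=0.5):
    """Toy 'digitization': emit a spike (1) each time the graded
    (analog) signal crosses the threshold from below, else 0.
    A cartoon of action-potential generation, not a retina model."""
    spikes = []
    above = False  # tracks whether we are currently above threshold
    for v in graded_signal:
        if v >= threshold and not above:
            spikes.append(1)   # threshold crossed: fire a spike
            above = True
        else:
            spikes.append(0)   # sub-threshold, or already fired
            if v < threshold:
                above = False  # reset once the signal falls back
    return spikes

# A slowly varying "graded potential" becomes a sparse all-or-none code.
signal = [0.1, 0.3, 0.6, 0.7, 0.4, 0.2, 0.8, 0.9, 0.3]
print(spike_train(signal))  # → [0, 0, 1, 0, 0, 0, 1, 0, 0]
```

The point of the cartoon is the one the press release makes: the output carries only discrete, stereotyped events, which can be timed precisely and transmitted reliably, at the cost of throwing away the graded amplitude.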

The researchers were able to classify the bipolar cells’ output patterns into at least eight classes.  “Therefore, the systematic projection pattern of BCs provides distinct temporal ‘building blocks’ for the feature extracting circuits of the inner retina,” the abstract of the paper states.  Could this be analogous to bits being organized into bytes for a kind of visual code?  If nothing else, it improves the reliability of the signal: “To make information transmission to the brain reliable, the retina first has to ‘digitize’ the image,” the press release began.
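The bits-into-bytes analogy raised above can be made concrete with a few lines of code: a stream of on/off spikes grouped into fixed-width words. This is purely an illustration of the analogy, not a claim about how the retina actually packages its signals:

```python
def pack_bits(bits, width=8):
    """Group a stream of on/off spikes into fixed-width 'words',
    illustrating the bits-to-bytes analogy from the article.
    Not a description of actual retinal coding."""
    return [bits[i:i + width] for i in range(0, len(bits), width)]

# Twelve spikes become one full 8-bit "byte" plus a partial word.
spikes = [1, 0, 1, 1, 0, 0, 1, 0, 0, 1, 1, 0]
print(pack_bits(spikes))
# → [[1, 0, 1, 1, 0, 0, 1, 0], [0, 1, 1, 0]]
```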

Why is this the end of a dogma?  Action potentials had been noticed before, but were considered rare exceptions.  “The results from Tübingen call a widely held dogma of neuroscience into question – and open up many new questions,” the press release ended.

As could be predicted, none of these articles mentioned evolution.  How could they, when talking about computer cables and digital codes?

There are two take-home messages from this paper.  One is that the closer science looks at life, the more intelligently designed it appears.  The autonomous bipolar cells cannot “know” what the brain needs.  They had to be pre-programmed to send the most valuable information so that the brain can respond appropriately.  They have to “format” the signal, using a code the brain can understand.  And they have to get it there fast; that’s why digital is the way to go.  The exquisite interaction of parts here, sending digitally-encoded information down a “cable” of sorts, is really mind-boggling.  It’s not just a camera chip; it’s a whole Photoshop!

The other take-away message is that we don’t really see what’s “out there” in the world as it really is.  What’s out there are just photons bouncing off objects.  Before we see what we think we’re seeing, a long series of intermediaries (the cornea, the aqueous humor, the lens, the vitreous humor, the rods and cones, the bipolar cells, the retinal ganglion cells, the optic nerve, and who knows what else) has massaged, transformed and formatted the signal from the initial impingement of photons on the cornea.  Philosophers can have fun with this (see David Chalmers discuss the “hard problem” of consciousness on Evolution News & Views).  Creation apologists can also use this information as a response to skeptics who say they only believe what they see with their own eyes.





  • mmartin says:

    This is not just analogous to a dumb analogue-to-digital converter – this is more akin to something like an H.264 encoder engine right on the retina. But unlike these flawed human approaches, this über-tech does not produce any discernible visual artifacts.
    Horrible times to be an evolutionist.

  • rockyway says:

    1. ‘As could be predicted, none of these articles mentioned evolution. How could they, when talking about computer cables and digital codes?’

    – Reality is proving too much for Darwinian language to handle. If the grand theory of M2M evolution were correct, we need to ask why it is that the theory has no adequate way to talk about what researchers are discovering.

    2. ”The retina in our eyes is not just a sheet of light sensors that – like a camera chip – faithfully transmits patterns of light to the brain. Rather, it performs complex computations…”

    – If I remember correctly, the Dawkins critique of the eye depended on the pretense that the eye was a camera, or could be compared to one: since it wasn’t designed as a camera, it was badly designed. His now obsolete critique was based on a primitive and ignorant idea of what the eye is.

    I can’t resist adding that we have here an example of how discoveries in biology are shedding light on Design.
