[image: sally3.gif, blingee by L.M.]

Neuroscientist Gerald Edelman (who works on robotic consciousness) explains why brains and computers are not analogous.
To function, a computer must receive unambiguous input signals. But signals to various sensory receptors of the brain are not so organized; the world (which is not carved beforehand into prescribed categories) is not a piece of coded tape.
Gerald Edelman, Second Nature: Brain Science and Human Knowledge (2006), p. 21.

- sally mckay 7-08-2010 3:23 pm

For some values of "computer". At the level of individual neurons, a brain is as unambiguous as a (digital) computer: a neuron is either on or off. It's a question of granularity and quantity. Throw enough bits at any input and you can render the world in far more shades of grey than the ol' brain, and organize that data at whatever level of ambiguity you like. A cruise missile does not see the world as "coded tape" any more than a bullfrog does. Which is *not* to say that the missile "thinks" in the way the brain does, but that old "ones and zeroes" nonsense belongs in old sci-fi movies. The ones where the computer explodes if you feed a paradox into the punched tape.
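Here's a toy sketch of the granularity point, for what it's worth (Python, with a made-up signal): quantize the same smooth input at different bit depths and count how many "shades of grey" survive. The coarseness is in the encoding you pick, not in digital encoding itself.

    import math

    def quantize(x, bits):
        # Map a value in [0, 1] to the nearest of 2**bits discrete levels.
        levels = 2 ** bits - 1
        return round(x * levels) / levels

    # A smooth, "ambiguous" input: not carved beforehand into categories.
    signal = [0.5 + 0.5 * math.sin(2 * math.pi * t / 100) for t in range(100)]

    for bits in (1, 4, 16):
        coded = [quantize(s, bits) for s in signal]
        print(f"{bits:2d} bits -> {len(set(coded))} distinct grey levels")

One bit gives you the cartoon "ones and zeroes" world; sixteen bits already resolves every distinct value in that signal.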
- rob (guest) 7-14-2010 4:27 am


Thanks for this clarification, Rob. I think Edelman would agree with you (robotic consciousness is his business after all). The point I'm trying to make here is not that computers are limited and reductive but that they don't work very well as models for human consciousness. I think the brain still has a lot to offer computer science, but the usefulness of the analogy has almost run its course for consciousness studies.

That said, I might quibble a bit with the idea that neurons can be described as simply digital. For one thing, there's a lot of chemistry going on. More importantly, there are too many factors influencing consciousness besides structure to take into account, the big one being the ongoing, infinite variability of interaction with the environment (including cultural knowledge). This means that you can't simulate human consciousness without simulating the world the human has evolved into (and a human body to interface with it). Edelman goes on to say:

Second, the brain order that I have briefly described is enormously variable at its finest levels. As neural circuits develop, variant individual experiences leave imprints such that no two brains are identical, even those of identical twins. This is so in large measure because, during the development and establishment of neuroanatomy, neurons that fire together wire together. Furthermore, there is no evidence for a computer program consisting of effective procedures that would control a brain's input, output, and behaviour. Artificial intelligence doesn't work in real brains. There is no logic and no precise clock governing the outputs of our brains no matter how regular they may appear.
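The "fire together wire together" part, at least, has a standard toy form: a Hebbian weight update, where the connection between two neurons strengthens whenever they're active at the same time. A rough sketch in Python, with made-up rates:

    import random

    # Toy Hebbian rule: the connection between two units strengthens
    # whenever they are active at the same time. Numbers are illustrative.
    def hebbian_step(w, pre, post, rate=0.1):
        return w + rate * pre * post

    w = 0.0
    for _ in range(100):
        pre = random.choice([0, 1])
        post = pre if random.random() < 0.8 else 1 - pre  # mostly co-active pair
        w = hebbian_step(w, pre, post)

    print(f"weight after 100 correlated firings: {w:.2f}")

Which pairs fire together depends on individual experience, which is part of why no two brains come out alike.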
While it makes sense in theory that with enough granularity and quantity you could simulate human consciousness, in practice it's not feasible. The problem is maybe a bit like Borges' map: the only fully faithful map of the territory ends up the same size as the territory.

- sally mckay 7-14-2010 3:21 pm


I used to love those old sci-fi movies where the computer-robot head asplodes when confronted with paradox.

Like that Star Trek: The Next Generation episode famous for the phrase "ugly bags of mostly water."

(VB via SM)
- sally mckay 7-14-2010 7:36 pm





I never suggested for a moment that it was feasible to simulate consciousness, just that the argument that "computers can't deal with ambiguous signals" is a weak and old-fashioned way of looking at computers. Just watch those creepy "little dog" robot videos if you have any doubts about that. The problem is not that nerve cells aren't like chips (as you say, there is a bit more going on, such as the difference between action potentials and graded potentials); that stuff is well understood at the neuron level, and has been modeled for decades.
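For the record, the decades-old textbook version is something like a leaky integrate-and-fire neuron: the membrane potential is a continuous, graded quantity, and only the spike is all-or-nothing. A rough sketch (Python, parameters made up, not physiological):

    def simulate_lif(current, dt=0.1, tau=10.0, v_rest=0.0,
                     v_thresh=1.0, v_reset=0.0):
        # Leaky integrate-and-fire: continuous membrane dynamics, discrete spikes.
        v = v_rest
        trace, spikes = [], []
        for step, i_in in enumerate(current):
            # Graded part: v leaks toward rest and integrates the input current.
            v += dt * (-(v - v_rest) + i_in) / tau
            if v >= v_thresh:            # all-or-nothing part: the spike
                spikes.append(step * dt)
                v = v_reset              # reset after firing
            trace.append(v)
        return trace, spikes

    # Constant drive strong enough to produce periodic firing.
    trace, spikes = simulate_lif([1.5] * 1000)
    print(f"{len(spikes)} spikes; the potential varies continuously in between")

So the on/off story and the "more going on" story are both right, just at different levels: graded underneath, digital at the spike.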
The problem is not that we can't build a computer brain, it's that we can't really frame what a "computer" is. The brain *is* a machine. To say anything else is to invoke magic. Just because we have no idea how to hook it up and program it doesn't mean that there's a ghost in there, just that we're too stupid to figure out the manual. And I agree, we might need a better tool than the one we're trying to replicate to solve the problem.
- rob (guest) 7-15-2010 5:03 pm


..."the argument that 'computers can't deal with unambiguous signals' is a weak and old-fashioned way of looking at computers."

Point taken! I see what you're saying.

And just to be clear, I am 100% onside with the demystifying, no-magic approach to brains. It's all normal matter and material processes. But there are other words besides "machine," like "organism," that have non-Cartesian implications. And "machine" itself is hardly a neutral term, since it so easily invokes a whole science-fiction buzz factor.
- sally mckay 7-15-2010 6:52 pm




