Wednesday, August 09, 2006


The most interesting field of debate these days is the one about the nature of consciousness. As I’ve said often enough, I can almost imagine that I’m a robot (at the deepest chemical level) and that consciousness is no more than the brain “hearing” the sense-based synapses talk among themselves, specifically through the auditory and visual systems, since some people “think” in images and others in language constructs. Consciousness is just an additional level of abstraction. The following paragraphs are by Richard Dawkins in The Selfish Gene, p. 59. He presents his idea of how consciousness evolved, with important reservations and a scientist's caution, of course.

“What about simulation? Well, when you yourself have a difficult decision to make involving unknown quantities in the future, you do go in for a form of simulation. You imagine what would happen if you did each of the alternatives open to you. You set up a model in your head, not of everything in the world, but of the restricted set of entities which you think may be relevant. You may see them vividly in your mind's eye, or you may see and manipulate stylized abstractions of them. In either case it is unlikely that somewhere laid out in your brain is an actual spatial model of the events you are imagining. But, just as in the computer, the details of how your brain represents its model of the world are less important than the fact that it is able to use it to predict possible events. Survival machines that can simulate the future are one jump ahead of survival machines who can only learn on the basis of overt trial and error.

“The evolution of the capacity to simulate seems to have culminated in subjective consciousness. Why this should have happened is, to me, the most profound mystery facing modern biology. There is no reason to suppose that electronic computers are conscious when they simulate, although we have to admit that in the future they may become so. Perhaps consciousness arises when the brain's simulation of the world becomes so complete that it must include a model of itself. Obviously the limbs and body of a survival machine must constitute an important part of its simulated world; presumably for the same kind of reason, the simulation itself could be regarded as part of the world to be simulated. Another word for this might indeed be 'self-awareness', but I don't find this a fully satisfying explanation of the evolution of consciousness, and this is only partly because it involves an infinite regress—if there is a model of the model, why not a model of the model of the model. . . ?”
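Dawkins's distinction between simulating alternatives and learning by overt trial and error can be sketched in a few lines of code. The "world model", the actions, and the goal below are all hypothetical illustrations of mine, not anything from the book: the agent imagines the outcome of each alternative using its internal model, then acts on the best prediction rather than trying each action for real.

```python
# A minimal sketch of the "simulation" idea: instead of acting by
# overt trial and error, the agent runs each alternative through an
# internal model of the world and picks the best predicted outcome.
# The toy dynamics, actions, and goal here are invented for illustration.

def world_model(state, action):
    """The agent's internal (imperfect) model: predicts the next state."""
    return state + action  # toy dynamics: an action shifts a scalar state

def value(state, goal):
    """How desirable a predicted state is (closer to the goal is better)."""
    return -abs(goal - state)

def choose_by_simulation(state, actions, goal):
    """Imagine each alternative in the model, then act on the best one."""
    return max(actions, key=lambda a: value(world_model(state, a), goal))

best = choose_by_simulation(state=0, actions=[-2, -1, 1, 2], goal=3)
print(best)  # prints 2: the action whose simulated outcome lies nearest the goal
```

The survival advantage Dawkins describes is exactly the gap between this loop and a trial-and-error learner: the simulating agent pays only the (cheap, internal) cost of running its model, while the trial-and-error agent must survive the real consequences of every bad alternative it tries.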
