Last fall I attended a conference at NYU called “Animal Consciousness.” It should have been called “Animal Consciousness?” to reflect the uncertainty pervading the meeting. Speakers disagreed over when and how consciousness evolved, what is required for it to occur and what creatures have it.
In a previous column, I focused on the debate over whether fish can suffer. Scholars also considered the sentience of lampreys, spiders, crustaceans and other species. Speakers presented evidence that creatures quite unlike us are capable of complex cognition. To convince us that octopuses are conscious, philosopher Peter Godfrey-Smith showed us a video of them goofing around while a pufferfish watched, seemingly out of pure curiosity.
Biologist Andrew Barron argued that bees, in spite of their minuscule brains, are not mindless automatons. Their capacity for learning rivals that of mammals. When harmed, bees stop eating and foraging as if they were depressed. Bees, Barron concluded, are conscious. Philosopher Peter Singer considered whether cockroaches and bedbugs can suffer, and if so whether we should treat them more humanely. At the other extreme, psychologist Stuart Derbyshire questioned whether even dogs are conscious.
Looming over these disputes is the solipsism problem. I know I am conscious, but I can’t be absolutely sure that anything else is, because I have access only to my own subjective experience. I’m pretty confident that you and other humans are conscious, because we’re so similar. But my confidence in the consciousness of non-human things diminishes in proportion to their dissimilarity from me.
As long as we lack a solution to the solipsism problem, theories of consciousness will be unconstrained and hence wildly divergent. Integrated information theory, a leading explanation of consciousness, implies that consciousness pervades all matter, including non-living things, as decreed by the ancient mystical doctrine of panpsychism.
At the other extreme are so-called eliminative materialists, who question whether anything is really conscious, including humans. An advocate of this position is philosopher Daniel Dennett, who spoke at “Animal Consciousness.” In his recent book From Bacteria to Bach and Back, Dennett calls consciousness an “illusion.” He comes close to suggesting that we are zombies, beings that appear conscious but actually lack an inner life.
The solipsism problem haunted other meetings I’ve attended over the past two years. At “Ethics of Artificial Intelligence,” computer scientist Kate Devlin considered whether sexbots, robots designed to have sex with humans, might be conscious and hence deserving of rights. At a workshop on integrated information theory two years ago, participants debated whether dark energy is conscious.
Last spring, NYU hosted “Is There Unconscious Perception?” Scholars argued over the implications of conditions such as blindsight, which is caused by brain damage. Subjectively, you feel blind, but if someone throws a ball at you, you will catch it. Blindsight proves that perception and other cognitive functions need not be accompanied by consciousness, according to philosopher Ned Block, an organizer of the meeting.
Block reiterated this point at “Animal Consciousness,” which he also helped organize. Other scholars disagree with Block’s interpretation of blindsight data, contending that people with blindsight might possess visual awareness even if they insist that they don’t. That strikes me as a very weird claim. But my point is that even if you restrict your discussion of consciousness to humans, you can’t escape the solipsism problem.
As long as we can’t solve the solipsism problem, we will favor theories of consciousness for subjective reasons. You are big-hearted, so you grant consciousness to bees, jellyfish, sexbots and dark energy. I am a snob, so I restrict consciousness to humans, primates and a few especially clever birds, like crows. Your Consciousness Club is capacious, mine vanishingly small. You may think your preference is wholly rational and objective, but it is based more on taste than truth. We cannot escape our subjectivity when we try to solve the problem of subjectivity.
John Horgan directs the Center for Science Writings in the College of Arts & Letters. This column is adapted from one originally published on his Scientific American blog “Cross-check.”