
Eastern cottontail, copyright 2017, Chris Madson, all rights reserved
SOMEWHERE IN THE BACKGROUND OF THE ONGOING DEBATES OVER THE RELATIONSHIP between humans and artificial intelligence, there’s a question that never seems to die: If a machine should achieve consciousness, will it merit some sort of special ethical consideration?
The underlying issue has been a source of consternation for well over a century as the argument over animal rights has ebbed and flowed. Are other living things conscious, and, if they are, do those beings deserve special moral standing?
This was an easier question to answer when we were content to extend the bright line just far enough to include chimps and possibly gorillas, but, as our observation of other species has broadened and deepened, that line has moved, blurred, and, finally, disappeared. A growing body of evidence suggests that even plants are aware of their surroundings, communicate with each other, defend themselves against damage, and anticipate threats to their wellbeing.
If, by “consciousness,” we mean the ephemeral condition of “mind” we possess ourselves, then it’s clear no other organism can possibly be conscious. But, if we take the behavior of other species as evidence of a level of sentience and awareness, even foresight, then the definition seems broad enough to include most, if not all, other living things. In fact, it raises the possibility that a machine could someday achieve something that looks and sounds like consciousness.
Alan Turing was the first computer scientist to seriously consider the possibility that a machine could be conscious. His yardstick, later dubbed the Turing Test, didn’t offer a definition of what consciousness might be; it simply allowed for the possibility that, whatever it was, a machine might eventually offer a plausible imitation. In recent years, other engineers have offered revisions of Turing’s approach, never offering a workable definition of consciousness but confidently assuring us they would soon build machines that had it.
Meanwhile, generations of scientists and philosophers have continued the human struggle to understand consciousness . . . with little success. We haven’t been able to grasp its essential nature; we haven’t been able to find its seat in the human anatomy, still less in the structures of many other organisms that show unmistakable signs of consciousness; we’re not even certain how to recognize its outward manifestations. Michael Pollan’s recent book, A World Appears, bears witness to our perennial confusion about consciousness. Like quality in art, we don’t know what it is, but we know it when we see it— or, rather, feel it.
I find it unlikely that we can create what we can’t even define. It’s far more likely that we will find a way to fool ourselves into believing a collection of circuit boards has somehow developed a “mind,” even a “soul.”
Our ever-expanding catalog of consciousness in other organisms demonstrates that many other living things, structurally simple and complex, have developed some version of it that allows them to collect information about their environment, assess that information, and act in a way that enhances their chances of survival. If that’s true, then two observations emerge: first, that consciousness as we see it today is the product of at least four billion years of rigorous selection and specialization; second, that consciousness is probably a universal characteristic of life on earth.
A machine can never join that planetary fraternity. No matter how immense its processing speed and complexity, no matter how universal its grasp of data, it cannot live. If and when it ceases to function— when it “dies”— its constituent parts won’t feed into the organic processes that drive the world’s natural cycles. It does not share an organic heritage or contribute its genetic material to a future shaped by organic evolution. It is, and always will be, a machine— its structure, its analytical capabilities, its essence based on the fallible ideas of a handful of engineers with no grasp of the nature of consciousness.
We owe an ethical debt to the rest of life on earth, partly because we depend on the biosphere in its entirety and partly because there is reason to believe that all the children of earth bear some flavor of that ephemeral essence we call consciousness. We owe nothing to our machines. They are our tools, not our brothers.