My absence was longer than planned. I won’t bore you with the details, but other projects needed my attention over the summer. (I wish I had simply been taking a break, but part of being a writer, I am discovering, is that the moment to stop will always be beyond the crest of the next hill). There are benefits to pausing something, even unintentionally. Coming back to The Pathos of Things, I feel a renewed excitement about the purpose and possibilities of the project. Let’s hope something good comes of it.
I did also find time to publish a few articles while I was away. Among them was an essay based on a new book by the anthropologist Webb Keane, Animals, Robots, Gods. This is a short, stimulating exploration of human morality in numerous cultures around the world, aiming to remind Western readers that, even in a time of globalisation, their assumptions are far from universal. In particular, it provides a fresh take on the role of technology in our lives. For Keane, machines and algorithms are just one example of the complex, intimate relationships that can occur “at or beyond the edge of the human moral world.” Others involve animals, deceased or unconscious people, and various kinds of deities.
This seems like a good opportunity to plant a flag, marking a subject I’ll be returning to in future. It relates to Keane’s reflections on cyborgs, or entities that are part-human, part-machine; “a hybrid of living being and technological device.” This concept has a science-fiction flavour to it, but actually becomes more unsettling when we consider the many kinds of hybrid we are already familiar with. What about users of eyeglasses, hearing aids and pacemakers? What about ventilators and other life-support equipment we find in hospitals?
We can normalise cyborgs still further, as Keane does when he discusses our use of tools:
a driver is a person-with-a-car, an artist a person-with-a-paintbrush, a soldier a person-with-a-gun and a surgeon a person-with-a-scalpel. Their tools make someone a driver, artist, soldier, surgeon or writer. You cannot be those people without your tools.
We might imagine that such instruments don’t count because we can simply put them down. In fact, they are more a part of us than physiological supports like hearing aids. For in each of these examples – driver, artist, soldier, surgeon, writer – skilful use of the tool requires a great deal of intellectual and emotional programming. A car makes a driver, and biomedical instruments a surgeon, because such artefacts rely on knowledge absorbed and embodied by their user. An especially clear case comes from reading and writing, which require not just pens and keyboards, but the technology of language itself.
In the modern world, of course, the growing power of technology has changed the ways that devices are integrated into our lives. On the one hand, as machines continue to displace human skill, drivers, artists, soldiers and even surgeons are less defined by their mastery of particular tools. At the same time, we are all becoming dependent on computing technology for everything from navigation and communication to memory and, to a significant degree, thought itself. The question, then, is how these newer forms of technological enhancement differ from the older ones. If I use algorithms to monitor my sleep, plan my diet, and curate my selection of news, music, and romantic partners, am I doing something meaningfully different from using a car to drive, or an alphabet to read and write?
There are some who think we should treat all technologies as a single continuum. In my original essay on Animals, Robots, Gods, I contrasted Keane’s ideas with those of Ray Kurzweil, principal researcher and “AI Visionary” at Google. (Yes, that is a job title, at least according to his most recent book, The Singularity Is Nearer). For around thirty years, Kurzweil has been eagerly anticipating the moment when humanity merges with superintelligent AI, by uploading our minds to digital systems and inserting microscopic robots into our bloodstreams. He presents this as a logical continuation of our history as a species, whereby “For thousands of years, humans have gradually been gaining greater control over who we can become.”
In this view, attempting to separate the human and non-human elements of the cyborg is pointless. The use of technology to modify and enhance our natural capacities is what makes us human. This strikes me as a highly misleading argument, not to mention a dangerous one, although it certainly stands to benefit companies like Google. It easily collapses into a kind of fatalism, encouraging us to embrace every new technology that is foisted upon us as part of an inevitable evolution. That would be the opposite of (to use Kurzweil’s phrase again) “gaining greater control over who we can become.”
One can just as easily say that our humanity lies in our ability to make distinctions. We can assess different tools and techniques and decide for ourselves what role, if any, they should play in our lives. The fact that some tools facilitate competence and skill, while others diminish them, is just one important consideration. Likewise, even if the boundaries between human and inhuman can be blurry and contingent, we may decide they are still worth drawing. As the possibilities of technology continue to grow and change, it will only become more important that we reject claims of inevitability, and learn to differentiate on the basis of what matters to us.
Great to see Pathos going again. Tool making/use is definitely part of our obligate sentience (as organisms, we can no longer survive without it). Thoreau bemoaned men becoming the tools of their tools, but I like your proposed framework a little better—a blurred line between tool and maker/user doesn’t mean extinction/erasure of the user (nor does it tell us anything unavoidable so far about our future).
Great essay!
I've thought about this too — we've been evolving as tool-users since at least the days of the aptly named Homo habilis. Tools are so ingrained in what it means to be human that we've evolved alongside them. The invention of fire, for instance, probably altered, at the least, our digestive system and sleep cycles. In that sense, openness to innovation is good and normal, but the level of innovation we currently have at our fingertips presents a unique picture. Historically, innovation has been relatively slow — in the case of fire, we've had a few hundred thousand years to evolve to fit its role in our lives, both biologically and culturally.
In the past thousand years or so, the development of science and access to global materials have produced a rate of innovation so rapid that we're constantly introduced to new inventions before we have time to adapt to old ones. This impressive rate is evidenced by historically unprecedented statistics such as low infant mortality and world population.
But the greater quantity of innovation requires us to sharpen our ability to identify value. Unless you deontologically reject most innovation Amish-style, far more discernment and evaluation of tools is necessary to identify which innovations are valuable to human flourishing, which are futile, and which are counter-productive.