Delusion
One of the most difficult things for our brains to sort out is the optical illusion. Is it shimmering water or heated air? Do you see the white sphere? Perhaps, but it doesn’t really exist! Our brains fool us. It’s the same with the monumental risks facing humankind. We want to believe that science or technology will somehow save us, but will they?
In this series, we’re discussing existential risks, or X-risks—global threats that could ultimately cause the extinction of all humanity or, at the very least, its potential for further development.
Today we’ll look at two more of these threats, both of which concern our vulnerability to being fooled: dangerous new technologies and self-delusion.
Peter Townsend’s expertise is engineering physics. He felt compelled to write The Dark Side of Technology: “I think I have a message to get across, which is that technology is fantastic, it’s marvelous. Without it the world would be a different place. But we’re so excited about the technology that we rarely sit back and think, Are there any side problems that we should look at or address? And I realize there are many across the whole range of technologies, from medicine to chemistry, to physics, to electronics. And in some of them, they are so serious that they could actually destroy us.”
One of the strengths—and weaknesses—of modern society is our use of satellites for communicating data of all kinds. Yet these powerful tools are extremely susceptible to damage. Of the approximately 1,500 satellites in orbit, about four break up each year, and roughly 500,000 fragments now circle the earth.
Peter Townsend: “I’ve worked out that if you have something like a cell phone (fragment), traveling at orbital speed, the kinetic energy on impact is roughly 500 times greater than a modern tank shell. When you look at the speed they’re traveling at, an impact is going to then cause far more fragments, and this effectively could lead to a runaway destruction of satellites up there. If you take out satellites—in terms of communication, banking, everything—really big problem. And that is likely to happen over the next 20 or 30 years unless we could find some way of removing fragments from that zone of space where they have to sit. But we have no idea how to address it. We don’t know how to clean it up.”
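To put that claim in perspective, here is a simple order-of-magnitude sketch of the physics; the fragment mass and collision speed below are illustrative assumptions, not Townsend’s own figures.

\[ E_k = \tfrac{1}{2}mv^{2} \approx \tfrac{1}{2} \times 0.15\ \text{kg} \times (10{,}000\ \text{m/s})^{2} \approx 7.5\ \text{MJ} \]

Even a pocket-sized fragment carries several megajoules of kinetic energy at typical orbital collision speeds, which is why each impact tends to multiply the debris rather than merely damage a single satellite.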
So some of the very technologies we trust in could also be our downfall.
Martin Rees of Cambridge University is cofounder of the Centre for the Study of Existential Risk. We talked about the need for such a think tank: “The reason for doing this, really, is that there’s a huge amount of effort going into discussing small risks like carcinogens in food, low radiation doses, train crashes, air crashes and things like that, whereas people are almost in denial about these low-probability but very high-consequence threats which stem from new technology. But I would certainly say that we have at least a 50 percent risk of experiencing a very severe setback to our civilization between now and the end of the 21st century. And this will be due to the collective pressures of 9 or 10 billion people on the planet—or, I think more likely, disruption stemming from the misuse of these ever more powerful technologies by small groups—by error or by design.”
But we’ve not even scratched the surface of this discussion. For example, despite decades in which we have managed to avoid using atomic and hydrogen weapons, the nuclear sword of Damocles still hangs over us. It’s well documented that nuclear material goes missing every few weeks. Imagine the potential consequences if those materials fell into the hands of a terrorist or a rogue nation.
Another X-risk is self-delusion. Australian science writer Julian Cribb lists it as an overarching existential threat. By self-delusion he means the danger that comes from trusting in a false belief. Optical illusions provide a good example of how easily our own brains can misinform us.
In this well-known image, referred to as Kanizsa’s Triangle, spatially separated fragments convince our eyes that a bright white triangle sits in the foreground. Yet no such object exists. It’s a harmless visual illusion that causes no problems in real life.
But at Apex Mountain in British Columbia another kind of optical illusion has proven fatal. The mountain has been the site of at least a dozen plane crashes in recent years. Part of the problem is an illusion that causes pilots to believe that the terrain is not as high as it actually is. By the time they realize the problem, it’s too late to climb or turn, and they crash.
Our brains can fool us by giving us faulty information. Cribb suggests that humanity has a similar view of the existential dangers we face. We can easily be in denial—a form of self-delusion. We think we can keep taking the earth’s resources without replacing them and still somehow survive. We choose to believe that technology will surely save us, or that money or political ideology is the answer. We rationalize our way out. All of this is delusional thinking in the face of accelerating global risks.
Peter Townsend: “People just don’t feel the need to take action. People who deal with major disaster scenarios . . . say that if you have a big threat—fear of a tsunami, for example, coming into some region—typically half the people will say, ‘It’s never going to happen to us.’ Even though the evidence is there, you’ve got the warning, the people won’t move and get out of the way.”
Martin Rees: “We are in denial about things that ought to concern us, especially these risks which could be so catastrophic that one occurrence is too many.”
It’s clear that we can’t afford to turn a blind eye to the reality of what’s happening in our world. The Roman empire of 2,000 years ago lived through a period of extended peace, the Pax Romana, created by the emperor Augustus. Some people deluded themselves that it would always be so. But as the apostle Paul wrote at the time, “When they say, ‘Peace and safety!’ then sudden destruction comes upon them” (1 Thessalonians 5:3). And the empire did fall.
Is our world any different? Technologically, yes—but more vulnerable as a result. Julian Cribb says it this way: “It is now, not in a generation’s time, that the decision to survive or fail must be taken.”