Reengineering Our World: A Cautionary Tale
Do we really want a man-made future? In this synthetic age, a philosopher argues for keeping nature natural.
In thinking about how we change the world, environmental philosopher Christopher Preston recalls a scene from the 1967 movie The Graduate. Pulling young Ben Braddock (Dustin Hoffman) aside, Mr. McGuire (Walter Brooke) advises, “I just want to say one word to you. . . . Are you listening? Plastics—there’s a great future in plastics!”
“Today,” suggests Preston, “if Ben were getting such advice, he would hear a much grander promise of an even more startling synthetic future. Humans are no longer just surrounding ourselves with new materials. Our species also is gaining the ability to engineer a number of key planetary processes.”
In his new book, The Synthetic Age: Outdesigning Evolution, Resurrecting Species, and Reengineering Our World (2018), Preston argues that we are not merely living in a world of plastics but that our technologies have the potential to completely refashion the world: “Big choices about the expression of our powers are upon us.” Rather than the Anthropocene, he calls our new age of malleability the Plastocene (or Synthetic Age).
“By deliberately tinkering with some of the planet’s most basic physical and biological operations, humans stand on the verge of turning a world that is found into a world that is made.”
According to Preston, with advancements such as nanotechnology, genomics and the geoengineering of climate, technology threatens to rework the earth from the ground up, so to speak, not only changing how it looks but also how it works: “Rather than impacting nature thoughtlessly and accidentally, a ‘full-throttle’ Plastocene means that we would shape it confidently, deliberately, and sometimes ruthlessly, all according to the best abilities of our technical experts. Nothing would be off-limits.”
Should there be limits? And who decides? “Those who are very enthusiastic about what technology offers will read The Synthetic Age and not even realize there’s a problem,” Preston says. “They wouldn’t understand that there is any reason for caution at all. But that view is not universal. There are plenty of people who find some sort of majesty in the world that is outside of our hands and who want to see that world continue in some fashion. It’s a difficult tension.”
Preston is professor of philosophy at the University of Montana–Missoula and a research fellow in its Mansfield Center Ethics and Public Affairs Program. He spoke to Vision science editor Dan Cloer about that tension between what we could do and what we should do.
DC I looked at your course syllabi and wondered if any of Michael Crichton’s novels were part of your supplemental reading. He certainly played out some horrible scenarios that new technologies could bring about through the altering of the natural world.
CP No, it’s not really necessary. A lot of my students are already somewhat skeptical of the technology, and they’re here in Montana because they love the natural world and they want to see it preserved. So the kind of Crichton warnings that come from a book like Prey have already been taken on board by a lot of my students, who tend to be more pro-Thoreau, Muir, and an old style of wilderness preservation. But I do try to inject a sense of caution into my classes.
DC In the first part of The Synthetic Age, you write about your moment of insight at the dock after a halibut fishing expedition in Alaska, when you saw a bear eating a salmon at the shore. I thought this was going to be a story about our complex ways of catching a fish in contrast to nature's simplicity, but it wasn't.
CP No, this story was to illustrate the moment at which these issues become personal and begin to motivate you. Recognizing that your halibut—in this spectacular landscape, full of wildlife, and as pristine as it seemed—might not actually be safe to eat in large quantities. This was an enormous realization for me: the whole world has now been impacted by our activities, even in that place. I wanted to say that, for everybody, there's a moment where this becomes personal. Well, hopefully.
DC You call this time we’re in today a “thinking space” or “transitional period”; you seem to suggest that we’re not only building technologies but also in the process of synthesizing a sense of moral values. Is this about creating a deeper sense of who we are and what we’re doing in the world?
CP Everyone is throwing around this word Anthropocene—a human age, this new epoch that we’re supposedly in. But epochs last for tens of thousands of years, so I think it’s wrong to believe we’ve already embarked on this human age and to treat this as a done deal. The Anthropocene idea provides a moment to realize that our effects on the planet are global; there is no escaping that. So what responsibilities come with that reality? Does it mean that we now need to take over the management of the earth in an aggressive fashion? Or does it mean to back away?
“In this transitional Anthropocene moment, we have to make decisions about how much control we plan to exert on our world.”
In terms of the synthesis of new values, I do think a new kind of environmental ethic or environmental philosophy will emerge over the next few decades. The idea of “nature out there, untouched”—that we just need to revere it and leave it alone—is going to need some revision and reworking as we move through this transitional moment.
DC You write quite a bit about the shift from untouched to touched, and that “pristine nature” is viewed by some as gone already.
CP Here at the University of Montana we have big forestry and wildlife biology schools. The ideas that pristine nature is over and that hands-off management is now impossible are all the rage in these schools. A lot of the forestry students also like the old style of preservation, but there's this whole way of thinking that says "nature" has ended, and there's no naturalness or wildness anymore, so let's get on with a different type of landscape management.
I think one can overdo this idea. I’m concerned about flipping too far to the hands-on-management side of things.
DC If, as you say, we’ve become disconnected from the “orienting ethical hinge” of nature, what principle or philosophy will become our new basis for decision making?
CP When I refer to that “hinge,” what I mean is the world outside of us, the world outside our control; it’s the world we’re born into and live alongside and are in a constant negotiation with. That world is very valuable, and where possible, we need to maintain it.
The idea of a synthetic age is not “game over, so just embrace it.” The idea I’m conveying in the book is that, if we’re not careful, we will be living in an entirely synthetic age. I’m actually in favor of maintaining the world outside of us as that orienting ethical hinge. Otherwise it will come down to “might makes right,” or “money talks.”
DC How does this negotiation provide direction?
CP We encounter a world that's different from us and independent of us. It's analogous to how we move about in the social world: We encounter persons who are independent from us, never entirely under our control or subject to our designs. We learn how to negotiate a relationship with them outside of ourselves. I see this as similar to how we should care for the broader nonhuman world around us. We should treat it as independent, and that generates some respect for that world.
“Part of the point of the book is that we’re up against it now because these technologies are arriving. Not in all cases, but in some cases it’s commercial interests at work.”
DC So in that relationship with nature we also learn about ourselves, just as we do from our interactions with other people?
CP Yes, that’s correct. If we’re not careful, we’re going to lose that independent world. The book is supposed to sound a note of caution, and maybe I should have been a little louder about that.
DC I would agree. Your book gives a very balanced view and evaluation of these disruptive technologies but never gives a clear message about what we should now do.
CP When I first started on it, I wanted to write a short handbook, an introduction to some of these really unprecedented technologies. I thought that a short introduction did not need to have an agenda: I don’t need to come out as either for or against; I’m just going to lay these things out as I see them. But it grew and became more than an introduction.
Later in the process my editor suggested adding more opinion and giving the reader a better sense of me and what I’m for and against. That could be more interesting, because some readers would find a clear ally and others a clear enemy. But these technologies don’t really fit that pro-or-con dichotomy.
DC In your paper “Rethinking the Unthinkable” (2011) about geoengineering climate, you said, “Even though there is a grave risk of moral corruption when advocating geoengineering, it remains theoretically possible that it might, under the right circumstances, be the lesser of two evils.” Now in The Synthetic Age you warn that if we actually started to control climate there is a risk that we would never be able to give it up.
CP In The Synthetic Age I try to show some of the good and bad possibilities, and that we should keep all of these in mind.
But the paper you quote makes it clear that there’s a presumptive argument against geoengineering. The gut feeling that got me writing that paper was “This is bad stuff. It’s really dramatic and an unprecedented move.” So I tried to articulate the argument that we shouldn’t be interfering with this system. The cautious argument was supposed to come through as strongly as the argument that says “under some circumstances, this could be the lesser of two evils.” Advocates of climate engineering focus on the “lesser of two evils” part.
Sure, if the world were about to catch fire and you could cool it down enough by spraying stratospheric aerosols (the solar radiation management strategy), it's theoretically possible that could be the lesser of two evils. You shouldn't reject something out of hand or take a radical stand on one side or the other.
I know some of those who are researching geoengineering, and they're good people with good moral arguments about people in the world who are suffering and are least able to deal with climate change. I respect these researchers and their work, so I can't just dismiss it and say that there's no possible model or argument for doing this. Under the right circumstances there is a right moral argument for doing it. But I think the presumption should be that it's the wrong type of management for a planet. We shouldn't really be getting into that kind of game.
“There are high risks involved, as well as issues of human character, who we are as a species, and what we should be doing.”
DC Rather than Anthropocene, you’ve used the term Plastocene to represent our molding of the earth. Some argue that we’re never going to get things right because we’re built for short-term interests rather than long-term consequences. Do you think it’s possible for us to collectively think through what we’re doing and actually make democratic decisions concerning these technologies?
CP Yes, this is a challenge, and what I wrote in the book is kind of wishful thinking. But we need to talk about it. What’s a philosopher going to do other than put questions out there and point to issues that might be otherwise hidden, and try to get people motivated to talk about them? Obviously no one person is going to put the brakes on a big technological movement. But individuals can contribute to generating a discussion, which could create some momentum to see that the technology proceeds in a more ethically desirable way.
Let me expand on the geoengineering example. We had a PhD student whom, with some funding from the National Science Foundation, we sent to the Arctic, the Pacific Islands and sub-Saharan Africa. He interviewed people who were already experiencing climate change and asked them what they thought of geoengineering. He reported that a lot of them are reluctantly accepting that we might need to look at this. The damages are already occurring, and they want a way to escape them. They would rather there were another way; and importantly, they did not want to have their climate shaped by rich countries making decisions on their behalf, on the basis of what those countries thought was good for them. This, they said, smacks too much of the colonial history they had experienced for hundreds of years.
The lesson is that climate engineering might reduce temperatures around the world, and this might be a good thing for certain people. But, even if it is, people don’t want to be subjected to economic and technological forces (through climate engineering) that are historically familiar to them and that have led to a great deal of harm, oppression and lack of self-determination.
If climate engineering goes ahead, it must take into account the cultural context of the people the technology would affect. There are going to be ways to do it better and ways to do it worse. If people are going to talk about this technology, then ethicists and others need to remind them of these sorts of problems.
DC You mentioned human character. I would say that it’s our character to move toward more intervention and manipulation rather than allowing things to remain as they are. So as science reveals more about the world, and as we invent new technologies based on that knowledge, won’t the general direction be toward more change and interventions?
CP Of course, and many of these technologies are positive. I wrote in the book that you could call us Homo faber; building is part of who we are. We use our opposable thumbs to build and fix and manufacture solutions to our problems. That’s exciting and rewarding. But I also see people like Henry David Thoreau, who said, “A man is rich in proportion to the number of things which he can afford to let alone.” The idea of restraint and doing less rather than more, to sometimes stand back, is a key part of what environmentalism has been about over the last 40 years or so. We can’t just keep influencing and consuming this world. We must leave some portions of it alone.
I like technology, but I see dangers that sometimes get hidden. A lot of these technologies have wonderful things and horrible things about them. I saw it as my part to show some of the wonderful things and perhaps warn against some of the horrible things. The part of the book where this warning is clearest is the epilogue, where there are a few pages about wildness always inhabiting the world.
DC Yes, at the end of the book is another bear story and another kind of revelation: the horrific death of a hiker killed by a grizzly in Yellowstone is a wake-up call to that uncontrollable aspect of nature.
CP I wanted this message of wildness to resonate with people. Obviously there is wildness in wild animals. But there is also wildness in genomes. There’s wildness in climate, in the machines we build, and in the societies and cultures in which we live. There’s wildness in politics as evidenced in the very fractured, tribal politics that some industrial democracies are experiencing at the moment.
So there’s wildness across the spectrum. I thought the story about the person getting killed was a very visceral illustration of that wildness. I wanted to make it clear that even in what people are calling the Anthropocene, there is still a world out there that is other. It’s different and independent from us.
“That bear was not going to be deterred from eating that person because we claim this is the human age.”
DC In 2009 you wrote a biography of environmental ethicist Holmes Rolston III. A very important aspect of his work is the meshing of a scientific understanding of the world with the idea of a Creator’s influence in building it. Perceiving the world as created rather than accidental can also be orienting to our decisions. Is there a reason that you didn’t reference any of this material in this new book?
CP One of the motifs of the book is that this independent world has been cooking along for millions of years, and out of this, moral value has emerged. It’s a value that we should cherish. This is very much Holmes Rolston’s position. People who read Rolston and then read The Synthetic Age will detect that there is some Rolston embedded in here.
As you say, Rolston inserts some theology into this long and winding story of earth history. But I was a student of his in Colorado, and when he teaches it, he doesn’t include the theology. He believes that you can make this argument for moral value theological if you want to, but that the argument can stand without it. You can just admire the length of the history, the chance events, the creativity and beauty that have emerged.
In The Synthetic Age, that Rolston view of admiration is reflected in my view of what’s at stake and what we might lose. I didn’t mention him, because I was trying to make the book appeal to those who aren’t versed in theology or environmental philosophy. I tried to make it, I guess you would say, as secular a book as it could be, although you can definitely read the types of values I talk about as also being theological.
DC What would you want new parents to know about the world their child is inheriting?
CP This is a world of great beauty, and it will potentially continue to contain great beauty, majesty, surprise, joy and illumination. All the pieces are there and will continue. It’s a world that can give their child an enormous amount of pleasure throughout life.
But if they fall asleep and let all the shots be called by other people, then that world might fade and dim. If climate is controlled by engineers in Miami, if genes in wild organisms are shaped by gene drives and genetic engineering, if ecosystems are put together by wildlife managers making human choices about which species need to be where, then that magical world will slowly get subsumed under a human world, designed and managed by profit-related interests. That would be a terrible tragedy, I think.
I would say to stay engaged and to really see the value of the world independent of us—and to do what they can to make sure that world endures.