How do you decide what is right and what is wrong?
Morality is a code of conduct adopted by a society. But who decides what it is? In previous centuries people generally agreed on a common idea of morality, but times have changed. The hypocrisy of organized religion and the secularization of society have left us adrift and without a shared notion of right and wrong, and society suffers for it. Has the time come for us to reconsider a shared understanding of right and wrong based on timeless, universal and proven principles?
To establish a sense of where our society is heading (based on the prevailing worldview of the up-and-coming generation), sociologists Christian Smith and Lisa Pearce are leading a team of researchers in the ongoing National Study of Youth and Religion. The team has been tracking the religious and spiritual development of a group of young people since 2001, having followed their paths and interviewed them on three separate occasions between 2001 and 2008. The third wave of interviews, in 2008, focused on 230 18- to 23-year-old “emerging adults” who represented “every region, social class, race, ethnicity, religion, educational situation, and family background” in the United States. The results are published in Lost in Transition: The Dark Side of Emerging Adulthood. Interview questions were wide-ranging but included matters of morality, moral beliefs and moral reasoning.
In the book’s first chapter, titled “Morally Adrift,” the authors remark that they were struck by “how strongly individualistic most of them are when it comes to morality”; the general consensus among young people today “is not to judge anyone else on moral matters, since they are entitled to their own opinions, and not to let oneself be judged by anyone else.” Nearly half agreed that “morals are relative, there are not definite rights and wrongs for everybody.” To these young people, morality is “nothing more than subjective personal opinion or cultural consensus. . . . Morality is purely a social construction.”
“I mean for me I guess what makes something right is how I feel about it, but different people feel different ways, so I couldn’t speak on behalf of anyone else as to what’s right and what’s wrong.”
The researchers conclude that society in general is letting our young adults down in the area of moral training: “American emerging adults are a people deprived, a generation that has been failed, when it comes to moral formation.” The first chapter closes with a statement and a query that are well worth considering by society as a whole. These youths, the authors note, “need some better moral maps and better-equipped guides to show them the way around. The question is, do those maps and guides exist, and can they be put into use?”
Young people today aren’t the first to take an individualistic view of morality. They are following a long tradition that can be traced to the earliest records of human reason. From at least the time of Greek philosopher Socrates, history records the ideas of people who believed in an innate human ability to determine for themselves what is right.
Then came the Dark Ages—a period marked by the suppression of knowledge and the individual by the powerful Christian church. But by the 14th century, in direct response to such suppression, early Italian humanists including Petrarch and Giovanni Boccaccio sought freedom of intellectual inquiry to determine the course of life. History professor Steven Kreis, in his extensive online History Guide, points out that “the period from the 14th century to the 17th worked in favor of the general emancipation of the individual.” He cites Dante, Petrarch, Machiavelli and Montaigne as champions of “the virtues of intellectual freedom and individual expression,” adding that theirs was a time when “individualism and the instinct of curiosity were vigorously cultivated. Honest doubt began to replace unreasoning faith.” This “spirit of individualism,” writes Kreis, contributed to the Protestant Reformation, “which, in theory at least, embodied a thorough application of the principle of individualism in religion.” He summarizes, “It was during the humanist era that the freedom of individual expression and opposition to authority was first brought to the surface and became an integral part of the western intellectual tradition.”
This Renaissance, a period whose name implies the rebirth of classical intellectual ideals, spawned the Age of Reason, which developed further into the 18th-century Age of Enlightenment. The trail winds on down to our day under banner headings such as modernism and postmodernism. The underlying thread of humanism, however, runs through them all.
The American Humanist Association defines humanism as “a progressive philosophy of life that, without supernaturalism, affirms our ability and responsibility to lead ethical lives of personal fulfillment that aspire to the greater good of humanity.” It points out that “humanists ground values in human welfare shaped by human circumstances, interests, and concerns.” Inherent in the movement is a commitment “to treating each person as having inherent worth and dignity, and to making informed choices in a context of freedom consonant with responsibility.”
Although it wasn’t always so, humanists today are often secularists as well. The AHA’s slogan, for example, is “Good without a God.” In secular humanism, the emancipation of the individual brings with it the freedom for individual expression apart from the restrictions of religious morality. Secularism is broadly defined as the view that neither religion nor religious considerations have a place in society—whether in politics, economics, ethics or moral judgment. Even those who claim to be Christian humanists or religious humanists are essentially secular. Their aim, says the AHA, is human self-fulfillment but within a religious framework: “This more human-oriented faith is largely a product of the Renaissance and is a part of what made up Renaissance humanism.”
It would be no overstatement to say that humanism today is the pervasive worldview. So it is little wonder that the youth of today face moral uncertainty. They are growing into adulthood in a thoroughly secularized world. Not only are their cultural and educational environments secular, but in many cases their family’s religion and its associated practices are also quite secular. Young people are, in large part, provided with precious few moral guidelines on which to base decisions. Rather, they are encouraged to decide for themselves what is moral—what is right and what is wrong—and to use their own judgment based on little more than what feels right at the time.
To Each His Own
As one might expect, Christian Smith’s report has met with mixed reactions. Some see the decline in moral standards as boding ill for the future of our societies. Sociologist James Davison Hunter is among them. “Of good intentions there is no end,” he writes in The Death of Character. “The commitment to do well by our children is serious and unflagging. In the end, however, while we desperately want the flower of morality to bloom and multiply, we have, at the same time, pulled the plant up out from the soil that sustains it. We so urgently desire the cultivation of moral qualities, but under conditions (we insist upon) that finally render those qualities unattainable.”
“This, alas, is the bind we are in: we want the flower of moral seriousness to blossom, but we have pulled the plant up by its roots.”
Others see no problem with the development of moral individualism, in fact viewing it as laudable progress, something to be aspired to in a natural moral progression. They see the growth of secularism as a positive response to the perceived damage done by traditional external sources of morality.
As with humanism itself, this line of reasoning can be traced back to such people as Jeremy Bentham (1748–1832) and the moral philosophy known as utilitarianism. Political philosopher Michael J. Sandel of Harvard summarizes utilitarianism: “Its main idea is simply stated and intuitively appealing: The highest principle of morality is to maximize happiness, the overall balance of pleasure over pain.”
After Bentham came John Stuart Mill (1806–1873), who took Bentham’s work a little further. He wrote in his 1859 treatise On Liberty: “The only part of the conduct of anyone for which he is amenable [or accountable] to society is that which concerns others. In the part which merely concerns himself, his independence is, of right, absolute. Over himself, over his own body and mind, the individual is sovereign.” This sounds remarkably similar to positions held by many of the emerging adults in Smith’s study on youth and religion.
Morality has been a moving target for a very long time. Where we are today has therefore been many years in the making. Smith points out the obvious: emerging adults “are simply mirroring back to the older adult world, to mainstream society and culture, what has been modeled for them and what they have been taught.” He describes them as “good learners” who are now “eager to enjoy the benefits of their material abundance and consumer choice,” and he lays the real problem at the feet of “mainstream American culture and institutions.” As mainstream America increasingly embraces secularism, it is moving away from the anchor points that have traditionally provided some moral basis to life. Secularism embraces philosophers like Bentham and Mill and provides a continuing engine to drive moral individualism.
A Moral Base
Why is absolute morality now so widely rejected? First, because there is no agreement as to what the absolutes are. Second, because we do not want any authority telling us how we should live. It is admittedly a tough sell when many institutions that proclaim absolute morality can be shown to be morally bankrupt themselves. How does an institution declare absolute standards of morality when those entrusted with authority are publicly exposed as violators of the very standards they proclaim? As Smith remarks, “the idea of an ‘absolute morality’ is fraught with ambiguity and so is difficult to handle. This, we think, trips up many emerging adults and sends them sprawling toward relativism.”
So rather than simply declare that we should live by an absolute standard of morality, why not undertake a thorough examination to establish whether the standards are appropriate? This will surely mean looking for something outside the self, since current standards, as set from within, are clearly not producing harmony.
Sandel offers this thought for consideration: “A more robust public engagement with our moral disagreements could provide a stronger, not a weaker, basis for mutual respect. Rather than avoid the moral and religious convictions that our fellow citizens bring to public life, we should attend to them more directly—sometimes by challenging and contesting them, sometimes by listening to and learning from them. . . . A politics of moral engagement is not only a more inspiring ideal than a politics of avoidance. It is also a more promising basis for a just society.”
Any public discourse should take into account the existing moral condition of our society; current widespread convictions—religious or otherwise—should be challenged and contested in terms of whether people are living peaceful, happy and fulfilling lives under moral individualism. Intensive research such as that supplied by Smith and his coworkers needs to be carefully assessed. If moral individualism is moral progress, why are families and marriages failing at unprecedented rates? Why are so many young people wandering aimlessly through life and suffering depression? Why is widespread crime an ongoing problem?
While we’re asking questions, why not inject into this equation an open-minded look at principles that have been given to all humankind—principles that were given for our good? The Creator of this universe and earth is absolute, as is His law, which is summarized in two simple concepts: “You shall love the Lord your God with all your heart, with all your soul, and with all your mind. . . . [And] you shall love your neighbor as yourself.” All other biblical laws governing human behavior stem from those two ideals.
In spite of how people have tried to portray God through the development of man-made religions and concepts, He remains who He is: the revealer of truth and of the way we can live to fulfill our human potential. We are told that “the counsel of the Lord stands forever” (Psalm 33:11). This means that God has an overall plan for humanity that is sure. Within that plan He has provided a moral basis for human life so that we can be happy. If we are not, then maybe we are not seeing or understanding the moral foundation God has clearly set forth and revealed in the pages of the Bible. King Solomon of ancient Israel, reputed to be the wisest man who had ever lived, proclaimed, “Where there is no revelation, the people cast off restraint; but happy is he who keeps the law”—that is, the perfect law of God (Proverbs 29:18).
In response to the question posed by Smith (Do maps and guides exist?), the answer is yes. A clear sense of right and wrong is available to us if we have the courage to put aside the accumulated wrong-headed teaching of humanism and secularism and open our minds to the Word of God.