Capitalism developed slowly until Adam Smith placed his revolutionary ideas before the world in 1776 in his magnum opus, The Wealth of Nations. Smith’s timing could not have been better, with the American colonies declaring their independence from Great Britain that same year. Democracy and free-market capitalism began their inexorable ascent, as did the politics of individualism.
Within a few decades modern-day capitalism exploded onto the scene as the Industrial Revolution took hold in Great Britain in the first half of the 1800s. The pain and suffering associated with this new, raw, unbridled capitalism provided a platform for the radical views of Karl Marx, who predicted that a classless communist society would rise out of capitalism’s self-inflicted destruction.
Capitalism adapted, however, and found a new champion in the United States, bolstered by the immigration of millions seeking a better life in her free-wheeling economy. The unfettered capitalism of the robber barons, monopolists and financiers roared through the 1920s, as men like John D. Rockefeller, Pierre DuPont, J.P. Morgan, Henry Ford and Andrew Carnegie became household names. Capitalism seemed to have all the answers. Herbert Hoover boldly pronounced during his 1928 presidential campaign, “We shall soon with the help of God be in sight of the day when poverty will be banished from this Nation.”
Hoover’s optimism notwithstanding, the Great Depression set in on October 29, 1929—known since as Black Tuesday—when the U.S. stock market collapsed. The market lost $40 billion of its value (all the gains of the previous two years) in just two months. The Depression quickly spread to most of the world’s industrialized nations and played a major role in world events. Germany, already reeling from the hyperinflation caused in part by its staggering World War I reparations debt of 132 billion marks, fell further, opening the way for Adolf Hitler and the Nazi party. By 1933, U.S. national income had declined by more than 50 percent, and 14 million workers, or one quarter of the nation’s workforce, were unemployed.
A little over a century earlier, Thomas Robert Malthus had predicted the possibility of a “general glut” in the economy, a concept then ridiculed by his colleagues. A glut, or depression, was viewed as impossible, because capitalism was thought to be a naturally self-correcting system. In the 1930s, economists had no answer to a depression and could only sit and wring their hands over this seemingly incurable malaise. Sustained economic decline and high levels of unemployment did not fit classical economic models. Was Marx right after all? Had capitalism contained the seeds of its own destruction all along?
Back From the Brink
Capitalism, at this critical juncture, seemed to act out American writer Mark Twain’s famous statement, “The report of my death was an exaggeration.” Although flat on its back, it was nowhere near death. But it would require a wealthy, eccentric British economist to administer the cure. John Maynard Keynes was already famous in his own right, but the brilliant mathematician and economist revolutionized economic thought with the publication of The General Theory of Employment, Interest and Money (1936). Keynes would fundamentally alter the way capitalistic nations viewed and managed their economies. The basic theme of his work was that capitalism had no automatic safety mechanism. The economy, like an elevator, could go to the bottom and stay there.
According to the classical economic model, the business cycle automatically self-corrects: in a slump, labor costs and interest rates decline, encouraging business investments and improving economic prospects. Keynes, however, observed that the economy is defined by the incomes we all earn and by the flow of income from hand to hand. It is this flow that constantly revitalizes the economy, so in the event of a severe disruption in the balance between consumption and business investment, trouble inevitably follows. The boom-bust cycle is a natural consequence of economic freedom, observed Keynes, but the sustained slump is a different matter altogether. In a depression, the pool of savings that can be tapped for business investment dries up. Not only are people not saving; they are eating into what savings they have in order to survive.
Enter Keynesian economics. Keynes argued that when business cannot or will not invest, the government must step in and fill the void with public investment. Keynes intended this intervention to be temporary, a sort of “pump priming”; but in Western economies, the concept of the managed economy soon became entrenched in public policy and evolved into socialism and deficit spending. Even today, many governments are grappling with the difficulties of unwinding entrenched social programs that have hampered their economic growth.
“The decadent international but individualistic capitalism, in the hands of which we found ourselves after the war, is not a success. It is not intelligent, it is not beautiful, it is not just, it is not virtuous—and it doesn’t deliver the goods.”

—John Maynard Keynes, “National Self-Sufficiency” (1933)
The Depression came to an end with World War II, and the United States assumed a dominant role in international capitalism—one that continues to this day. From 1943 to ’44 the United States and its allies held a series of conferences to create a system for the governing of trade and international payments; to establish the gold exchange standard; and to lay the groundwork for the guardians of the international flow of money, the World Bank and the International Monetary Fund. Robert Heilbroner, in his book The Worldly Philosophers, recounts that after Keynes’s return to England from the final conference in Bretton Woods, New Hampshire, a news reporter asked whether England was now the 49th state, to which Keynes sardonically quipped, “No such luck.”
The Poor Are Always With You
At the same time, a so-called “third” world (because it belonged neither to the prospering capitalistic bloc of nations nor to the communist bloc) was being born through the decolonization process. As the world’s dominant capitalistic economies increased their control over economic production and trade, the Third World served mainly as a source of raw materials and cheap labor.
With the retreat of the colonial powers came increased violence and civil war, particularly in Africa. War within nations has continued to be a scourge. According to The Economist (May 24, 2003), “almost all wars are now civil wars. Many of the causes are economic.” The special report examined a World Bank study and concluded that “the best predictors of civil war are low average incomes, low growth, and a high dependence on exports of primary products such as oil or diamonds.” The article also noted that “the most striking common factor among war-prone countries is their poverty. Rich countries almost never suffer civil war, and middle-income countries rarely. But the poorest one-sixth of humanity endures four-fifths of the world’s civil wars.”
“In each epoch capitalism has been both a factor for unification, even standardization, and a factor for accentuating differences, disparities, and inequalities.”

—Michel Beaud, A History of Capitalism
The unequal distribution of wealth and income is a disquieting characteristic of global capitalism. French historian Michel Beaud believes that current globalization is polarized, unequal and asymmetric, and that it has resulted in deepening “inequality between the well-off, rich, and extremely rich, compared to the poor and very poor.” Thomas L. Friedman, author of The Lexus and the Olive Tree, observes that “the gap between first place and second place grows larger, and the gap between first place and last place becomes staggering.” As a point of reference, Forbes magazine (October 13, 1997) reported that the United States had 170 billionaires in 1997, compared to 13 in 1982. Unsettling similarities can also be drawn with the prosperity preceding the depression of the 1930s, when, according to Heilbroner, income was disproportionately concentrated in the hands of a small percentage of American families, and the average American “had mortgaged himself up to his neck [and] had extended his resources dangerously under the temptation of installment buying.”
As the 20th century drew toward a close, capitalism continued to solidify its grip on the world through globalization, technology, international finance, and the commoditization of society. Global capitalism had become the dominant economic system, casting aside the restrictions enforced by the Soviet Union during the Cold War. With the fall of the Berlin Wall in 1989, the barriers that had prevented the globalization of capitalism disintegrated.
Concurrent with the political upheaval of the late 20th century was an explosion of information, as exemplified by “Moore’s Law” (named after Gordon Moore, cofounder of Intel Corporation, the giant American computer-chip manufacturer). It states that the number of transistors on a chip doubles about every two years. The growth of information technology also illustrates a term made famous by Austrian economist Joseph Schumpeter—“creative destruction,” referring to the perpetual cycle of destroying old, inefficient products or services and replacing them with more efficient ones. Intel perfected the concept of creative destruction through its ongoing technological breakthroughs, making Moore’s Law familiar in everyday life. Intel’s former chief, Andy Grove, encapsulated this phenomenon with his personal motto, “Only the paranoid survive.” It’s a sentiment that seems all too common in the 21st-century world of global capitalism.
The exponential increase in information has gone hand-in-hand with faster computer processing and greater storage capacities. This in turn has led to cost reductions, making information technology affordable to the masses—what Friedman calls the “democratization” of technology and information. Previously disconnected and geographically dispersed people can now communicate and exchange digitized information in a global marketplace through the worldwide computer network known as the Internet, and through its best-known face, the World Wide Web.
This web of computers began life as ARPANET, named after the U.S. Department of Defense’s Advanced Research Projects Agency (ARPA). Its aim was to share information between university researchers and government labs. It began inauspiciously on October 29, 1969, when part of the word login was successfully transmitted from a lab at the University of California, Los Angeles, to Stanford Research Institute, then part of Stanford University. (The system crashed before the entire word could be sent.) But with the development of the Internet, everyday life changed beyond recognition. It could be argued that the Internet offers the closest thing to a perfectly competitive market in the world.
In a 1998 interview with Thomas Friedman, John Chambers of Cisco Systems said, “The Internet will change everything. . . . It will promote globalization at an incredible pace. But instead of happening over a hundred years, like the Industrial Revolution, it will happen over seven years.” The next stage will be the Evernet, when we are online all the time.
George Soros, international financier, philanthropist and former hedge-fund manager, observes in his book Open Society: Reforming Global Capitalism that we have entered the phase of the “largely transactional, global society,” in which society replaces relationships with transactions. With the advent of the perfectly competitive digital marketplace, individuals are no longer limited to relationships for information, pricing and purchasing. The transaction and the deal occupy first place in the pursuit of each consumer’s self-interest. However, to make this consumer nirvana possible, providers of goods must constantly seek ways to become more efficient, cut costs and seek cheaper sources of labor.
Soros and others have observed that such global demands have resulted in a global consolidation of business and the emergence of global monopolies and oligopolies. Historian Beaud believes that industrial and financial groups in capitalistic countries have replaced the aristocracy in a hierarchical world-market system.
The digital age has also made the outsourcing of labor possible, taking national boundaries and geography out of the equation and giving multinational corporations the ultimate flexibility in the allocation of capital.
Another relatively recent manifestation of capitalism’s ability to mutate is the emergence of international financial capital market-makers and hedge funds, or what Friedman labels “the Electronic Herd.” This group, loose as it may be, can adjust flows of financial capital at blinding speed, rewarding countries and companies that comply with its demands, and penalizing those that do not. Soros likens this phenomenon to “a gigantic circulatory system sucking capital into the center and pushing it out to the periphery.” The center is the provider of capital, and the periphery is the recipient.
Soros’s ideas about the international flow of capital accord with the position of a number of authors who contend that capitalism and democracy are not inevitably linked. While some democratic economies may underperform totalitarian counterparts, it is clear that countries that encourage free markets and reward risk-taking will generally outperform those that do not. Historically this has been more apparent in democratic countries, but international financial capital will naturally flow to any country that provides profit opportunities and market growth. This is illustrated by the astounding growth of communist China as an emerging market. Assuming that the communist leadership continues its free-market policies, capitalism will flourish there.
The current era of global capitalism thus seems to sum up Schumpeter’s idea of creative destruction, with its continual progression of innovation, destroying the past while preparing itself for its own future. But where is it leading? In our globalized world, what can we look forward to—not just for our children and grandchildren, but for our own generation?
Read the conclusion of our analysis of capitalism in the next issue.