Feeling Our Way Through Facts

Early 20th-century French philosopher and mathematician Henri Poincaré once noted that “to doubt everything or to believe everything are two equally convenient solutions; both dispense with the necessity of reflection.” A hundred years later, reflection—thinking about the information we accept or reject—is more important than ever.

Nervous States: Democracy and the Decline of Reason

William Davies. 2019. W.W. Norton, New York. 272 pages.

Unsavory Truth: How Food Companies Skew the Science of What We Eat

Marion Nestle. 2018. Hachette Book Group, Basic Books, New York. 320 pages.

Thinking Critically About Research: A Step-by-Step Approach

Jane Ogden. 2019. Taylor & Francis Group, Routledge, New York. 176 pages.

Many people who study to become lawyers are soon disabused of the notion that their chosen career is about the discovery of the truth. It turns out to be more about persuasive argumentation. Others, who study philosophy, learn the difficulty of even defining truth.

Today we live in an age of subjective truth, where “your truth isn’t necessarily my truth.” For the average person, it’s bewildering. For the expert, it’s the path to cynicism. How can we deal with the issues of our time without falling into either ditch? Is there any secure foundation for truth?

Take immigration, climate change or vaccinations. Trying to find the truth in today’s big news stories is like navigating a maze. “Fake news” and “alternative facts” make headlines, as do research results that are announced to the world and then found to be invalid or even fraudulent. Because of the glut of information at our fingertips, it can be difficult to determine whom to trust.

Feelings and facts are often given equal weight in interpreting the data we’re constantly bombarded with. As a result, we can struggle to learn what may actually be true. How did we get to the point where reason takes a back seat to emotion?

Three recent books discuss our often uneasy relationship with facts and offer suggestions on what we can do about it.

Rise and Fall of the Expert

William Davies, a political economist at Goldsmiths, University of London, explores the role of experts in shaping modern society and the rise of emotions in the modern public arena in his book Nervous States.

The title carries a double meaning, referring both to today’s overwrought political climate in nations around the world and to the effects of media, technology and constant conflict on our minds (and even our bodies), pushing them into states of chronic uncertainty and acute alertness.

Davies interweaves the role of the expert over the last four hundred years with themes of politics, economics and medicine. While it is sometimes difficult to connect the dots in his leaps through time and discipline (and as a result, some of his facts seem stretched to fit the argument), he makes an intriguing case that in a society that increasingly distrusts science and authority, emotions—specifically the emotions of “fear, pain and resentment”—have won out over facts.

During the scientific revolution, French philosopher René Descartes came to think of the mind and body as separate, and the reasoning processes of the mind as superior to the reactions of the body. Descartes, writes Davies, held that the mind was “an observatory, through which the physical world—from which it is separate and different—can be inspected, criticized, and finally replicated in the form of scientific models that can be committed to paper and shared.”

Meanwhile Thomas Hobbes, tormented by the violence of protracted wars in both Europe and his native England, and having fled to France after challenging the political authority of the church at home, “believed that philosophy and science ought to provide a basis of peaceful consensus.”

When the Royal Society formed in 1660, Davies explains, its members agreed on the value of documenting their investigations and activities: “All deliberations would be noted and saved as records, which would be open to other scientific communities across Europe.” These gentleman scientists would deal in facts, which “are all always implicitly a type of pledge, namely ‘I promise you that this happened.’ Facts possess many of the same qualities as banknotes, circulating freely among strangers without their integrity being doubted, and cementing trust in the process.”

Society members also agreed to keep their personal feelings and egos separate from their experiments and reports, not celebrating their own contributions but treating each other “with the utmost respect.”

This emphasis on objectivity, transparency and legitimacy was “viewed as the facilitator of progress, allowing one finding to be added to another.”

That the establishment of experts resulted in progress is undeniable. In both the public and private sectors, it’s hard to think of any facet of modern life that hasn’t come about as a result of the contributions of experts. In the Western world, certainly, we live longer, healthier, wealthier, more peaceful lives because of their work.

What has happened, then, to turn the public’s opinion? Whereas the US edition of the book carries the less-than-punchy subtitle Democracy and the Decline of Reason, the UK edition, published in 2018, is more specific and seems to promise some answers: How Feeling Took Over the World. As we noted earlier, this isn’t just about politics; the problem is multifaceted. Showing, for example, that the field of medicine is affected by society’s current emphasis on feeling, whether in terms of physical sensations or emotions, Davies asks: “What is the role of the expert, where feelings themselves are the problem?” Our perception of how we feel, and the desire to avoid discomfort, seems to have become more important to us than the underlying illness; the current focus is more on the management of pain than on the treatment of disease.

Davies suggests that, especially in dealing with big-picture issues such as economics or international conflict, ivory-tower experts are less valuable than “people who can solve problems and respond to circumstances.” Problem solvers—data analysts, entrepreneurs, trend-spotters—all are seen as more credible than elites who deal in inflexible facts. What matters now is speed of information and flexibility of response.

High-profile cases of fraudulent studies and scientific misconduct, along with accusations casting experts as a “privileged ‘elite’” who “instruct the rest of us what to believe” without understanding the effect of their policies, have eroded trust. While this is not new (Hobbes himself distrusted the Royal Society for its closed membership), the advent of an omnipresent media and instant connection to the world for anyone with a smartphone has increased the reach of those who claim their facts are as valid as the experts’.

The distance between the statistics and facts of the expert and the daily lives they claim to represent has stretched to the breaking point. Experts, instead of being trusted and respected, are ignored or denigrated by politicians and doubted by a public who often turn to their own Internet searches for information they feel they can trust.

“Because people can only really know their own thoughts and how they arrived at them, they tend inevitably to place greater value on their own idea of truth than on anybody else’s.”

William Davies, Nervous States

Truth and Trust

It gets more complicated, however. We may well look to experts for help, but as Davies shows, not all of today’s “experts” are, in fact, truthful or trustworthy. In addition, it’s nearly impossible to remain on top of today’s constant influx of new information; as much as they would like to, not all professionals can keep up. So it’s good to ask the professionals, but it’s also good to investigate the research yourself to see what may be specific to your own situation. And it’s vital to understand that research experts can (and do) skew science to influence consumers. Some of them don’t even realize they’re doing it.

Marion Nestle reports on this phenomenon in her book Unsavory Truth: How Food Companies Skew the Science of What We Eat. A respected author and emerita professor of nutrition, food studies and public health at New York University, she refers to studies that show how something as seemingly trivial as a gift of a pen and a pad of paper with a company logo on them—not to mention dinners and vacations sponsored by pharmaceutical companies—can influence a doctor’s prescription decisions. Doctors may not be aware that they’re being influenced, just as paid researchers may not realize that their research or study projects could be biased from the beginning through their very design.

Nestle sheds much-needed light on the topic of industry-funded research, which becomes especially shady when the backing industry is well disguised. She devotes an entire chapter to the Coca-Cola Company’s channeling of multiple millions of dollars in funding through institutes such as the International Life Sciences Institute as well as the now defunct Beverage Institute for Health and Wellness and Global Energy Balance Network.

Pressure from journalists and advocates resulted in the release of material showing the extent of Coca-Cola’s influence on these institutes and their research—in particular, “The International Study of Childhood Obesity, Lifestyle and the Environment (ISCOLE).” The Coca-Cola-backed study concluded, in Nestle’s words, “that the most important correlates of obesity in children were low physical activity, short sleep duration, and frequent television viewing.”

Despite what common sense would tell us, the study found no correlation between obesity and intake of sodas, soft drinks or other sugars. Why not? Because the researchers didn’t look for one. Of course, exercise is important. But shifting obesity’s blame from poor dietary choices to inactivity was conveniently beneficial for the study’s sponsor, who happens to manufacture and sell the most popular sugary soft drink in the world.

Nestle refers to disclosure statements and e-mails (obtained by The BMJ, the Associated Press and others) that shed light on this study: “An analysis of the emails makes clear . . . [that] the researchers consulted with Coca-Cola in making strategic decisions about study design.” Funding may have been unrestricted, she explains, but it seems the study itself was skewed from the beginning.

“I doubt that food companies would willingly contribute to a fund they could not control, especially if their money happened to support studies that produced inconvenient results.”

Marion Nestle, Unsavory Truth

Although Nestle concedes that Coca-Cola has since been making strides toward greater transparency in such matters, she makes the point that any industry-funded research must always be viewed very carefully, with eyes wide open. “Food, beverage, and supplement companies are happy to fund research with a high probability of supporting marketing objectives,” she writes in her blog, Food Politics. “Industry-funded research almost invariably comes out with results favorable to the sponsor’s commercial interests. It’s unreasonable to expect otherwise. Food companies are not public health agencies; they are businesses expected to generate profits and returns to shareholders—that is their #1 priority.”

(Interestingly, as we post this Vision review acknowledging Coca-Cola’s declarations of improved transparency, a related study has just been published in the Journal of Public Health Policy. It indicates that even now the company retains significant control over the research it funds: “Although it does not have the capacity to direct and control the day-to-day conduct of studies, Coca-Cola retains varied rights throughout the research process, including the power to terminate studies early without giving reasons.”)

In pursuit of that same goal of generating profits and shareholder returns, corporations and organizations may fund studies that put forward new “superfoods.” We have learned that blueberries may be the answer to the most personal of male problems, while pomegranate juice may be the elixir of life. Some of the many claims, backed by “research,” seem too good to be true. Such foods may indeed be healthful as part of a balanced diet, but they are not likely to be a magic bullet.

Critical Thinking

In this post-truth world, where facts are given no more weight than feelings, it’s important to remember that not all available information is equally valid. Critical thinking in the face of persuasive “evidence” is vital to determining which guidance, and indeed which people, are worth following.

Jane Ogden literally wrote the book on this topic: Thinking Critically About Research: A Step-by-Step Approach. A professor at the University of Surrey, she says the idea for the book grew from her years of teaching “health psychology, research methods, and critical thinking.” Stating that the book “should be essential reading” for students and lecturers as well as for all who use or report on research in their work, she suggests that a weary (and wary) public would also benefit from a set of practical tips to help navigate a ubiquitous and extremely challenging problem.

“Whether we are deciding what to eat, how to stay well, whether to go to the doctor, . . . we need some understanding of the research out there and some assessment of what we should believe and what we should ignore.”

Jane Ogden, Thinking Critically About Research

Ogden isn’t wrong about the public needing instruction, and this is a valuable resource for anyone wanting to learn to think more critically, but the reader should be aware that this is not an easy weekend read. Filled with research examples and mental tasks, the book provides the reader with a master class in the process of thinking critically about research.

Four chapters are devoted to the first step: knowing the methods of research. Ogden first addresses “the basics”—how researchers establish the questions to be answered and determine their sample and the variables. She follows with chapters on the importance of knowing how studies are designed, how measurements are taken and how the data are analyzed.

Step two follows with another five chapters showing how to think critically about the methods and how to be aware of possible pitfalls, such as researcher bias or the overgeneralization of results.

The third step is assessing how evidence is presented.

In step four, Ogden presents her “critical toolkit,” where evidence is actually evaluated using all the strategies gathered in the first three steps.

The final step of the book goes beyond the research at hand to address bigger questions: Does the finding have unintended consequences? Why do we ask the questions we ask? She names this the “extra critical” step and calls it “the icing on the cake in terms of thinking critically about research.”

Still, even after applying all these steps, Ogden admits that there isn’t always a concrete conclusion. “All studies are flawed in some way,” she warns, “whether it be due to their sample, design, measures, or analysis. All research findings need to be understood in the context of these flaws.”

Dealing With Uncertainty

Ogden sums up the challenge: “It is clear that no evidence is ever perfect and that thinking critically also involves recognising and dealing with uncertainty and making conclusions based upon the best-case evidence at any point in time.”

On this point Davies agrees with her, quoting science-policy thinkers who say we’ve entered a phase of “‘post-normal science,’ in which ‘facts are uncertain, values in dispute, stakes high and decisions urgent.’” As a result, there’s no going back to a time when facts alone were enough: “The categorical division between ‘reason’ and ‘feeling’ no longer functions.”

Like Ogden, Davies offers suggestions for navigating these uncertain times. As a society, he says, we need “to feel our way toward less paranoid means of connecting with one another.” While acknowledging the importance of facts, he says we should stop denigrating “the influence of feelings in society” and see what we can learn if we listen to them.

To do this, we need to stop sacrificing care for speed: “Perhaps the great virtue of the scientific method is not that it is smart (which is now an attribute of phones, cities and fridges) but that it is slow and careful. Maybe it is not more intelligence that we need right now, but less speed and more care, both in our thinking and our feeling.”

Postscript

Helping locate a firm foundation of truth to guide our steps is an important and rare contribution Vision can make to the discussion. Are there any absolute truths? We’ve lived for almost a century with the idea that truth is relative, that there are no absolutes.

Some have blamed Einstein for that, but as he noted, “The meaning of relativity has been widely misunderstood. Philosophers play with the word, like a child with a doll. . . . It does not mean that everything in life is relative.”

He recognized that certain moral precepts are necessary to guide us along the path of life. In his case, the success of scientific inquiry depended on others not lying. He was willing to defend the absolute command not to lie. He said, “We do not feel at all that it is meaningless to ask such questions as: ‘Why should we not lie?’ We feel that such questions are meaningful because in all discussions of this kind some ethical premises are tacitly taken for granted.”

Einstein cited only one of the 10 moral imperatives that are biblical in origin. They lead to many other considerations about the role of truth in any healthy society.