From a simple “like” on social media to a damning headline built on false facts on national news, contributors – unknowing and otherwise – to the rise of fake news can be found everywhere from a high school desk to the bullpens of some of the most reputable news agencies. In their forthcoming book (Jan 9), The Misinformation Age: How False Beliefs Spread, UCI logic and philosophy of science professors Cailin O’Connor and James Weatherall explain how fake news comes to be, who some of the biggest culprits are, and how we can all work to kill it before it grows legs.

A lot of people have argued that persistent false beliefs are best explained by individual failures -- cognitive biases, blindspots, things like that. But is that the only or most important explanation of why we see so many well-meaning and well-informed people holding false beliefs?

CO: This is one of the core points of the book. Most people, including academics, who have thought about false beliefs and fake news assume that the main problem has to do with our psychological biases. We accept new information that fits our current beliefs, things like that. But we want to push back on this idea. We think that to really understand why false beliefs can persist and even spread, you need to recognize that there is a deep social aspect to what we believe. Think about where virtually all of our beliefs, true and false, come from: someone told you something. Almost everything you believe you get from others. So now think about social media and how people’s social interactions influence the way they get info. Who are they trying to impress? And now think about fake news and propaganda from governments and industry. Fake news works because propagandists know how to take advantage of social ties and connections to promote the beliefs they want people to hold.

What are some examples you explore in your book to look at the spread of false beliefs – intentional or not?

CO: We look at a lot of examples, many of which involve some group, such as an industrial organization, that is trying to shape public opinion on something that affects their interests. We study the tobacco industry’s efforts to stop cigarette regulation in the face of a huge body of evidence that smoking kills. But we also look at other examples, where it is really not clear who stands to benefit from people believing the wrong thing. One big case we look at is the anti-vaccine movement. This one happened not because of big industry, but just because of the way social effects influence what people end up believing. One issue here has to do with the role of celebrities, who tend to have very large social networks and whom many other people trust and admire. The anti-vaccine movement really got going when high-profile, influential people gave a platform and implicit endorsement to people with unsupported views about vaccines. Think, for instance, of Jenny McCarthy, who appeared on Oprah’s show to talk about an alleged link between vaccines and autism. That got a lot of attention for the anti-vaccine movement and convinced a lot of people to take it seriously.

CO: The other issue is that people in the same neighborhood interact, they share information with one another, they imitate one another, and they often trust one another. This means that when someone gets a wild idea and pushes it hard on others in their community, they can be very influential. In fact, we found clear cases where anti-vaccine groups targeted close-knit communities, such as neighborhoods in Minnesota with predominantly Somali immigrant populations and high rates of severe autism. Anti-vaccine groups targeted community centers there, going door to door and putting out flyers. They managed to get this entire in-group talking about vaccines as a problem, and vaccination coverage fell from somewhere above 90% to more like 40%. The community then had a major measles outbreak. This is a prime example of how an anti-vaccine interest group made use of social ties to spread false beliefs.

JW: It’s important to look at motivations behind the spread of false beliefs. In the anti-vaccine case, the Minnesota community was targeted because of the high level of autism among its children. The people behind the anti-vaccine movement really do think kids get autism from vaccines. This wasn’t a case where some clear economic interest was trying to influence beliefs, like in the Big Tobacco case. The anti-vaccine activists really believed in what they were saying.

We have seen a startling amount of polarization in the U.S. in recent years, concerning not only opinions and values, but facts themselves. What explains this and what can we do about it?

CO: One example of this that we look at in the book is the chronic Lyme disease debate. Everyone agrees that Lyme disease afflicts a huge number of people every year. But it’s very controversial whether Lyme disease is always cured by a dose of antibiotics, or if in some cases there is a chronic form of the disease that can recur long after it has been treated. Lots of people are trying to figure out what’s happening, and there are two groups that have emerged with very strong, different beliefs. They can’t both be right, and they both produce a lot of studies. But they don’t seem to influence one another. The issue, we argue, is trust. Basically, each group only trusts evidence coming from those who share their beliefs. If you have similar beliefs to someone, you trust them. Once you have that sort of situation, you end up with people in very different camps only listening to the people who are like them - even when everybody involved wants to figure out the truth, like in the case of chronic Lyme disease.

JW: One really important aspect of what Cailin just said is that we are not assuming that people in the Lyme disease case are only listening to evidence that supports their current beliefs, though of course this might also be happening. Instead, we argue that they are only listening to evidence from people who believe what they believe, and that, because of the sorts of questions those people end up asking, they only get exposed to a limited amount of evidence. Basically, if you think it is settled that chronic Lyme disease doesn’t exist, then you aren’t going to keep doing studies that try to prove it does exist, and you’re going to think that the sorts of people who do do those sorts of studies probably have a screw loose somewhere.

JW: Another thing we argue in the book is that there are actually several ways in which you can explain polarization, all of which could be right. But different explanations of polarization suggest different solutions, and in some cases these solutions oppose one another. As Cailin just explained, trust can play a role. But another explanation of polarization is that people are only exposed to a limited range of evidence. In that case, the natural solution is to expose people to more perspectives. But the trust explanation conflicts with that. If you suddenly put two groups with very different beliefs in contact with one another, and they don’t trust each other, they can end up even more polarized. So if trust is the basic issue, what you need is people who are recognized as trustworthy by the community to bring people along.
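
To make the trust mechanism Cailin and Jim describe concrete, here is a minimal illustrative simulation in Python. It is a sketch of the general idea, not the model from the book: agents who take a claim seriously run noisy trials and share their data, and everyone updates on that data but discounts evidence coming from people whose credences are far from their own. All of the parameters and the trust function are assumptions chosen purely for illustration.

import random

# Illustrative sketch only (not the authors' model): agents hold a credence that a
# claim is true; believers run noisy trials and share the data; everyone updates,
# but discounts evidence from agents whose credences differ from their own.
TRUE_RATE, BASELINE = 0.6, 0.5      # hypothetical success rates if the claim is true / false
N_AGENTS, N_TRIALS, N_ROUNDS = 20, 10, 50
DISTRUST = 3.0                      # how sharply trust drops off with belief distance (assumption)

def likelihood_ratio(successes, trials):
    # How much more likely the observed data are if the claim is true than if it is false.
    p_true = TRUE_RATE ** successes * (1 - TRUE_RATE) ** (trials - successes)
    p_false = BASELINE ** successes * (1 - BASELINE) ** (trials - successes)
    return p_true / p_false

random.seed(1)
credences = [random.random() for _ in range(N_AGENTS)]

for _ in range(N_ROUNDS):
    # Only agents who take the claim seriously bother to test it.
    results = []
    for c in credences:
        if c > 0.5:
            successes = sum(random.random() < TRUE_RATE for _ in range(N_TRIALS))
            results.append((c, successes))
    updated = []
    for c in credences:
        posterior = c
        for source_c, successes in results:
            trust = max(0.0, 1.0 - DISTRUST * abs(c - source_c))
            if trust == 0.0:
                continue                      # evidence from distant believers is ignored entirely
            lr = likelihood_ratio(successes, N_TRIALS) ** trust   # down-weight distrusted evidence
            posterior = posterior * lr / (posterior * lr + 1 - posterior)
        updated.append(min(max(posterior, 1e-6), 1 - 1e-6))
    credences = updated

print(sorted(round(c, 2) for c in credences))
# With DISTRUST = 0 everyone converges toward what the evidence supports; with strong
# distance-based discounting, the population tends to split into a high-credence camp
# and a low-credence camp, even though all agents see the same shared data.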

How do industrial and political forces use real scientific results to distort politicians' and others' beliefs about where the preponderance of evidence lies? How does propaganda affect science, both for scientists and the public? How can we combat propaganda about science?

CO: One of the big themes of our book is that it is really easy to make people believe the wrong thing, using real science and without violating any basic norms about fraud or corruption. The key is to come up with ways of shaping the overall body of evidence that people see, either by promoting scientists who you think are likely to produce industry-friendly results, or else by promoting studies and research articles that are outliers and happen to support your views. Another big technique is to promote research on other questions that you can use to confuse the real issues. For instance, the tobacco industry spent a ton of money funding research on asbestos, because it was another way to explain lung cancer.

JW: These methods are very subtle, and they often involve doing things that we think of as harmless. Even very sophisticated people can get it wrong. Nature magazine recently published an article arguing against the idea that the sugar industry had manipulated science to make fat look more dangerous. What the article said was that all the sugar industry did was fund researchers who already believed fat was dangerous, so industry didn’t corrupt anyone. But this is a great example of something that philosophers Bennett Holman and Justin Bruner have called industrial selection. By funding these scientists, the industry promoted their ideas and got more research that it liked, shifting the total body of evidence. It wasn’t fraudulent, but that doesn’t mean that it didn’t ultimately lead people to believe the wrong thing.

How do journalists contribute to misinformation and false beliefs? In what ways do standard journalistic practices inadvertently mimic propaganda?

CO: Until the late ’80s, there was a federal policy on the books, called the fairness doctrine, which required broadcasters to present both sides of a controversial issue equally. But on scientific matters of fact, that creates a problem, because there is usually more evidence in favor of true claims than false ones; presenting the two sides evenly gives the false side more weight than the evidence warrants. So when journalists try to be even-handed, the public ends up seeing too much of the wrong evidence. It’s inherently misleading.

JW: As we said before, an extremely effective propaganda tool is to try to distort the total evidence that the public sees. Essentially, you take scientific evidence and expose people to just some parts of it. This isn’t good. And treating scientific issues using something like the fairness doctrine mimics this sort of propaganda, by distorting the total body of evidence.

CO: This is how journalists, in trying to do due diligence, unwittingly spread false ideas.

JW: We can see this sort of effect elsewhere in the media, too. Just a few weeks ago, in the aftermath of the Pittsburgh shooting, NPR did a long segment on the ideology of the murderer. They dealt with it in a sensitive way, bringing on someone who wrote a book on the history of the white nationalist website Stormfront. But what it ended up being was a 45-minute segment airing a whole bunch of ideas that many people who listen to the show had never been exposed to. It started with legitimate reasons to be interested in how something like this can happen, but it became a way of exposing lots of people to an ideology built on a whole lot of false assumptions and bad reasoning.

CO: Take Pizzagate. Large media organizations covered it in disbelief that this little subreddit community was getting airtime. But some portion of the people who saw that coverage decided they should go look into it themselves. Fast forward to the shooting at the pizzeria at the center of the conspiracy theory.

JW: There’s a simplistic idea about what science and a scientific study are that leads to problems. The fact is that any experiment can be wrong. There’s randomness in the world. For example, look at smoking and cancer. Take 100 people and watch what happens to them. It’s never a guarantee that a given smoker will get cancer; smoking just elevates the probability that they will. And if one smoker stays healthy, that doesn’t mean cigarettes don’t cause cancer; it’s just one data point. The way you get scientific understanding is by looking at lots of data points, lots of studies. But journalists often report on individual studies or the newest findings, as if any one study or experiment were the whole story. And this is completely wrong. It’s very difficult to communicate the level of uncertainty in these findings. Basically, focusing on single studies amounts to cherry-picking data, which is just what propagandists do. This is another way that journalists can mimic propaganda.

CO: He’s right; outlier results are a hook for journalists. To be fair, it’s hard for journalists to know how to evaluate individual scientific results, and their professional incentives are not primarily to inform the public. Their incentives are to get clicks and sales, and to pitch and write stories in the most interesting way – not necessarily to make sure people come away knowing the truth.
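
Jim’s point about randomness and single studies can be made concrete with a toy simulation. The sketch below is ours, not from the book, and the rates and study sizes are invented numbers chosen only to illustrate the statistics: when a real effect exists, some individual small studies will still show nothing (those are the outliers that get cherry-picked), while pooling all of the studies recovers the true picture.

import random

# Toy illustration (all numbers invented): smoking really does raise illness risk here,
# but small studies are noisy, so some individual studies will show little or no effect.
SMOKER_RATE, NONSMOKER_RATE = 0.15, 0.05   # hypothetical illness rates
PEOPLE_PER_GROUP, N_STUDIES = 20, 50

def run_study():
    # One small study: count illnesses in a smoker group and a non-smoker group.
    smokers = sum(random.random() < SMOKER_RATE for _ in range(PEOPLE_PER_GROUP))
    nonsmokers = sum(random.random() < NONSMOKER_RATE for _ in range(PEOPLE_PER_GROUP))
    return smokers, nonsmokers

random.seed(0)
studies = [run_study() for _ in range(N_STUDIES)]

# Any single study can mislead: count the ones showing no excess risk among smokers.
misleading = sum(1 for s, n in studies if s <= n)
print(f"{misleading} of {N_STUDIES} small studies show no excess risk among smokers")

# Pooling every study recovers the real difference.
total = N_STUDIES * PEOPLE_PER_GROUP
pooled_smokers = sum(s for s, _ in studies) / total
pooled_nonsmokers = sum(n for _, n in studies) / total
print(f"pooled rates: smokers {pooled_smokers:.3f}, non-smokers {pooled_nonsmokers:.3f}")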

Legislators and other policy makers need to know the facts to make good policy. Science is an important source of knowledge of facts about the world. But scientists are human beings, often with their own political and ethical values. How do those values influence scientific outcomes? Are scientists just political actors trying to get more funding, or worse, trying to use their authority to bring about policies they support for other reasons?

JW: One thing that you learn when studying the history of science is that scientists are human beings. They operate in a cultural context. They care about things and have their own interests and sometimes biases. But we have to be very careful about what that means and what we can conclude from it. One important point is that the mere fact that scientists are human doesn’t mean that established scientific results are shaped by individual scientists’ biases. Established science results from a long process, in which scientists criticize and review each other’s work and ultimately come to an agreement. It is more than the results of any individual scientist. It is the scientific community as a whole that does the work.

JW: Still, sometimes this goes wrong, for instance when the whole community shares some assumptions or biases. But even this shouldn’t mean we ignore science. One of the big points of our book is that when we’re making decisions, what we should really care about is the relationship between our beliefs and the available evidence. Scientists have training, access to information, and the cultivated ability to evaluate evidence. This puts them in a better position than most people to evaluate evidence and draw reliable conclusions. So the fact that they can be wrong or can be influenced isn’t the right thing to focus on. The right thing to focus on is what the evidential basis says and the quality of the argument. Even acknowledging all of the ways in which scientists can be wrong, we should still look to scientists as our most reliable guide for policy – they are trained to be better at collecting and evaluating the evidence that bears on the decisions we want to make.

CO: People often think that because science is sometimes flawed or biased, we shouldn’t trust it. Science can be flawed, but the evidence scientists produce says something about the world. The real problem is that you have outside actors trying to subvert beliefs, like in the examples of tobacco and sugar we mentioned before.

Scientists sometimes disagree. How does this happen and what does it mean? What can policy makers and ordinary voters take away from cases where scientists appear to differ about what the evidence says?

CO: As we said above, scientific evidence is often equivocal and there is a lot of uncertainty. Not everyone who smokes will get cancer and die, and some children who are vaccinated will develop autism, for unrelated reasons. Science never tells you anything with certainty, and it can take a long time and a lot of mistakes before scientists come to agree on anything or believe it with high confidence. The way scientists sort out their own mistakes is by criticizing one another. This all means that debate and disagreement are just a normal part of the scientific process. The thing that’s worrying is that sometimes, even after scientists have come to a consensus, other groups can take this process of disagreement and manufacture an illusion of doubt. This is the case with climate change and tobacco. There is a scientific consensus. But industry manufactures an illusion of doubt, partly by emphasizing that scientists can never be certain about anything, and partly by pointing to individual studies that seem to suggest the wrong answer, even though most other studies disagree with them.

JW: It’s really important to understand that the jury can still be out on something. The evidence may not be clear and scientists may still be studying it. For example, chronic Lyme disease. I think you have to distinguish those cases from ones where you have some outside interests who are trying to create an impression of uncertainty. These can be hard to identify. Two well-credentialed scientists saying two very different things can be hard to make sense of. How can you tell the difference between legitimate uncertainty and manufactured uncertainty? One key is that we have very important institutions, such as the National Academies, or the Intergovernmental Panel on Climate Change (IPCC), that play a role in reviewing bodies of evidence. When you have an individual scientist whose views don’t align with the reports of these sorts of institutions, or whose arguments are not taken seriously in their reports, that’s a huge red flag. If these scientists really have strong arguments, there’s a process for evaluating them and getting them out to other scientists, which is through peer reviewed journals. Research that gets through that process is part of the record, and it is accounted for by other scientists. But sometimes these scientists do something else, which amounts to asking journalists to review their work instead of other scientists. This usually means the arguments aren’t strong enough to get past peer review, and you should be wary.

CO: I think this is right, but you have to be very careful. Another theme in the book is an idea introduced by our colleague Bennett Holman, which is that whatever scientists or journalists do to protect public beliefs, there is always a way for industrial groups and others to get around it. For example, every time the IPCC puts out a report, the Nongovernmental International Panel on Climate Change (NIPCC) puts out its own report, which comes to exactly the opposite findings on the same questions. The two look equally authoritative. In fact, they look almost identical! It can be really hard from the outside to see which is the fake report and which is the real one.

JW: So how do you tell the difference? Look at cases where people in the field have come to a consensus, and then look at the critics. They may be well-credentialed, but usually not in the same field. An example is when oceanographers agree on something having to do with the ocean, but a physicist comes out and disagrees. The physicist isn’t credentialed in matters of the ocean, but you may overlook that because the opinion is coming from a credentialed scientist. If the physicist really has a good argument, they should publish it in an oceanography journal and convince the oceanographers!

CO: Another big hint that something may be fishy is to see who is funding it. Follow the money. When there is apparent disagreement among scientists, but all and only scientists on one side have industrial funding, then you should be pretty suspicious.

What are some common sense changes that we can make to address the problems you discuss in your book? Do we have answers on how we can do this better?

JW: A few things. Journalists need to think about the role they play in spreading false beliefs and fake news. They need to recognize, for instance, that the fact that there exist people willing to dispute certain scientific claims is not, by itself, newsworthy. The source of the disagreement really matters. They also need to focus science reporting on consensus beliefs, as reflected in institutional reviews from groups like the IPCC and the National Academies, and not on individual studies. And they need to get out of the mindset that there are two sides to every story, each equally deserving of attention.

JW: Another big problem is when journalists unwittingly spread false beliefs by trying to refute them. When some politician says something blatantly false, only the 500 people in the audience hear it. But when a large news organization reports that they said this false thing, it now has a huge audience, even if the point of the article is to say that the statement was false. And the bigger point is that the false statement probably was not newsworthy to begin with. So we’re giving more attention to a false thing. Fact checking on news sites can do this in a major way.

CO: The arguments we give above about how easy it is for industry to influence science and public belief suggest that when there’s something controversial, with possible industrial interests at stake, industry shouldn’t fund science. They’re too good at funding science in tricky ways. Their money has to be out of the conversation if we want to trust the science.

JW: Exactly; the way scientists are funded and incentivized should change. Right now, there are very strong incentives for publishing papers that have surprising or exciting conclusions. There’s little incentive to replicate those results, and null results don’t get published. We need to recognize the importance of building a body of evidence that includes the results that don’t support the exciting new hypothesis. Those need to be funded and rewarded. Also, because the exciting, unlikely-looking results get attention and rewards, scientists are incentivized to produce lower-quality science – fewer participants, lower statistical power – and to run studies that just clear the statistical threshold. You do want there to be disagreement between scientists in early studies; that’s a normal part of the process.

CO: On the same note, you don’t want studies to keep conflicting once everyone agrees. What would be best for public belief, and for preventing attacks by propagandists, is for scientists to combine a lot of data and publish it all together. Instead of waiting for a National Academies review 15 years after all of these individual studies are out in the world, what if those outliers never appeared on their own along the way? Put all of that data out in one piece – a collection that puts it all into context, rather than allowing people to cherry-pick the studies that fit their agenda.
