Reigniting Trust

Why Do People Share Untrue Stories?

Jieun Shin, Ph.D., explains the power of false information, the role of algorithms and what we can do to curb misinformation.

Jieun Shin

Jieun Shin, Ph.D., is an assistant professor in the Department of Telecommunication at the University of Florida. She conducts research on digital media, with a focus on social interaction, information flow and networks. Shin is particularly interested in how people create, share, and process information. The interview has been edited for length and style.

Tell us about your work and your research.

I look at which stories are shared more than others, and why and how people share them, especially on social media and other platforms. That is how I started, and it remains a main research theme of mine. I look at how messages get diffused, how people share stories with varying degrees of accuracy. That includes clickbait stories, “fake news” and all these rumors.

Let’s talk about sharing. Tell us what your research has shown about how and why people share.

My dissertation and a couple of the first projects that I embarked upon in my doctoral program focused on political rumors. Back then [2012], we did not have the term “fake news.” It was still to come, so we had to call them “political rumors.”

That was the election year when Mitt Romney and Barack Obama competed. My main research technique is the computational method. People leave digital footprints without knowing it. They click the “like” button and share without really knowing that there is a possibility for other people to collect their data. I collect these digital footprints, try to learn the behaviors, and then predict how people are going to behave in the future.

I collected Twitter data back in 2012. I filtered all these political rumors that were circulating on Twitter. Computer scientists use machine learning without actually looking at the details of the content. I tried, as a communication scholar, to get the details at the contextual level. We looked at how true rumors and false rumors behave differently. They are both negative toward the same target, in this case Romney, but they behave completely differently.
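
[To make the filtering step concrete, here is a minimal sketch of the kind of keyword matching such a pipeline might use on a pre-collected tweet dump. This is illustrative only, not Shin's actual code; the file name, field names and keyword lists are hypothetical.]

```python
import json
from collections import Counter

# Hypothetical keyword lists for two 2012 rumor topics (illustrative only).
RUMOR_KEYWORDS = {
    "voting_machines": ["romney", "voting machine"],
    "kkk_slogan": ["romney", "kkk"],
}

def matches_rumor(text, keywords):
    """True if every keyword for a rumor topic appears in the tweet text."""
    lowered = text.lower()
    return all(kw in lowered for kw in keywords)

def filter_rumor_tweets(path):
    """Yield (rumor_name, tweet) pairs from a JSON-lines file of tweets.

    Assumes each line is a JSON object with a 'text' field, as in a
    pre-collected Twitter dump.
    """
    with open(path, encoding="utf-8") as f:
        for line in f:
            tweet = json.loads(line)
            for name, keywords in RUMOR_KEYWORDS.items():
                if matches_rumor(tweet.get("text", ""), keywords):
                    yield name, tweet

# Example: count how often each rumor topic appears in the collected stream
# ("tweets_2012.jsonl" is a stand-in for a real data file).
counts = Counter(name for name, _ in filter_rumor_tweets("tweets_2012.jsonl"))
print(counts)
```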

For instance, there was one rumor that said Mitt Romney owned a company that makes vote-counting machines. The vote-counting machine rumor is false. The other rumor was that Mitt Romney's slogan was actually used by the KKK. That one is also completely false. False rumors do not give up. They keep coming back, and people add more details, more interesting elements to the original story, and it just keeps coming back.

One of the true rumors was that Obama's daughter went to Mexico with 25 Secret Service agents on a school trip. That was actually true, and the White House said it was true, but asked the media to remove the story out of safety concerns. The moment that Obama admitted that, [it] disappeared. It is like people said, “I do not want to talk about it anymore.” The truth, the facts, do not interest people. When a story is not verified, when it is out there and has not been confirmed or is difficult to verify, people go loud and share constantly.

Rumors like the vote-counting machines and the KKK slogan just kept going because nobody ever addressed them. People added to them and so forth. Something like the Mexico trip for Obama's daughter: he came out, he addressed it, it turned out to be true, and people lost interest.

Why do you think that is?

I do not know. That's what I am trying to find out! I think fake news and false information are inherently interesting. It is almost as if there is unlimited imagination. People can edit it. They can add another ingredient. It is almost like a collective work.

Tell us more. What else have you studied?

I wanted to look at fact-checking: how people share fact-checking stories.

It is pretty much the same. Conservatives share fact checks that are favorable to their candidate. Liberals also selectively share information that is favorable to their candidate. I was really disappointed to find out that it is all the same.

If everything is already determined by political beliefs, then doing research is basically not helping anything. So, what is my role and what is my contribution? I had a crisis!

What you are saying is that people only share fact-checking material that favors their point of view.

Yeah, that is true. But there are some nuances. The fact-checking audience is overwhelmingly liberal. My research suggested that fewer than 30 percent are conservative. Conservatives, in general, do not trust fact-checking done by the mainstream media. That is the challenge for fact checkers to overcome.

What does your research tell us about how people trust media?

Well, trust is everything, I think, because it really does not matter whether the content is true or false, although false items are more interesting. If you are conservative, you are going to believe sources like Fox News that share your beliefs. You immediately trust that source, and you are going to spread it. Liberals have more trust in MSNBC, and they are going to be more likely to believe what it says and share it.

Media trust is everything. That is their currency. If people believe them, the sources have power.

There is some research, which I did not do, that stirs some hope. If you nudge people before they share and say, “please think about the consequences, think about the accuracy level of the content,” the research shows that it improves the quality of sharing. They are more likely to share only verified information.

If you prod them and say, “Hey, please think about it,” it actually makes a difference?

Yes. When it comes to actually correcting all these phenomena, that is one thing that works. I myself also found that when you fact check, when you really want to correct misinformation, do not repeat the error. If you say, “Obama is not Muslim,” it is going to actually strengthen the rumor, the false information. Instead, you can say “He is Christian,” if you want to correct it. If “Muslim” and “Obama” are in proximity, they are going to be associated.

So, bottom line: what is it that causes people to share?

It indicates their membership, their crew. If you are a conservative, you want to show support. It is like a cheerleading group.

When you share, you give an impression that you are in the know. You want to give other people the impression that you are a news junkie, and that you read a lot. It’s as though they are saying, “I found something—and I am the first one to let you know.”

How did you begin researching health-related communication?

My research was focused on political misinformation. But, as I said, I had an identity crisis. What am I doing? With political information, it all comes down to political beliefs. There is nothing I can do. People believe what they want to believe.

I set out to look at something different, like health information. That can actually kill someone. Political beliefs are something, but they do not kill, right? I wanted to look at health misinformation, so I looked at why people share anti-vaccine messages. It is easy to spread misinformation. Again, there is no limit to the imagination. You can just add whatever fear-mongering you like—it is fun.

I went to Amazon because, if you are really serious about some topic, you buy a book. I’m not talking about browsing. If you are really committed to something, whether it is gardening or whether it is health, you buy a book.

So I collected data for all the vaccine-related books on Amazon. The majority of books that are available are anti-vaccine. If you think about it, it makes sense because being pro-vaccine is like, “Duh.” You do not really need to write a book about it. If you want to publish something, it has to be something different. So you write a book that is anti-vaccine. It is profitable and you can build your own audience. Surprisingly, a number of anti-vaccine books are written by medical professionals.

The supply is skewed: there are more anti-vaccine books than pro-vaccine books. What is concerning is that when you use Amazon, it always recommends other books, and the recommended books tend to be anti-vaccine books. Of course, the algorithm does not know the content. It only knows that people who bought this anti-vaccine book bought a bunch of other anti-vaccine books. It just keeps recommending anti-vaccine books, which aggravates the problem. It advertises more anti-vaccine books, those get more purchases, and it just spirals.
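
[As a rough sketch of the mechanism Shin describes, the toy “customers who bought this also bought” recommender below shows how a skewed catalog feeds on itself. It is not Amazon's actual system; the purchase baskets are invented. Because most co-purchases of an anti-vaccine title are other anti-vaccine titles, the top recommendations are too, and every recommended purchase reinforces the pattern.]

```python
from collections import defaultdict
from itertools import combinations

# Invented purchase histories: each set is the books one customer bought.
# 'anti1'..'anti3' stand in for anti-vaccine titles, 'pro1' for a pro-vaccine one.
baskets = [
    {"anti1", "anti2"},
    {"anti1", "anti3"},
    {"anti1", "anti2", "anti3"},
    {"anti1", "pro1"},
]

# Count how often each ordered pair of books is bought together.
co_counts = defaultdict(int)
for basket in baskets:
    for a, b in combinations(sorted(basket), 2):
        co_counts[(a, b)] += 1
        co_counts[(b, a)] += 1

def recommend(book, k=2):
    """Return the k titles most often co-purchased with `book`."""
    scores = {b: n for (a, b), n in co_counts.items() if a == book}
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("anti1"))  # ['anti2', 'anti3'] -- other anti-vaccine titles
```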

That makes sense. You are saying that algorithms can feed into misinformation.

Yes. That is a huge part of it. It happens with political information, too. Misinformation starts trending because of the algorithm, and there it is. There are a lot of hidden pieces that drive misinformation diffusion, and algorithms play a huge part.

Let’s switch gears a bit and talk about social media platforms. You did some research there as well.

People are connected to each other and what their friends share is very important. A lot of people are very busy and trust the friends they are connected to. Therefore, what they share carries extra weight.

We are planning to do a combination of surveys and experiments. We want to identify which platform is associated with a higher level of misinformation. It will be a study to get a sense of which company is perceived as a source of misinformation.

You are going to look at Twitter and Facebook. Who else are you going to look at?

YouTube, Instagram and major companies like TikTok. We'll ask to what extent they think there is misinformation there, and how likely they are to stumble upon misinformation while browsing. Of course, you have to take into consideration whether they are a user or not. If you are not a user, you may overestimate or underestimate.

We also want to look at whether making an effort makes a difference. For instance, if Twitter says they are going to spend $2 million to create an advisory board and partner with third-party fact checkers to curb misinformation, would it change the audience’s perception? Will it increase trust?

So, this will give the major platforms some feedback, right?

Right. We call it Signaling Theory. It gets a little complicated, but what Signaling Theory says is that the costlier and riskier a signal is for the company, the more effective it is. We are also going to look at the impact of whether the company is voluntarily taking the risk versus doing so because of external pressure. We are thinking voluntary expenditure will increase people's trust more than forced expenditure, and that investment, spending more money, will increase people's perception of trustworthiness.

Overall, what is your sense of how social media platforms and sharing have impacted trust?

I think it depends. The research shows mixed findings. I cannot say either way.

I think it is a good opportunity for scientists and experts to gain trust by communicating directly with regular people. I have seen it happening during the Covid pandemic. A lot of scientists are trying to unpack the story with data. They bypass the media. They communicate directly with other users. I think that is very attractive. I think it has the potential to increase trust. Of course, anybody can join and pretend to be a legitimate actor. That is the negative side, which can lower trust.

Algorithms have potential in both directions. If they get smarter and become more socially responsible, they can help. Otherwise, they can make things worse. It really depends.

It is a fascinating time to study all these phenomena. Back in 2012, I had to use literature that went back to the 1920s and 1930s. There was not much research. Now, if you search for “misinformation,” there is an explosion of research done in the past few years. People's interest is there. Enthusiasm is there. So, I think there is some hope.