Trump Is No Longer Tweeting, But Online Disinformation Isn't Going Away

A woman wears a T-shirt that reads "Fake News" as protesters gather near the Indiana Statehouse last November for a #StopTheSteal rally and to protest Joe Biden's election victory over Donald J. Trump. (Jeremy Hogan / SOPA Images/LightRocket via Getty Images)

Darren Linvill thought he was prepared for 2020 and the firehose of false information that would come flooding down on the United States during an election year in which the country was bitterly divided.

Linvill is a researcher at Clemson University in South Carolina who tracks disinformation networks associated with Russia.

In the years following 2016, Linvill shared his work with a number of government entities as the U.S. worked to figure out exactly what Russia did to interfere in that race. He even designed a game called "Spot The Troll" that shows how hard it is to tell a professional provocateur from an extremely opinionated American.

People often flunk the test. More importantly, Linvill tracks test-takers' online behavior afterward, and they seem to become more discerning about what information they choose to share and promote.

"They realize, 'Oh, maybe I'm not as smart as I thought I was,'" he says.

Of all the people watching the political landscape in 2020, he should have been ready for whatever disinformation the year had to offer. But he wasn't.

"The minute the pandemic hit," says Linvill, "s*** hit the fan."

Instead of monitoring a wave of foreign disinformation seeking to sow mistrust in democratic institutions and elections, he watched domestic sources doing the same thing surge.

"I'm not even seeing [Russian disinformation] messaging much in English to the same extent that I've seen in the past, because they don't need it," Linvill said. "I mean, the GOP has taken the ball from them and run with it."

In the past year, Americans spent more time than ever online and got more of their information from unreliable or false sources. Even with the de-platforming of former President Donald Trump, experts say the way Americans communicate and receive information online remains broken.

It's a crisis that is ripping families apart and that led to a violent takeover of the U.S. Capitol in January.

A recent report about that attack from the nonpartisan Election Integrity Partnership concluded that while it was appalling to watch, it should not have been viewed as surprising considering what was happening all year online.

"Many Americans were shocked, but they needn't have been," wrote the report's authors.

It's also not a once-every-four-years problem. Public health officials are currently competing with a deluge of online disinformation to convince the public that the coronavirus vaccines are safe.

"We are in serious trouble," said Joan Donovan, the research director of Harvard University's Shorenstein Center on Media, Politics and Public Policy. "Disinformation has become an industry, which means the financial incentives and the political gains are now aligned."

Trump may no longer have access to his 88 million Twitter followers, but the system he capitalized on to spread more than 30,000 falsehoods remains intact.

"We will see more of this," she added.

Defining the landscape

The pandemic has been miserable for millions of Americans who have lost loved ones in some cases and jobs in others. But it has inarguably been a boon for the tech world.

Twitter and Facebook have both seen meteoric rises in their stock prices since last March, matching growth in the time people spend on their platforms. That growth may mean a public that is more receptive to conspiratorial thinking and less concerned with truth.

Even before the pandemic, engagement with social media was rising. Facebook, Twitter, Instagram and Reddit have all seen an increase in the amount of time people are spending on their platforms since 2018, according to Activate Consulting, a firm that tracks technology and media trends.

There was also a bigger jump in daily time spent with the Internet and media from 2019 to 2020 than in any previous year Activate has tracked, according to the company's CEO, Michael Wolf.

The average Facebook user now spends about 15.5 hours per month on the platform. And overall, Americans are spending more than 13 total hours per day engaging with some sort of tech or media, whether that is video, gaming, social media, messaging or audio.

"New habits have formed," said Wolf. "It's just not likely that these behaviors are going to go in reverse."

That means information gleaned via algorithmically generated sources has become an ever more important part of Americans' news diet.

About 1 in 5 Americans say they primarily got their political news from social media in 2020, according to the Pew Research Center.

Those who got their information that way were found to engage with conspiracy theories more often than other Americans, while also voicing less concern about the detrimental effects of unreliable information.

The problem is more pronounced for younger Americans, who have grown up with the platforms. Of those Americans who relied most on social media for their information about the election, half were under 30 years old.

This week's Election Integrity Partnership report detailed how claims about voting fraud went viral in conservative circles, and subsequent fact checks garnered only a fraction of the same traction.

Even though government officials did their best to prepare Americans for what to expect on election night and beyond, conspiracy theorists inspired by Trump and his allies successfully painted those efforts to preempt the problem as further evidence of a rigged system controlled by a "Deep State."

It's not just election-related disinformation on the rise; false narratives about the coronavirus pandemic have also exploded.

The misinformation tracking company NewsGuard has compiled a list of more than 400 websites that are spreading lies about the pandemic.

The company also found that many of those same websites are being funded in part, unintentionally through automated advertising, by some of the world's largest corporations and even the federal government's own Centers for Disease Control and Prevention.

"If advertising platforms were to provide easy tools for avoiding misinformation websites when placing ads, it would have a significant impact on the business model for such misinformation, vastly reducing the incentive for misinformation publishers to promote false claims," NewsGuard general manager Matt Skibinski wrote in the company's report on the issue.

Twitter suspended now-former President Donald Trump's account shortly after the Jan. 6 riot at the U.S. Capitol. While election-related disinformation online has diminished a bit since then, falsehoods about the election and COVID-19 vaccines remain popular. (Justin Sullivan / Getty Images)

Chasing a symptom

Throughout his presidency, Trump repeatedly pushed the limits of social media companies' policies when it came to sharing false information.

At first, the companies did nothing. Then they added fact check labels, although it remains unclear whether such labels help or hurt the spread of disinformation.

But what they didn't do was hamper Trump's ability to speak his mind, even as election officials warned that the sorts of falsehoods he was spreading would lead to violence.

"Someone's going to get hurt, someone's going to get shot, someone's going to get killed," Gabriel Sterling, a Georgia election official, said in December, less than a month before the a mob stormed the U.S. Capitol.

As Trump's rate of lies accelerated last year, so too did his Twitter follower count.

At the time of Trump's ban from the platform in January, his account had the sixth-largest number of followers. Over the course of 2020, Trump's account saw a 30% increase in followers, from 68.1 million to 88.7 million, according to a research team at the University of Colorado Boulder.

Removing him, and thousands of other accounts that spread misinformation, led to an immediate decrease in falsehoods spreading on social media, according to an analysis by the tracking firm Zignal Labs.

But that action won't magically fix the platforms, says Harvard's Donovan.

For instance, of the top 20 accounts that shared disinformation around the election using the hashtag #voterfraud, 13 accounts remain active on Twitter, according to a Cornell University data analysis.

Donovan says these sorts of accounts, like those of conservative media personalities Charlie Kirk and Jack Posobiec, both of whom have more than a million followers, can still make a fake narrative go viral almost immediately.

"Of the people who have spread the most noxious lies about the 2020 election, many of them retain their social media accounts on most platforms," said Donovan. "When you don't get the people who are authoring the fictions, the people who are behind the orchestration of that disinformation, then it will eventually come back in different forms."

Although individual members often raise concerns, Congress has thus far refused to push the social media companies to make wholesale reforms to the design of their platforms. Those systems have notoriously been found to drive political polarization and reward misinformation with engagement.

"Everywhere along the way that these social media platforms have innovated, there has been a lack of accountability and rule-making from politicians," Donovan said. "Mainly because that kind of chaos serves many politicians."

Timeline for a fix

For Whitney Phillips, a disinformation researcher at Syracuse University, there is some reason for optimism.

Even if it took one election with an unprecedented level of foreign interference, and another that concluded with violence at the U.S. Capitol, people are at least beginning to recognize that there's a problem.

"When I started doing this research in 2008, there was such an enormous amount of resistance that anything bad that happened on the Internet was even real," Phillips said. "And it wasn't until really 2017 that there was a critical mass of people who were like 'maybe hate speech on the Internet isn't good.' Maybe these things might correspond into real world action. ... Now there's really little denying the dangers of a dysfunctional information ecosystem."

Much of her new book, co-written with Ryan Milner, focuses on the role memes have played in normalizing hate speech and racism, by layering humor or irony on top of it.

When she's asked how much more moderation the major companies need to do to fix the current state of information in the U.S., she says the question misses the point.

A future healthy information environment probably doesn't involve Facebook or Twitter at all, at least in anything close to their current forms. It involves a completely redesigned Internet.

"My guess is that it will take us 50 years," she says.

That has meant she's shifted her focus away from platform moderation and toward K-12 education so that future generations might be better equipped to fix what they are left with: systems that let falsehoods spread like wildfire, without regard to truth.

"Our problem is that our networks are working exactly as they were designed to work. They work great. They're not broken at all," she said. "So in order to equip people to navigate these networks that are designed to set us up to be in hell, basically we've got to think about what are we teaching young people."

She's less sure about what happens between now and then. Users who have been recently radicalized, for instance, may find new ways to gather online if they are kicked off the major platforms.

"Something is going to grow from this," she said. "What exactly is hard to say. But I have a feeling that it's not going to be awesome."

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Miles Parks is a reporter on NPR's Washington Desk. He covers voting and elections, and also reports on breaking news.