Voting is over, but the disinformation machine is humming right along - Marketplace

Votes in the presidential election are still being counted. Now, the legal challenges have started, and the disinformation efforts that have defined 2020 are still ongoing. For example, there are no credible accounts of ballots suspiciously going missing or turning up or being destroyed. But influencers on various social media networks were spreading false narratives about just that on Election Day and beyond.

I spoke with Renée DiResta, the technical research manager at the Stanford Internet Observatory and part of the Election Integrity Partnership, which tracked misinformation and disinformation efforts in real time this week. For the most part, she told me the big platforms — Twitter, Facebook and YouTube — did all right, but there will always be work to be done. The following is an edited transcript of our conversation.

Renée DiResta, technical research manager at the Stanford Internet Observatory. (Photo courtesy of DiResta)

Renée DiResta: I’ve never thought of misinformation and disinformation as a problem that we’re going to be able to solve; it’s more of a chronic condition that we’re going to learn how to manage. Some of that is going to be people developing better media literacy and not believing what they see online quite so credulously as they perhaps did in the past. But there’s also the work that we did as a consortium of institutions: we built a system of tickets, where if an instance of misinformation related to voting appeared anywhere in the U.S. — in English, in Spanish, in Chinese — we had the ability to do a rapid-response assessment, to try to understand what was happening, whether it was going viral, whether it was hopping from platform to platform, and then how we should assess it and who was best positioned to respond to it.
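That ticketing workflow can be pictured, very loosely, as a small triage pipeline. The sketch below is purely illustrative and is not the Election Integrity Partnership's actual system; the ticket fields, the viral-share threshold, and the routing rules in assess() are all assumptions invented for this example.

```python
# Hypothetical sketch of a misinformation "ticket" triage flow, loosely modeled
# on the workflow described above. All field names, thresholds, and routing
# rules are invented for illustration only.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Ticket:
    claim: str                                          # the voting-related claim being tracked
    language: str                                       # e.g. "en", "es", "zh"
    platforms: set[str] = field(default_factory=set)    # platforms where it has appeared
    share_counts: dict[str, int] = field(default_factory=dict)
    opened_at: datetime = field(default_factory=datetime.utcnow)

def assess(ticket: Ticket, viral_threshold: int = 10_000) -> dict:
    """Rapid-response assessment: is it going viral, is it cross-platform,
    and who might be best positioned to respond?"""
    total_shares = sum(ticket.share_counts.values())
    cross_platform = len(ticket.platforms) > 1
    going_viral = total_shares >= viral_threshold
    if going_viral and cross_platform:
        responder = "platform trust & safety teams plus fact-checking partners"
    elif going_viral:
        responder = "the platform where it is spreading"
    else:
        responder = "local election officials and local journalists"
    return {
        "going_viral": going_viral,
        "cross_platform": cross_platform,
        "suggested_responder": responder,
    }

# Example: a claim spotted on two platforms in Spanish.
t = Ticket(claim="ballots destroyed in County X", language="es",
           platforms={"twitter", "facebook"},
           share_counts={"twitter": 8_200, "facebook": 4_500})
print(assess(t))
```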

Molly Wood: That seems like good news. Although, it also sounds like one of the key things that you saw in this election were repeat offenders, verified influencers spreading misinformation. How did platforms handle those accounts?

DiResta: That, I think, is an area that could stand to be improved. Those accounts would receive repeated labels, so rather than facing any action that was a deterrent to future behavior, they would just get a new label with the next offense. That’s something platforms are likely going to be evaluating: what happens when there are these chronic offenders. For the most part, I think social media platforms were extremely proactive, because nobody wanted to see viral misinformation undermine belief in the legitimacy of the election and trust in the process.
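As a thought experiment on what "actioning" beyond a per-offense label might look like, here is a minimal sketch of a strike-based escalation ladder for repeat offenders. The tiers, thresholds, and account structure are assumptions for illustration only, not any platform's actual policy.

```python
# Illustrative strike-escalation ladder for repeat misinformation offenders.
# Tiers and thresholds are hypothetical, not any platform's real policy.
from dataclasses import dataclass

ESCALATION = [
    (1, "apply contextual label"),
    (3, "reduce distribution / disable resharing"),
    (5, "temporary posting restriction"),
    (8, "account review for suspension"),
]

@dataclass
class Account:
    handle: str
    strikes: int = 0

def record_offense(account: Account) -> str:
    """Add a strike and return the strongest action whose threshold is met."""
    account.strikes += 1
    action = "no action"
    for threshold, step in ESCALATION:
        if account.strikes >= threshold:
            action = step
    return action

acct = Account("verified_influencer")
for _ in range(4):
    print(acct.strikes + 1, record_offense(acct))
# A label-only policy (the behavior described above) would map every offense
# to the first tier, regardless of the account's history.
```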

Wood: What about YouTube? People paid attention, of course, to the labels on Twitter and Facebook, while YouTube sort of skirts some of this conversation. In your post-mortem, you basically found that livestream channels on YouTube were a vector.

DiResta: Yes, livestreaming is very, very difficult to get attention to in the time in which it is happening, so that’s a significant challenge. Oftentimes, livestreamers have some sort of influence with their audience already. And so one of the things that we saw was gaming accounts that are not normally focused on politics, but have a large audience, using their stream to kind of repurpose old footage alleging that things that had happened in the past were happening right then. And you can tell, as you look at the comments, that people really do believe when this happens that they’re seeing something that is, in fact, live.

Fake livestreamers

So I think the challenge of how to address livestreams that are not really live is something that’s out there. It was something that we observed with the [George] Floyd protests also, where livestreamers, fake livestreamers, were posting to Facebook Live insinuating that protests that were happening in certain parts of the country were violent, when in reality they were showing violent footage from past events. So that is a really significant challenge. I think YouTube took down a couple of the streams so they couldn’t survive after the fact as content to be shared. But in the moment, that’s a significant area where misinformation can reach a large number of people who often trust the streamer.

Wood: Speaking of narratives, what do you expect to emerge over the next few days and even weeks, and what are the best ways to combat those narratives in advance?

DiResta: I think for as long as the election remains unresolved, we’re going to continue to see narratives move in line with whichever state is contested that day. The other thing that we’re really seeing already, and that we’ll be talking about quite a bit in the next few days, I think, is the reemergence of misinformation incidents and delegitimization themes that refer to earlier allegations. In some areas the groundwork was laid months in advance, insinuating that if a state didn’t go for a particular candidate, usually President Trump, that meant the result had been stolen, that there had been some sort of conspiracy afoot. And so those narratives began to reemerge. And again, you heard the president himself say in his speech that he felt the election was being stolen from him.

Wood: Do you have a few narratives that you know are likely to come up that people should — people being me, journalists, but also just everyone on Twitter and Facebook — what should we watch for and go, “We already know this is not true”?

DiResta: I think for journalists it’s really important to intercede immediately when something is not true. There were networks that carried President Trump’s speech and didn’t immediately and clearly articulate that the claims made in the speech were false. I think that’s a role journalists are going to have to play going forward, particularly local journalists in swing states, who are going to find themselves seeing narratives that had already been addressed come back from the dead. We’re seeing claims that have already been fact-checked pop back up again, stripped of the context of the fact check. The fact check already exists somewhere; a local journalist, an editor or one of the social media platforms’ fact-checking partners went through it. But the claims resurface, and as the world is paying attention, as residents of that state are paying attention, the people pushing them try to make them trend again and to delegitimize the direction a particular state is trending. So that is a really key area where journalists have a major role to play in saying, “Nope, this has already come up. This has already been addressed. Here is the truth,” and then moving back to focusing on what is demonstrably true and what the current state of events actually is.

Wood: Broadly speaking, people who are very online, let’s say, have an awareness that disinformation exists and is part of the landscape. But I wonder how widespread that awareness really is, like, do you have a sense of whether people know, understand and accept that disinformation is now just part of the game, and you have to be on the lookout for it?

DiResta: I think there’s been a lot of coverage, particularly from the media over the last four years, related to foreign interference, almost to the point now where the coverage is sort of disproportionate to the actual threat. Meaning, there are always going to be foreign state actors interfering in the domestic discourse, running bots and sock puppet accounts, and so on and so forth. I don’t see that stopping anytime soon.

Domestic social media influencers

But the really significant narratives don’t emerge from those relatively tiny operations; they emerge instead from domestic influencers. I think we need to reemphasize that kind of critical thinking: the realization that just because you may like someone or trust them doesn’t mean they get everything right 100% of the time. We really need to understand that content goes viral because people feel compelled to share it, because they have an emotional response to it, and oftentimes they haven’t actually read the article; maybe the article takes a much more nuanced stance than the headline did. I think there’s a lot for us to adapt to with regard to social media as one more channel in the broader information ecosystem, in terms of how we receive information and act as sharers ourselves at this point.

Related links: More insight from Molly Wood

The New York Times has a piece with more on how the platforms fared, some of the comments from Stanford’s post-election debrief, and other disinformation watchdogs who still feel the platforms could do more. It was a little disturbing to watch Facebook basically work out its labeling policies in real time: first it said it would label any posts that seemed to declare that counting was over or that declared victory in the election, then it said it would not label posts where a candidate declared victory in a specific state, which observers pointed out was not helpful in an election where it looked likely for a while that one specific state could decide the whole thing.

And then on Wednesday, Facebook put out guidance saying it would continue to label posts that declared victory or that counting had ended whether they related to a single state or the entire country.

But again, this is the platforms dealing with things fairly well. Anyway, stay alert, keep your eyes peeled, think before you tweet and go take a walk or a nap, friends. This is stressful.
