US election 2020: how to avoid being manipulated by fake news

Russian bots, meme warfare, home-grown trolls and deep fakes. With all the fake news and misinformation out there you might feel there’s no way to trust what you’re hearing about the upcoming US election and its result. But that’s exactly what these hostile influences want you to think. The best chance they have at disrupting an election could be to undermine voter confidence.

When your goal is to create chaos and panic, the very fact of being caught interfering in an election can actually be helpful.

What they’re hoping and what they’re trying to do is to use the fact that they’re getting caught to make us all think they’re everywhere.

By elevating trolls to this level we inject this paranoia into the political system where everybody believes everybody else is a Russian troll.

And it makes it so easy for people to demonise whoever is not them. And I think that’s why we see this rise of radical groups and extremist content.

This isn’t just an issue for US voters. We should all be on the lookout for online manipulation. But as the showdown between Donald Trump and Joe Biden reaches its climax here are some of the most worrying trends to look out for. Back in 2016 fake social media accounts were a favoured tool for hostile actors looking to infiltrate online communities and stir up political tension.

They would slowly talk about the most divisive issues of the political landscape and help amplify the divisions that existed in American society back then.

From 2014 to 2017 Russian fake accounts reached over 100m Americans with disinformation and spam. Since then, social media platforms have done much more to detect lurking fakes. Facebook, for example, routinely takes down twice as many fake accounts as it did at the start of 2018. But now artificial intelligence is helping fakers avoid detection in a game of cat and mouse.

AI-generated photographs used as profile pictures for fake accounts are something that is very much on the rise. And this is an area where it’s much easier to generate and produce those pictures than it is to detect them at scale.

AI-generated faces are often used to make members of fake groups seem real. But there are some tell-tale signs, like distorted backgrounds or weirdly misshapen details. If you spot any of these features in a profile picture, chances are you’re onto a fake. But hostile actors aren’t stopping at fake accounts anymore. Increasingly, they’re looking to recruit real-life social media users, either knowingly or unwittingly, to spread their divisive messages online.

We recently took down a network being run by Russian actors that had created a fake media organisation and was luring legitimate journalists and freelancers into writing for them, so that they could use their authentic voices to amplify their message.

Peace Data was the smallest network ever linked back to a Russian operation. Just 13 accounts and two pages. But Facebook knows the impact authentic voices can have if the stories of hostile influences are picked up by news organisations and folded into the broader public debate.

In 2016, we saw hack-and-leak operations as part of the Russian playbook. And we should all be prepared and expect a leak of information about one of the candidates, about the election process, about the results, designed to distract, disturb and create fear in the public.

The most famous example of a hack-and-leak operation in 2016 came when Russian intelligence officers hacked the email accounts of Democrats linked to Hillary Clinton’s presidential campaign.

They have given that information to Wikileaks for the purpose of putting it on the internet.

Revelations from these emails dominated news coverage in the run-up to the election and arguably took attention away from an alleged sex scandal that was harming Donald Trump’s polling numbers. Now it looks like hackers from Russia, China and Iran are trying to steal more information in order to get it into the American press. Just a few weeks ago Microsoft published a report outlining three separate unsuccessful hacking campaigns from these countries. Each targeted political parties, advocacy groups and election insiders within the US, including people involved in the presidential campaigns of both Joe Biden and Donald Trump.

The ability of stolen information to change the way people discuss the election has not changed at all. Certainly a number of reporters who work on disinformation are much more aware of what the risks are, but it doesn’t seem like most of the high-end editors in the really big media companies have understood that they made huge mistakes in their coverage of the leaked and hacked documents in 2016.

Alex Stamos was chief information security officer at Facebook from 2015 to 2018. He’s now a professor at the Stanford Internet Observatory. For him, a well-timed hack-and-leak could be more damaging than any online trolling campaign.

If I was a foreign adversary I would not focus on having hundreds of thousands of accounts where I try to amplify messaging. What I’d try to do is inject that narrative, get it covered by the hyper-partisan press in the United States, and then let Americans do all the work for me.

Since 2016 the range of organisations willing to spread misinformation has broadened: online communities, campaign groups, even PR firms acting on behalf of clients. There are also more options in traditional media, thanks to an explosion of local news sites that fail to disclose their political affiliations.

There are networks of so-called local news sites that purport to cover local news in a way that benefits a congressional candidate or a presidential candidate or a state legislative candidate on one side or the other.

Research suggests that the number of these sites more than tripled across the United States in the first half of this year to more than 1,200. Other experts have found they operate in swing states, and offer candidates a way to skirt social media restrictions on political advertising.

In the case of Facebook you have to disclose. In the case of Twitter you can’t even do it. But if you’re a secretly financed local news site you can get onto Facebook and Twitter with your stories and you can promote them. And as election day approaches those kinds of sites are multiplying.

It’s too soon to tell whether these sites are deliberately misleading their readers, but researchers are concerned their very existence could accelerate political polarisation in the United States, and that could be just as damaging to the election.

I want to congratulate the Democrats on the rollout of their latest information warfare operation against the president.

In an environment where supporters of each political party distrust anything the other side says everything starts to sound like fake news.

We don’t really have the right language to talk about this stuff anymore, because everybody calls everything a conspiracy or fake news. It’s all conspiracies, you know. Russia-gate was a conspiracy, the Mueller investigation was a conspiracy, Russia’s attack on America was a conspiracy. In fact, all of these things are not conspiracies.

You want to tell me about what it is?

Yeah, sure. You tell me when. Q is about getting to the truth.

The QAnon conspiracy grabbed all the headlines this year. That’s the outlandish claim that a secret cabal of Democrat paedophiles is trafficking children and drinking their blood.

…now it’s children’s blood.

According to a recent online survey, 56 per cent of Republicans believe that at least some parts of the conspiracy are true. But since March conspiracy theories about voting by mail and voter fraud have been mentioned much more frequently on social media, including by the president himself. And while less outlandish than QAnon, these are the theories that could discourage voters.

In terms of voter suppression or demoralisation, some of it is, well the voting systems are all still hacked anyway so you can’t trust anything. Some of it is this, all the mail-in ballots that we’re sending out to people because of Covid, that’s all fraud, like it’s all going to be fraud, it’s all fake, which is just not true. And I still have total faith in the election system, but I think that focusing on these narratives just creates all this doubt and it does discourage a lot of people who are frustrated about the overall situation.

Convincing voters that there was no point even casting their ballots was one of the key tactics employed by Russian trolls ahead of the 2016 election. Now it seems that US citizens are carrying that message for them. At the start of October Facebook took down a home-grown network that brought all of the tactics I’ve talked about in this video into a single campaign.

The Republican-affiliated group Turning Point Action paid a marketing firm to post conspiracy theories that cast doubt over mail-in ballots. The group also used fake accounts with AI-generated personas to comment on legitimate news articles, either attacking Democrats or supporting Donald Trump. Just like in 2016, it’s going to be almost impossible to pin down the exact impact of these manipulation tactics. But if Americans are convincing each other not to vote in this election, then that will be seen as a win among hostile influences. Whether they succeed in their ultimate goal of undermining democracy could depend more on us than on them.

Making sure that we don’t increase the efficiency of information operations by exaggerating their impact or by suddenly becoming obsessed with them.

If they can get everybody in the country to believe that they have the power to throw an election then it does make people think maybe democracy is not worth it. That is the real goal of these nihilistic groups, and we should not do their work for them.

Don’t take for granted what you see, whether it’s on Facebook or Twitter or any social media platform or anywhere on the internet. Take a second click and look and see who is behind it and what they’re saying.

There’s a lot of content that might be totally true, but it’s there to piss you off and to make you post more content and piss other people off. But once you’re more aware of it, you are a little bit slower with the post finger, which is important. So that would be my number one thing.
