How Russia ‘Pushed Our Buttons’ With Fake Online Ads

Psychologists and students of advertising say Russia effectively used fake ads to exploit divisions in the US.


Many Americans this week got their first looks at fake Facebook ads placed by Russian propagandists during the 2016 election campaign to sow discord in the US. The ads, made public during congressional hearings with social-media executives, targeted Americans on both sides of divisive issues such as Islam, gun rights, and the Black Lives Matter movement.

One ad, nominally from an account called Heart of Texas, showed silhouettes of cowboys behind a map of America with a rainbow flag and a poster about Islam taking over the world. “Get Ready to Secede!” the ad screams at the bottom. Another ad shows the somber image of policemen in uniform carrying a casket at a funeral with the words, “Another Gruesome Attack on Police By a BLM Movement Activist.”

Psychologists and students of advertising say the ads were cleverly designed to look like other internet memes, and to appeal to readers’ emotions. Jay Van Bavel, an associate professor of psychology at NYU, says he was surprised at the sophistication of the campaign. “It wasn’t transparent lies. It was just pushing our buttons,” says Van Bavel. “To me, this is more pernicious. It’s not a matter of fiction that we can root out with fact-checking. It’s more about turning Americans against each other.”

The ads took issues that voters care about and then “fed them to us as aggressively as possible,” he says.

Facebook estimates that 10 million people saw the paid ads and up to 150 million people saw other content from the fake accounts, which Facebook has traced to the Internet Research Agency, a Kremlin-backed troll farm. The ads were placed by fake accounts with names like United Muslims of America, Blacktivist, and LGBT United that could have passed for real Facebook groups.

“The IRA are not amateurs; they're clearly familiarizing themselves with the kind of content that resonates with the target audiences,” says Renee DiResta, a researcher with Data for Democracy, a nonprofit group that has been digging into the data on Russian-linked accounts.

The ads did not look like the products of Madison Avenue. Rather, they camouflaged themselves in the vernacular of the internet. Jennifer Grygiel, a communications professor at Syracuse University who teaches about memes, thinks the low-budget look is an engagement strategy. They want to make it appear as though the ads “could have been created by your average American. They don’t want glossy high production.” Grygiel says that ads from the LGBT United group reminded her of events she’s been involved in. One ad is plastered with rainbows and tells Facebook users, “I’m just really excited to go out and protest the Westboro Church!”

Grygiel also noticed the use of iconography like cowboys, American flags, and women in burqas in that Heart of Texas ad. “It was almost distilled to the point of it being pop art,” she says. “Essentially what they’re doing with some of these memes is like a culture mash. It’s almost like re-mixing American culture and in this case some American fears.”

The text of some ads included spelling mistakes and non-idiomatic English, but DiResta, of Data for Democracy, says relying heavily on images minimizes “the possibility of giveaway errors” that would become apparent in a longer post.

Van Bavel, the NYU professor, has studied a phenomenon he calls “moral contagion,” referring to the use of moral emotional language to help content go viral on social networks. He says tugging at those emotions tends to drive people deeper into ideological echo chambers, dynamics he saw at play in the Russian ads. “What you’re more likely to click on is stuff that triggers this part of the brain that is so primal,” he says. “The Russians know as much. They know how to pull us apart and agitate us.”

There’s nothing new about campaigns to manipulate voters, but Van Bavel says such manipulation can be more polarizing in the internet age because access to media is more fragmented and curated.

Malcolm Harris, author of a book about millennials called Kids These Days, says some of the ads had the same “campy and jokey” but also weirdly extremist aesthetic found in corners of “the conspiracy web.” Harris says internet aesthetics are transnational, which could make it harder to identify their origin than, say, a movie’s. “There’s nothing that screams out not American,” he says.

He says they look more like the work of American conservatives than liberals. “The stuff on the left just tends to look like lower quality mainstream stuff,” says Harris, “whereas the right really has their own thing with memes and cartoons.”

Bruce McClintock, an adjunct policy analyst at the RAND Corporation and a retired brigadier general who served as the senior defense official at the US Embassy in Moscow, says the ads resonate with Russian and Soviet tactics of other eras.

“It’s about spreading disinformation, propaganda, counterfeit official documents to increase confusion,” he says. McClintock says the goal of the campaign likely was broader than just the election and includes the long-term objective of weakening the US and undermining America’s reputation in the eyes of the world.

He notes that Russian operatives have been accused of inflaming racial tensions in the US before, including unconfirmed reports that the KGB sent fake letters from the Ku Klux Klan and spread conspiracy theories that the US government was behind the assassination of Martin Luther King Jr. More recently, a KGB campaign spread the claim that US scientists had developed HIV as a biological weapon. This technique approaches disinformation like “a conspiracy theory incubator,” he says.