If you’ve retweeted or shared a provocative meme or unconfirmed information about COVID-19, you could be enabling public opinion hacking.
Public opinion hacking has hit a fever pitch and should intensify even more leading up to the November general election.
This week’s virtual Black Hat USA 2020 conference featured a keynote on how information operations are working overtime to manipulate public opinion. Renee DiResta, research manager at Stanford Internet Observatory, heads up research in this area.
Jeff Moss, Black Hat founder and director, said there’s not enough research on public opinion hacking to inform policymakers and “tell us what to do about it.”
“This one social media company thinks they found the solution,” he said. “Another one thinks they’ll label fake news. And another thinks they’ll ignore fake news and let the wisdom of the crowd tag the news and fix it for us. Everybody has a different approach. And that’s exciting because we can test a lot of hypotheses. But we don’t actually have enough academic, rigorous work being done.”
DiResta said her group studies the abuse of current information technologies with a focus on social media.
“Information operations increasingly involve the full spectrum of overt to covert propaganda, mass media as well as social media, agent influence activities, and at times network penetration,” she said.
Information Operation Tactics
Misinformation is in the news a lot lately, particularly related to COVID-19, DiResta said. It’s information that’s inadvertently wrong, and “people are sharing it because they want to inform their communities,” she said.
Disinformation is deliberately misleading, she said. So the person who’s sharing it has the intent to influence and deceive.
“They know the information is wrong,” DiResta said. “They know that it’s misleading or malign, or not coming from the source they’re claiming it comes from. But they’re sharing it anyway.”
Propaganda is information with an agenda, she said. The specifics of the agenda vary, but the intent is to persuade someone or distract them, or make them take an action or feel a certain way, she said.
And finally there are agents of influence, or people who work to influence an audience, DiResta said.
“And unbeknownst to that audience, they’re beholden to somebody else,” she said. “They’re operating in service to a powerful figure.”
Russia Is ‘Best in Class’
Russia, at the moment, is the “best in class” for information operations, DiResta said. The country has demonstrated not only full-spectrum propaganda, but far more sophisticated activities related to agents of influence, media manipulation and network infiltration, she said.
“Russia has been able to not only hack public opinion by working the social ecosystem, but hacking public opinion by hacking public officials and institutions, and using the information it obtains in information operations deployed on broadcast and social media,” she said.
Much of Russia’s effort focuses on getting unwitting participants to help spread its communications, DiResta said.
In terms of the general election, several tactics will accelerate in the next few months, she said. Those include hack-and-leak operations, possible voting machine hacking, the infiltration of groups and the amplification of narratives, she said.
“Even if not a single vote is changed, releasing the information claiming that you have successfully hacked a machine will cause havoc,” DiResta said.
Ultimately the goal will be to undermine confidence in the legitimacy of the election, she said.
“When we talk about information operations, it’s important to note that these personas and their materials resonate because of underlying, existing societal divides,” DiResta said. “You can’t hack a social system if …