Weapon of Choice

Photo: Weaponized Narrative Initiative, Center on the Future of War, Arizona State University, 21 March 2017.

“Words are weapons, sharper than knives.” – INXS

Recently, there has been much in the news about Russian interference in the last presidential election.  Of particular interest to both the intelligence community and Congress is the way in which Russia purchased advertising and conducted hundreds of millions of interactions with Americans on social media platforms.  There is a temptation to dismiss these concerns as trivial, but to do so would be a grave mistake.  The age of weaponized narrative is upon us, and it presents what Justice Oliver Wendell Holmes Jr. would call a “clear and present danger” to our national security interests.  As English degree holders, we have the skills not only to recognize and critically assess these narratives but also to intervene in their proliferation.

The weaponization of narrative can be most clearly demonstrated by examining what has come to be known as the Russian “Gerasimov doctrine.”[1]  While traditional military doctrine generally frames warfare as a linear escalation of armed conflict, the Gerasimov doctrine emphasizes achieving strategic objectives through non-conventional means employed at varying degrees of intensity.

Gerasimov states, “In the 21st century we have seen a tendency toward blurring the lines between the states of war and peace. Wars are no longer declared and, having begun, proceed according to an unfamiliar template.”  More importantly, he goes on to state that “the role of nonmilitary means of achieving political and strategic goals has grown, and, in many cases, they have exceeded the power of force of weapons in their effectiveness… All this is supplemented by military means of a concealed character, including carrying out actions of informational conflict” (emphasis mine).

Seen through the lens of the Gerasimov doctrine, weaponized narrative is deployed through traditional media as well as pervasive social media platforms to disrupt a nation’s social fabric: it proliferates misinformation, undermines trust, and amplifies rifts along lines of race, gender, class, creed, and political affiliation.  Russia demonstrated its proficiency with weaponized narrative during its 2014 annexation of Crimea, leveraging media outlets to achieve exactly these outcomes, and this was most certainly the intent of its efforts throughout the 2016 presidential election – efforts that continue to this very day.[2]

There are certainly those who doubt that such a relatively small number of purchased social media ads and deceptive interactions could have exerted such a disproportionate influence during the past election, and admittedly, it is nearly impossible to quantify what that influence might have been.  But during recent congressional hearings, attorneys for Facebook revealed that the Russian Internet Research Agency purchased content that reached 126 million people – the equivalent of half the Americans eligible to vote.[3]  Additionally, consider the following:

• In the September 2012 issue of Nature, researcher James Fowler and his colleagues reported on an experiment, conducted with Facebook’s permission, that displayed different targeted messages encouraging 61 million users to vote during the 2010 congressional election.  Those who saw only an informational banner encouraging them to vote were no more likely to vote than those who saw no message at all.  But those who saw an informational banner that also displayed the profile pictures of up to six randomly selected Facebook friends who had clicked the “I voted” button were markedly more likely to vote – an effect the researchers estimated directly increased turnout by 60,000 voters and indirectly influenced another 280,000 people to vote.

• In 2010, the advertising firm Chong and Koster used targeted Facebook ads, banners, and pre-roll spots in a successful campaign to defeat a statewide ballot initiative in Florida.  After the initiative, Chong and Koster hired polling firms to analyze the effectiveness of their ads.  What they found was that “the people who self identify as Democrats who should have voted our way were 10 percent less likely to vote for it than people who were heavy web users on Facebook.” Further, according to Koster, “people who were heavy web users on Facebook, including Republicans and independents, outperformed Democrats on a Democratic issue because of the online ads.”[4]

As for overall effectiveness, this campaign was reportedly the first election won using solely online ads to communicate with voters (earning the firm a 2011 “Best Use of New Technology” Gold at the Pollie Awards), and polling found a 19% difference in voter behavior in areas where the ads ran.  “Where the Facebook ads appeared, we did almost 20 percentage points better than where they didn’t,” said Koster. “Within that area, the people who saw the ads were 17 percent more likely to vote our way than the people who didn’t. Within that group, the people who voted the way we wanted them to, when asked why, often cited the messages they learned from the Facebook ads.”[5]

• As reporter Craig Silverman has noted, “In the final three months of the U.S. presidential campaign, the top-performing fake election-news stories on Facebook generated more engagement than the top stories from major news outlets such as The New York Times, The Washington Post, The Huffington Post, NBC News, and others.”[6]

Taken together, these three cases establish that social media has the potential both to rally voter turnout and to influence voter behavior, and that the deployment of deliberate misinformation on social media can also influence political processes by sowing distrust and creating a barrage of competing false narratives.  More importantly, they also demonstrate that the weaponized narrative deployed by Russia during the 2016 presidential election achieved the exact outcomes prescribed by the Gerasimov doctrine.  Certainly, the proliferation of carefully targeted disinformation was able to overwhelm legitimate news sources, creating confusion and undermining trust in institutions – hence the incorporation of phrases like “fake news” and “alternative facts” into our daily lexicon.  And as for leveraging social media to exploit racial, gender, class, creed, and political divisions, I think each and every one of us can personally attest to the divisiveness that characterized exchanges on social media platforms during the election.

As English degree holders who have been trained to critically analyze texts and carefully evaluate source material, we are uniquely positioned to combat these weaponized narratives.  We have the tools and the training to apply the same intellectual rigor and discipline to our online behaviors that we would apply to literary texts and formal publications.  By that, I mean we have control over what we like, share, and interact with; we can refuse to contribute to the spread of these narratives, and we have a responsibility to discredit them.  As lovers of literature and language, we all acknowledge that nothing has the potential to be more powerful than the written word, and in these times, it is incumbent upon us to safeguard its integrity.  The future of our nation may depend upon it.

Further reading:

• On the Gerasimov Doctrine: https://www.politico.com/magazine/story/2017/09/05/gerasimov-doctrine-russia-foreign-policy-215538

• On weaponized narrative: http://www.defenseone.com/ideas/2017/01/weaponized-narrative-new-battlespace/134284/

• And: https://www.theatlantic.com/magazine/archive/2016/11/war-goes-viral/501125/

– Dan Smith (B.A. ’11),  30 November 2017

Dan Smith (B.A. ’11) is a U.S. Army Space Operations Officer with 14 years of service, currently conducting space systems engineering research at the Johns Hopkins University Applied Physics Laboratory.  He received his B.A. in English from Kansas State University in 2011 and also holds an M.S. in Space Studies from American Military University.  The views and opinions presented here are entirely his own and do not reflect those of the Department of Defense, the United States Army, or the Johns Hopkins Applied Physics Laboratory.

[1] Articulated by General Valery Gerasimov in a February 27, 2013 article in the Russian weekly trade paper Military-Industrial Kurier.  It is important to note, however, that Gerasimov was not the first to articulate these concepts, nor is this technically Russian military doctrine – more precisely, it is the way the Russian military has broadly come to think about warfare.

[2] Visit http://dashboard.securingdemocracy.org/ to see an online dashboard of near-real-time propaganda efforts linked to Russian influence campaigns on Twitter.  This dashboard is a tool provided by the Hamilton 68 team and the Alliance for Securing Democracy, a bipartisan, transatlantic initiative housed at The German Marshall Fund of the United States.

[3] https://www.theguardian.com/technology/2017/oct/30/facebook-russia-fake-accounts-126-million

[4] https://www.dailydot.com/layer8/facebook-advertising-election-study/

[5] https://www.facebook.com/notes/government-and-politics-on-facebook/case-study-reaching-voters-with-facebook-ads-vote-no-on-8/10150257619200882/

[6] https://www.theatlantic.com/technology/archive/2017/10/what-facebook-did/542502/
