By Mark Thompson
July 9, 2014

The first warriors fought on the ground. Then, someone hollowed out a log and naval warfare began. Aircraft came next, followed by space—and now, cyberspace. So it should come as no surprise that the exploding corner of cyberspace—social media—is the next battleground.

The fog of war now includes rolling clouds of Tweets, Facebook posts and Instagram photos that the Pentagon wants to filter, track and exploit. Enveloping the globe, from friends and foes alike, the torrent of data can serve as an early-warning system of trouble brewing—or a leading indicator of imminent action by a potential troublemaker.

That’s why the Defense Advanced Research Projects Agency has spent three years and $35 million plumbing pixels as part of its Social Media in Strategic Communication (SMISC) program. Makes sense that DARPA’s in charge: the agency basically invented the Internet. “Events of strategic as well as tactical importance to our Armed Forces are increasingly taking place in social media space,” DARPA says. “We must, therefore, be aware of these events as they are happening and be in a position to defend ourselves within that space against adverse outcomes.”

Britain’s Guardian newspaper suggested Tuesday that the program might be connected to papers leaked by NSA whistleblower Edward Snowden showing “that US and British intelligence agencies have been deeply engaged in planning ways to covertly use social media for purposes of propaganda and deception.”

But Peter W. Singer, a strategist with the independent New America Foundation, sees it more as Defense Department due diligence. It appears to be “a fairly transparent effort, all done in the open, following academic research standards, aiming to understand critical changes in the social, and therefore, emerging battlefield, environment,” Singer says of DARPA’s efforts. “I am not deeply troubled by this—indeed, I would be troubled if we weren’t doing this kind of research to better understand the changing world around us.”

DARPA says researchers have to take steps to ensure that “no personally identifiable information for U.S. participants was collected, stored or created in contravention to federal privacy laws, regulations or DoD policies.” It issued a statement Wednesday declaring it was not involved in the recent Cornell University study of Facebook users, and that the work it has funded “has focused on public Twitter streams visible and accessible to everybody.”

The program’s aims, according to DARPA:

  • Detect, classify, measure and track the (a) formation, development and spread of ideas and concepts (memes), and (b) purposeful or deceptive messaging and misinformation.
  • Recognize persuasion campaign structures and influence operations across social media sites and communities.
  • Identify participants and intent, and measure effects of persuasion campaigns.
  • Counter messaging of detected adversary influence operations.

The goal is to win without firing a shot. The agency cited, without elaboration, an incident that it said occurred solely on social media as an example of what it wants to do.

DARPA’s lengthy research roster (at least those publicly available; there’s no link to IBM’s Early Warning Signals of System Change from Expert Communication Networks, for example) doesn’t detail anything about waging war. It’s all about tapping into those who use social media, how to figure out who their leaders are, and perhaps sway their thinking. Academics and computer scientists, working for major universities and outfits like SentiMetrix (which says its “sentiment engine has been proven to work in predicting election outcomes, conflicts, and stock price fluctuations”) have written more than 100 papers on a wide range of topics:

  • Cues to Deception in Social Media Communications
  • The Language that Gets People to Give: Phrases that Predict Success on Kickstarter
  • Understanding Individuals’ Personal Values from Social Media Word Use
  • The Digital Evolution of Occupy Wall Street
  • A Computational Approach to Politeness with Application to Social Factors

If it seems difficult to discern a pattern here, that’s because the agency engages in basic research. It only builds the tools that others will use to build the next war (or disinformation) machine. There’s no telling which of these reports—if any—contains a glimmer of military utility. The only way to find out is to continue such research until it yields a breakthrough, or until the Pentagon goes broke.


Write to Mark Thompson at mark_thompson@timemagazine.com.
