
Pentagon Funds New Data-Mining Tools to Track and Kill Activists

Drones are now being used on US soil by the FBI, Secret Service, Texas Rangers and some local police forces.

Part I

The US Department of Defense (DoD) is funding a multimillion dollar university research program to develop new data mining and analysis tools for the US military intelligence community to track political radicalism among British Muslims and other activist groups around the world.

Leading intelligence experts, including former National Security Agency (NSA) official Thomas Drake – the whistleblower who inspired Edward Snowden – confirm that the tools are designed to enhance the intelligence community’s capabilities to identify potential terrorism suspects, who could then face a range of sanctions, from surveillance to no-fly restrictions to, at worst, being targeted for extrajudicial assassination via the CIA’s “kill lists.”

But, they say, inherent flaws in the program are instead likely to facilitate the criminalization of political dissent and the targeting of innocent civilians – and that such trends are increasingly likely to affect not just “hostile theatres” abroad, but even domestic populations in the US, Britain and Europe.

One flagship project established at Arizona State University (ASU) since 2009 examines “radical” and “counter-radical” movements in Southeast Asia, West Africa and Western Europe. This month, I obtained exclusive access to some of the online research tools being used by the Pentagon-funded project, disclosing a list of 36 mostly Muslim organizations in the UK targeted for assessment as to their relationship to radicalism.

The project’s most significant outputs have involved the creation of sophisticated data-mining tools capable of analyzing thousands of online materials, whether in the form of webpages, tweets or discussion forums. The ASU team, led by anthropologist Prof. Mark Woodward, has designed a range of algorithms and advanced modelling techniques to automatically categorize and rank political or religious organizations or networks, and individuals associated with them, on a “radicalism scale” to measure the degree to which they “threaten” US interests. The project is also capable of identifying and locating individuals and ranking their propensity for terrorism.
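The project’s code has not been published. As a rough illustration only, a “radicalism scale” built from an organization’s web output could, in the simplest possible form, look like the sketch below, which scores a corpus against hand-weighted terms. Every term, weight, function name and organization here is invented; the actual ASU project used statistical models trained on expert gradings (see Part II), not a fixed lexicon.

```python
# Hypothetical sketch: scoring a web corpus on a crude "radicalism scale".
# Nothing here comes from the ASU project; terms, weights and data are invented.
import re
from collections import Counter

SCALE_TERMS = {
    # term -> weight pushing an organization up (+) or down (-) the scale
    "overthrow": +2.0, "jihad": +1.5, "resistance": +1.0,
    "dialogue": -1.0, "tolerance": -1.5, "interfaith": -2.0,
}

def score_document(text: str) -> float:
    """Return a signed score: positive = 'radical', negative = 'counter-radical'."""
    tokens = Counter(re.findall(r"[a-z']+", text.lower()))
    return sum(weight * tokens[term] for term, weight in SCALE_TERMS.items())

def rank_organizations(corpora: dict[str, list[str]]) -> list[tuple[str, float]]:
    """Average document scores per organization and sort most 'radical' first."""
    averages = {
        org: sum(map(score_document, docs)) / max(len(docs), 1)
        for org, docs in corpora.items()
    }
    return sorted(averages.items(), key=lambda item: item[1], reverse=True)

if __name__ == "__main__":
    demo = {
        "Org A": ["We call for interfaith dialogue and tolerance."],
        "Org B": ["Only resistance can overthrow the existing order."],
    }
    for org, score in rank_organizations(demo):
        print(f"{org}: {score:+.2f}")
```

Even this toy version shows the basic design choice at issue: whatever vocabulary or model defines the scale also defines who looks “radical.”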

Unbeknown to many, Arizona State is officially an NSA-designated university. Its Information Assurance Center (IAC), based in the School of Computing, Informatics and Decision Systems Engineering – where the programming of data-mining tools for the Pentagon has taken place – is certified by the NSA and the US Department of Homeland Security as a National Center of Excellence in education and research.

Arizona State’s IAC also collaborates with the school’s Center for Emergency Management, which provides training for “homeland security professionals” in “government/industry” on “disaster mitigation, preparedness, response, recovery and management.”

A Network of Data Gathering

A parallel project, recently funded by the DoD’s Minerva Research Initiative and led by Prof. Stephen Kosack of the University of Washington, “seeks to uncover the conditions under which political movements aimed at large-scale political and economic change originate, and what their characteristics and consequences are.”

“The main focus of the research is to try to understand when mass movements of average citizens arise and push for social change,” Kosack said last week. He said the team has gathered “historical data on populations undergoing mass social change” over the last three years “to create a fingerprint of the population.”

This will be extended to “larger and more diverse movements” of more than 1,000 participants, “including demonstrations from much of the Middle East, Taiwan, Ghana, and Brazil,” analyzed via 216 social variables.
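Kosack does not spell out what a population “fingerprint” consists of. One plausible reading is a fixed-length vector of the social variables, compared against historical episodes; the sketch below is purely illustrative, and the variable names, data and nearest-neighbor comparison are assumptions rather than details from the project.

```python
# Illustrative only: representing a movement as a vector of social variables
# and finding the nearest historical "fingerprint". Variables are invented,
# and a real analysis would normalize each variable before comparing.
import math

VARIABLES = ["median_age", "urban_share", "union_density", "protest_frequency"]

def fingerprint(observations: dict[str, float]) -> list[float]:
    """Order the observed variables into a fixed-length vector."""
    return [observations.get(name, 0.0) for name in VARIABLES]

def nearest_historical_case(current: dict[str, float],
                            history: dict[str, dict[str, float]]) -> str:
    """Return the historical episode whose fingerprint is closest to the current one."""
    cur = fingerprint(current)
    return min(history, key=lambda name: math.dist(cur, fingerprint(history[name])))

history = {
    "Episode 1": {"median_age": 24, "urban_share": 0.4, "union_density": 0.1, "protest_frequency": 30},
    "Episode 2": {"median_age": 38, "urban_share": 0.8, "union_density": 0.3, "protest_frequency": 5},
}
print(nearest_historical_case(
    {"median_age": 26, "urban_share": 0.45, "union_density": 0.12, "protest_frequency": 25},
    history))  # -> "Episode 1"
```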

Kosack acknowledged the direct implications for US defense policy. “Think of the run-up to the Iraq War in 2003,” he said. “There was a big debate about how likely Iraq was to produce violent insurgencies. This would be able to give you some more precise historical estimates of the likelihood of the environment of Iraq leading to a violent insurgency.”

Kosack’s research is being overseen by Dr. Lisa Traynor of the Social and Behavioral Sciences Program of the US Army Research Office. Traynor is a contract scientist at Bowhead Systems Management, a leading US defense firm providing services to the Pentagon, Army, Air Force, Navy, Department of Homeland Security and Department of the Interior, among other federal agencies.

Michael Bradshaw, president of Bowhead’s Systems and Technology Group, previously served as a Pentagon contractor specializing in military war-gaming, simulation, training and software development. Bowhead is also one of the leading firms contracted to develop the US military’s rapidly emerging Unmanned Aerial Vehicle (UAV) and Unmanned Aircraft Systems (UAS) technologies.

Such technologies are touted as transforming the 21st-century battlefield. A February report by Washington’s Center for Strategic and International Studies noted that unmanned systems – otherwise known as drones – played a decisive role in US “war on terror” operations in “Iraq and Afghanistan, a global campaign to disrupt Al Qaeda and its affiliates from Somalia to Pakistan, a contingency operation in Libya, and support to French efforts in Mali.”

The report adds: “Unmanned systems will also have new domestic prominence and importance for the United States as they are increasingly adopted for homeland and law enforcement missions.”

When Data Mining Leads to Drone Strikes

The link between US drone warfare at home and abroad, and tracking political “radicals,” is not trivial. Unmanned systems have increasingly been used to unilaterally target and assassinate alleged terrorism suspects and their associates, including US citizens such as Anwar Awlaki and his son Abdulrahman, based on Pentagon “kill lists” created by US intelligence.

Earlier this year, The Intercept reported on the pivotal role of the NSA’s “complex analysis of electronic surveillance, rather than human intelligence, as the primary method to locate targets for lethal drone strikes,” often resulting in “deaths of innocent or unidentified people.”

NSA whistleblower Thomas Drake, a former senior executive of the agency, confirmed to me that the algorithms being developed by ASU’s data-mining projects were similar to algorithms used by US intelligence to identify targets for the CIA’s “signature” drone strikes against unidentified groups of terrorism suspects. In the unforgettable words of General Michael Hayden, former NSA and CIA chief: “We kill people based on metadata.”

Tracking British Muslims and Anti-War Activists

The University of Washington and Arizona State University projects are both about enhancing the Pentagon’s capacity to analyze surveillance data to determine threats from groups and individuals, supplementing analysis of metadata with “open source” material. Unlike Kosack’s project, though, the ASU project claims to focus exclusively on Muslim organizations in Asia, Africa and Europe.

Its starting assumption is that the “driving motor” of extremist movements is religion, and therefore that researchers must move “beyond the assumption that counter radical discourse necessarily focuses on politics.”

In contrast, according to Emeritus Prof. Riaz Hassan of Flinders University, whose Australian government-funded research examined the motivations behind terrorism, analysis of data from the most comprehensive terrorism database in the world shows that religious extremism is largely motivated by political grievance.

“It is politics more than religious fanaticism that has led terrorists to blow themselves up,” Hassan said. “The causes of suicide bombings lie not in individual psychopathology but in broader social conditions. Understanding and knowledge of these conditions is vital for developing appropriate public policies and responses to protect the public.”

In the ASU project’s latest paper, “Multiscale Modeling of Islamic Organizations in the UK,” presented at an academic conference in Washington, D.C., last year, the study’s authors describe their development of “a ranking system that utilizes ranked perspectives to map 26 U.K. Islamic organizations on a set of socio-cultural, political and behavioral scales based on their web corpus.” In reality, the list of organizations studied included 10 more, some of them non-Muslim.

Data for the Pentagon-funded research project consisted of a collection of nearly 10,000 documents downloaded from the websites of 36 U.K. organizations that were then ranked on various scales by three independent experts.

The project gathered some of its data via an “expert wisdom gathering tool,” a bespoke website for independent experts to grade and scale these organizations. According to the online tool, the organizations rated for their threat-level included largely peaceful civil society organizations such as British Muslims for Secular Democracy, Islamic Relief, Islamic Society of Britain and the Quilliam Foundation, as well as activist pro-Palestinian organizations which have been critical of U.K. government policy, such as the Muslim Public Affairs Committee UK, Cage Prisoners and Interpal.
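The grading step performed through such a tool can be pictured as straightforward aggregation of each expert’s score for each organization on each scale. The sketch below is hypothetical: the organization names, scales and scores are invented, and the project’s actual elicitation and aggregation method is not documented here.

```python
# Hypothetical sketch of aggregating expert grades from a rating tool.
# Organizations, scales and scores below are invented, not the project's data.
from statistics import mean, stdev

# ratings[expert][organization][scale] -> score on, say, a 1-9 scale
ratings = {
    "expert_1": {"Org A": {"violence_ideology": 2, "change_orientation": 6}},
    "expert_2": {"Org A": {"violence_ideology": 3, "change_orientation": 7}},
    "expert_3": {"Org A": {"violence_ideology": 2, "change_orientation": 5}},
}

def aggregate(org: str, scale: str) -> tuple[float, float]:
    """Mean expert score and disagreement (standard deviation) for one organization."""
    scores = [r[org][scale] for r in ratings.values() if org in r]
    return mean(scores), stdev(scores) if len(scores) > 1 else 0.0

print(aggregate("Org A", "violence_ideology"))  # -> (2.33..., 0.57...)
```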

Despite the claim of focusing on Muslim discourses of extremism, the list includes non-Muslim organizations opposed to US and British foreign policies – namely the 32 County Sovereignty Committee (32CSM), the Irish republican separatist group often described as the political wing of the Real IRA; and the Socialist Workers Party, which since 2003 has run the U.K.’s leading anti-war network, the Stop the War Coalition (STWC).

STWC’s supporting organizations include the Palestine Solidarity Campaign, the National Union of Journalists, the Campaign for Nuclear Disarmament, Friends of Al-Aqsa, the Green Party, Military Families Against the War, Media Workers Against the War, the Kurdish Federation, and Britain’s major trade unions, Unison and Unite.

Also listed for ranking are Islam4UK, the group formerly known as Al-Muhajiroun, and the far-right English Defence League. A further group listed is “Christian Choice.” No group by that specific name appears to exist, but the designation could be a reference to the Catholic movement campaigning for women’s rights on abortion, led in particular by the Washington-based Catholics for Choice, which is heavily critical of the Vatican.

The full list of British organizations scaled for their propensity for or against terrorism by this Pentagon-funded data-mining project is as follows:

• iEngage
• MPACUK
• 32 County Sovereignty Committee
• Al-Khoei Foundation
• Bradford Council for Mosques
• British Muslim Forum
• British Muslim Initiative
• British Muslims for Secular Democracy
• Cage Prisoners
• Christian Choice
• European Muslim Research Centre
• Federation of Student Islamic Societies
• Institute for Middle Eastern and Islamic Studies
• Interpal
• Islam 4 UK
• Islam Expo
• Islamic Centre of England
• Islamic Forum Europe
• Islamic Foundation
• Islamic Human Rights Commission
• Islamic Relief
• Islamic Society of Britain
• Middle East Monitor
• Mosques and Imams National Advisory Board
• Muslim Aid
• Muslim Association of Britain
• Muslim Council of Britain
• Muslim Parliament of Great Britain
• Muslims4UK
• National Association of Muslim Police
• Quilliam Foundation
• Radical Middle Way
• Socialist Workers Party
• Sufi Muslim Council
• The Cordoba Foundation
• The English Defence League

The “expert wisdom gathering tool” also showed intent to scale or rank organizations in Germany, France, Europe generally, Nigeria, Niger, Senegal, Indonesia, Malaysia, Singapore and the Philippines, although only the UK and Indonesia sections contained active lists of groups.

Part II

The Pentagon’s multimillion dollar Minerva research program to fund social science research for military applications includes a flagship project established in 2009 at Arizona State University (ASU) to examine “radical” and “counter-radical” Muslim movements in Southeast Asia, West Africa and Western Europe.

The project’s “expert wisdom gathering tool,” used by academics involved in the project to assess and rank the threat-level from organizations and civil society groups, set its sights on the UK, Germany, France, Europe generally, Nigeria, Niger, Senegal, Indonesia, Malaysia, Singapore and the Philippines.

Although purportedly designed to assess Islamic movements, among the 36 UK organizations targeted for ranking on the tool’s “radicalization” scale are several non-Muslim activist groups critical of US, British and Israeli foreign policy. A deeper analysis of the criteria used by the project to label organizations discloses serious deficiencies that tend to cast suspicion of propensity for violence on any group calling for radical social, political or religious change.

Conflating Violent and Nonviolent “Radicalism”

Explaining the rationale behind the Minerva initiative, program director Dr. Erin Fitzgerald said, “Decreasing terrorism and political violence requires an understanding of the underlying forces that shape motivations and mobilize action. The vast majority of political movements – even those with seemingly ‘radical’ political philosophies – do not turn violent or destabilize regional security; we want to understand what makes those leading to armed conflict different.”

This is why, she emphasized, the Pentagon is intent on studying both violent and nonviolent groups. “It is not enough to study only violent groups: just as diagnosing a disease requires a basic understanding of anatomy and physiology, understanding violent insurgencies requires a fundamental understanding of how mass movements develop broadly.”

Yet the research design of the ASU’s project examining radicalization discourses is deeply flawed, due to imprecise and incoherent definitions of political radicalism, violence and nonviolence. By having academic specialists grade the levels of radicalism of these organizations, the project attempted to fine-tune its algorithms to develop software that would automatically classify the organizations on a threat-scale with the same precision as those specialists.

“The QUIC-based algorithm not only outperforms the baseline method, but it is also the only system that consistently performs at area expert-level accuracies for all scales,” the 2013 paper found.
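The QUIC-based method itself is not reproduced here. The general workflow the paper describes – learn from expert-graded corpora, then place unseen organizations on the same scales automatically – can be sketched roughly as follows; the bag-of-words features and nearest-neighbor rule below are stand-ins chosen for illustration, not the project’s actual technique.

```python
# Rough sketch of the general workflow described (not the QUIC algorithm itself):
# fit to expert-graded corpora, then place an unseen organization on the scale.
import re
from collections import Counter

def bag_of_words(texts: list[str]) -> Counter:
    counts = Counter()
    for text in texts:
        counts.update(re.findall(r"[a-z']+", text.lower()))
    return counts

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = (sum(v * v for v in a.values()) ** 0.5) * (sum(v * v for v in b.values()) ** 0.5)
    return dot / norm if norm else 0.0

def predict_scale(unseen_docs: list[str],
                  labeled: dict[float, list[str]]) -> float:
    """`labeled` maps an expert-assigned scale value to that training org's documents.
    The unseen organization inherits the scale value of its most similar neighbour."""
    target = bag_of_words(unseen_docs)
    return max(labeled, key=lambda value: cosine(target, bag_of_words(labeled[value])))
```

The claim of “expert-level accuracy” means only that the model reproduces the experts’ own gradings; it inherits, rather than corrects, whatever assumptions those gradings contain.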

Prof. Sajjad Rizvi of the University of Exeter’s Institute for Arab and Islamic Studies was one of the independent academics asked to rank the UK organizations. He said that he was not informed of the project’s relationship to the US government or military, and thought the research design was poor.

“A colleague asked me to do it,” said Rizvi, “but my understanding was that it was to develop a tool for social science research. I had no idea about government links. I thought the questions and concepts did not make sense and I wasn’t at all convinced about the overall method.”

The project’s scaling system is extraordinarily imprecise – especially on the most critical definitions of political radicalism and violence. The paper begins by acknowledging difficulties in distinguishing “violent forms of political Islam and others deemed to be potentially violent,” and argues that presumed distinctions between “radical/moderate,” “modern/traditional,” and “conservative/progressive” are in fact very blurred.

According to the paper, the research aims “to understand features shared by violent religious movements and by those opposing them,” arguing that “binary labeling does not capture the overlaps, movement and interactivity among” these “radical” and “moderate” actors. Political radicalism is simplistically defined as “the ideological conviction that it is acceptable and sometimes obligatory to use violence to effect profound political, cultural and religious transformations and to change the existing social order fundamentally.”

Groups engaged in activism for social change are susceptible to being ranked higher on the radicalism threat-scale. The “change orientation” scale measures “the degree to which an entity wishes to effect social, political, and/or religious change” and “to which an individual or group attempts to influence others.”

The project’s definition of violence is also exceedingly broad, encompassing “more than killing, inflicting physical injury, and destruction of property” – that is, “symbolic and discursive violence” as they are “often steps leading toward physical violence.” The assumption is that the “manipulation of symbols and discourse is purposively articulated” by extremists “to provoke adversaries, demonize opponents, incite mobs to action, or to provide justifications for the necessity of violence.”

But this manipulation of discourse is not obvious: “Unlike physical violence that can be seen and clearly understood for what it is, symbolic and discursive violence are not necessarily self-evident.”

The “violence ideology” scale thus seeks to measure whether groups support or reject violence, but the definition qualifies that “some of the movements scaled rely on reasoned argumentation appealing to concepts of justice and oppression in addition to, or in place of narratives [justifying violence].” It could therefore include antiwar or civil liberties groups that recognize the legitimate right of occupied peoples to resist, or that call for large-scale nonviolent civil disobedience.

While “pacifists who are ideologically committed to nonviolence” are at the bottom of the threat-scale, such organizations would likely end up higher up on the scale even with no history of supporting violence: “A lack of violent rhetoric is insufficient to classify an organization as pacifist if the organization is silent in the face of others’ violent acts and violent rhetoric.”

The scaling system for social, political and religious change is equally if not more ambiguous, effectively drawing suspicion on any social movement campaigning for change: “Social change refers to changes in social roles or in the social order. This can be thought of as group or society level changes and public changes such as changing the role of women. Political change refers to efforts to change the law or political structure. Religious change refers to the desire or effort to change religious practices.”

Similarly, the scale to determine “violence engagement” is vague enough to classify a wide range of groups as potentially supportive of violence: “At the extreme left end of this scale are entities which vocally and adamantly reject violence/embrace non-violence, and for whom violence is never acceptable, even when violence is perpetrated against them.” Thus, simply not advocating violence is insufficient: “Passive rejection of violence includes some tacit acceptance of violence.”

Individuals and civil society groups that recognize the legitimacy of self-defense by communities under attack, including resistance against military intervention or occupation to defend the right of self-determination, and those which offer no specific condemnation of violence, could find themselves ranked higher up the “radicalism” scale. Consequently, the vast majority of social and civic organizations could be labeled, using the scaling system’s own definition, as “entities which view violence as acceptable or justified.”
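To see how these definitions push ordinary groups up the scale, consider a toy scoring rule in which silence about others’ violence is itself penalized. This is a hypothetical rendering of the criteria quoted above, not the project’s code; the field names and point values are invented.

```python
# Toy rendering of the scaling logic described above: a group sits at the bottom
# of the scale only if it explicitly and consistently condemns all violence.
# Entirely hypothetical; criteria names and point values are invented.

def violence_engagement_score(explicitly_condemns_violence: bool,
                              affirms_right_to_self_defense: bool,
                              silent_on_others_violence: bool) -> int:
    """0 = 'pacifist'; higher values sit further up the 'radicalism' threat-scale."""
    score = 0
    if not explicitly_condemns_violence:
        score += 1   # "passive rejection of violence includes some tacit acceptance"
    if silent_on_others_violence:
        score += 1   # silence about others' violence disqualifies a group from 'pacifist'
    if affirms_right_to_self_defense:
        score += 2   # recognizing self-defense reads as viewing violence as justified
    return score

# A peaceful antiwar group that never advocates violence, but says nothing about
# others' violence and defends the right of occupied peoples to resist, scores
# the maximum under a rule like this.
print(violence_engagement_score(False, True, True))  # -> 4
```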

Prof. Rizvi passed on his concerns about the poor methodology to his colleague Prof. Jonathan Githens-Mazer, a co-author of the Pentagon-backed study of UK groups, but this did not affect the design or conduct of the research.

I asked another senior supervisor of the project, computer scientist Prof. Hasan Davulcu, whether he was aware that the tools his team were creating could augment the US intelligence community’s analytical capabilities to identify “radicals” in a way that might demonize peaceful political activists and, at worst, facilitate the targeting of terrorism suspects for extrajudicial assassinations.

“Our project aims to identify how Muslims resist and counter violent-extremists,” he said without explaining why non-Muslim and secular antiwar groups were included. The researchers hoped to make “a positive contribution that might one day help save lives of innocents.”

Prior to his ASU post, Prof. Davulcu was funded by the Pentagon’s Defense Logistics Agency for his work in industry.

The Pentagon’s implicit suspicion toward all political dissent is evident in another Minerva-funded research project led by Prof. Maria Rasmussen of the US Naval Postgraduate School, which I previously reported on for the Guardian. Rasmussen’s research summary explicitly categorizes nonviolent, peaceful activists and NGOs as “supporters of political violence” who differ from terrorists only in not embarking on “armed militancy” themselves. Because they are defined as “sympathetic to radical causes” and “sympathetic to the end goals of armed groups,” nonviolent activists are thus placed squarely on the spectrum of violent extremism.

“Asking universities to mine social media data to identify radicals on behalf of the military is the worst of all possible combinations,” said Robert Steele, former veteran CIA official and founding deputy director of the US Marine Corps Intelligence Activity. A leading proponent of open source intelligence (OSINT), Steele is scathing about what he argues is an abuse of OSINT and how it should be undertaken.

“This program integrates academics dazzled by military funding who don’t understand culture, history, or foreign languages, with questionable technologies that are used by a tiny but powerful fraction of humanity with criminal ineptitude on the part of the government,” Steele said.

Domestic Extremists

In the same year that Prof. Githens-Mazer contributed to the ASU’s Pentagon-backed study assessing UK organizations’ support for or opposition to “radicalism,” he delivered a September presentation with a serving senior UK military officer, Brigadier Richard Stanford, chief of the Joint Fires and Influence Branch (JFIB) at NATO’s Allied Rapid Reaction Corps (ARRC), at the Defence Information Superiority Conference hosted by the Whitehall think tank the Royal United Services Institute (RUSI).

The slide presentation advocates a “Big Data” approach to gain a “cyber and information” advantage over the enemy to attack “the opponent’s will and cohesion.” But “large amounts of data” need to be “interpreted correctly with contextual awareness and cultural understanding” to produce “insightful results quickly.”

Githens-Mazer and Brigadier Stanford thus recommend “SOCMINT” – social media intelligence – to “complement both SIGINT [signals intelligence] and HUMINT [human intelligence].” Social media must also be used to “communicate our message” and influence “perception and public opinion,” in addition to “horizon scanning” to preempt disorder, crime or conflict by drawing on the input of “regional and thematic experts” in analyzing Big Data – particularly across complex open-ended areas like “public health,” “climate change” and “urban environments.”

This approach is precisely what was being pursued by the DoD’s Minerva initiative to which Prof. Githens-Mazer contributed. He could not be reached for comment.

Just three months earlier, Githens-Mazer’s co-presenter Brigadier Stanford was directing NATO exercises in Corsica, an island south of France, to prepare for “any possible future conflict,” according to a local newspaper, the Gloucestershire Echo. Elite troops from 16 European countries, 60 per cent from the UK including the county of Gloucestershire, participated in the military exercises “which were set in the fictional country of Tytan, which has been facing strikes and increasing social unrest. Soldiers had to work from the nearby Corsica peninsula, after a lockdown by the Tytan government.”

“Corsica has given us a good awareness of potential challenges we could find on a real, operational deployment,” said exercise participant Lieutenant Colonel Dag Bjornerud, highlighting that NATO’s planning for future warfare increasingly sees domestic civil unrest in the west as a primary threat. NATO’s enemy, in other words, includes discontented European populations calling for social change, and engaged in widespread social protest.

That month, Wired reported that since the 2011 London riots, the Metropolitan Police’s National Domestic Extremism Unit (NDEU) had been applying SOCMINT methods, collecting public “tweets, YouTube videos, Facebook profiles, and anything else UK citizens post in the public online sphere,” which would then be subjected to the same sort of analytical tools being developed with Pentagon funding at ASU. Some 9,000 Britons, many from political groups, are being tracked as “domestic extremists” by the NDEU.

Part III

The US Department of Defense’s multi-million dollar university research program, the Minerva Research Initiative, is developing new data mining and analysis tools for the US military intelligence community to capture and analyze social media posts. The new tools provide unprecedented techniques to identify individuals engaged in political radicalism around the world, while mapping their behavioral patterns and social or organizational connections and affiliations.

The range of research projects undertaken by Arizona State University (ASU), a National Security Agency (NSA)-designated university, includes the development of algorithms which leading intelligence experts agree could directly input into the notorious “kill lists” – enhancing the intelligence community’s ability to identify groups suspected of terrorist activity for potential targeting via the CIA’s extrajudicial “signature” drone strikes.

Through the Social Media Looking Glass

One Pentagon-sponsored ASU project, whose findings were published in the journal Social Network Analysis and Mining in 2012, involved downloading and cataloging 37,000 articles published between 2005 and 2011 from the websites of 23 Indonesian religious organizations to “profile their ideology and activity patterns along a hypothesized radical/counter-radical scale.”

This study also found that the automated threat-classification model successfully ranked the organizations with “expert-level accuracy.” Related research has focused on developing data-mining and analytical tools to track political trends and social movements via social media.
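The paper’s own accuracy metric is not reproduced here. One conventional way to quantify how closely an automated ranking tracks an expert ranking is Spearman rank correlation, sketched below with invented data; it is offered only as an illustration of what “expert-level accuracy” claims typically rest on.

```python
# One conventional way to check whether an automated ranking matches an expert
# ranking: Spearman rank correlation. Illustrative data only; not the study's.

def spearman(expert_rank: list[int], model_rank: list[int]) -> float:
    """Spearman correlation for two rankings of the same n items (no ties)."""
    n = len(expert_rank)
    d_squared = sum((e - m) ** 2 for e, m in zip(expert_rank, model_rank))
    return 1 - (6 * d_squared) / (n * (n ** 2 - 1))

# Five hypothetical organizations, ranked 1 (most "counter-radical") to 5.
experts = [1, 2, 3, 4, 5]
model   = [1, 3, 2, 4, 5]
print(spearman(experts, model))  # -> 0.9
```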

According to Minerva chief Erin Fitzgerald, such research is about minimizing conflict. “Insights generated from Minerva research are intended to inform more effective strategic and operational policy decisions by defense leadership,” said Fitzgerald. “The end goal is always to prevent future conflict and – if the US must play a role in conflict elsewhere – to help DoD understand how to most effectively engage with partners to mitigate that conflict.”

Long before Edward Snowden’s revelations about NSA surveillance programs, it was known that the CIA and other US intelligence agencies actively sought to analyze social media, from blogposts to tweets and from Amazon reviews to YouTube clips and Flickr photos. The NSA’s Open Source Indicators Program, for instance, involves “academics who work at a research branch of the NSA” developing automated analytical tools that mine open source information on Facebook, Twitter, Google and elsewhere to predict future events such as protests, pandemics, resource shortages, mass migrations and economic crises.

Such open source information is also already being used as “enrichment” data to be integrated with phone and email metadata to create sophisticated graphs identifying the social connections, associates, locations, traveling companions and other patterns of behavior of individuals seen as potential “radicals” or terrorists – whether American or foreign.
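Neither the NSA’s tools nor its data are public, but the “enrichment” step described here amounts, in principle, to merging edges from different metadata sources into one contact graph and walking outward from a person of interest. The following minimal sketch is purely illustrative; the identifiers, sources and two-hop rule are assumptions.

```python
# Stripped-down sketch of metadata "enrichment": merge edges from different
# sources into one contact graph and pull the two-hop neighborhood of a target.
# All identifiers are invented.
from collections import defaultdict

def build_graph(edge_sources: list[list[tuple[str, str]]]) -> dict[str, set[str]]:
    graph = defaultdict(set)
    for edges in edge_sources:
        for a, b in edges:
            graph[a].add(b)
            graph[b].add(a)
    return graph

def neighborhood(graph: dict[str, set[str]], target: str, hops: int = 2) -> set[str]:
    """Everyone reachable from `target` within `hops` contacts."""
    frontier, seen = {target}, {target}
    for _ in range(hops):
        frontier = {n for node in frontier for n in graph.get(node, set())} - seen
        seen |= frontier
    return seen - {target}

phone_metadata = [("alice", "bob"), ("bob", "carol")]
social_media   = [("alice", "dave")]
graph = build_graph([phone_metadata, social_media])
print(neighborhood(graph, "alice"))  # -> {'bob', 'carol', 'dave'}
```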

This analytical capability is directly mobilized to identify not just potential extremists and their associations, but also to pursue and sanction “high-value targets.” In one case in 2012, according to the Washington Post, “a user account on a social media Web site provided an instant portal to an al-Qaeda operative’s hard drive.” A leaked NSA document confirmed: “Within minutes, we successfully exploited the target.”

The National Counterterrorism Center (NCTC), which generates the CIA’s kill lists, draws its information from databases across the US intelligence community, including the FBI, CIA, NSA and Department of Homeland Security (DHS), among others, encompassing and integrating both metadata from private electronic communications and associated data across an individual’s online social media networks.

DHS fusion centers, working closely with private sector corporations, routinely data-mine social media posts of American citizens including, for instance, Occupy activists in efforts to detect threat trends that could constitute a “hazard” to the public.

According to classified criteria, the names of potential “radicals” outside the US who are linked to terrorist groups or activity are canvassed by the NCTC for potential extrajudicial assassination, and assessed by a White House interagency commission, which narrows them down before a presidential decision as to whether each individual lives or dies.

The problem, according to ASU researchers writing in a separate 2013 paper, is that current technology “cannot find the proverbial ‘needles in a haystack’ corresponding to those individuals with radical or extremist ideas, connect the dots to identify their relationships, and their socio-cultural, political, economic drivers.”

This has led the Pentagon to fund the ASU to create a technology dubbed “LookingGlass,” “a visual intelligence platform for tracking the diffusion of online social movements.” Algorithms are applied to “large amounts of text collected from a wide variety of organizations’ media outlets to discover their hotly debated topics, and their discriminative perspectives voiced by opposing camps organized into multiple scales.”

These are then used to “classify and map individual Tweeter’s message content to social movements based on the perspectives expressed in their weekly tweets.” The LookingGlass platform is able “to track the geographical footprint, shifting positions and flows of individuals, topics and perspectives between groups.”

Unlike previous systems, the Pentagon’s appropriation of LookingGlass can provide “real-time contextual analysis of complex socio-political situations that are rife with volatility and uncertainty. It is able to rapidly recognize radical hot-spots of networks, narratives and activities, and their socio-cultural economic, political drivers,” and is able to identify and track specific “radical” and “non-radical” individuals, along with shifts in their beliefs and affiliations to “radical” and “non-radical” movements and organizations.
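LookingGlass itself is not publicly available. The workflow the researchers describe – matching each account’s weekly output to a movement’s “perspective” and aggregating by place and time – can be pictured roughly as below; the keyword sets, function names and data are invented for illustration, not drawn from the platform.

```python
# Illustrative sketch of the kind of pipeline described for LookingGlass:
# assign each account's weekly tweets to the most similar movement "perspective"
# and aggregate by location over time. Keyword sets and data are invented.
import re
from collections import Counter, defaultdict

PERSPECTIVES = {
    "movement_x": {"austerity", "strike", "occupy"},
    "movement_y": {"caliphate", "kuffar"},
}

def classify(tweets: list[str]) -> str:
    """Assign a batch of tweets to the perspective with the most shared vocabulary."""
    words = set(re.findall(r"[a-z']+", " ".join(tweets).lower()))
    return max(PERSPECTIVES, key=lambda m: len(words & PERSPECTIVES[m]))

def weekly_footprint(stream):
    """stream yields (week, user, location, tweets);
    returns week -> movement -> Counter of locations (the 'geographical footprint')."""
    footprint = defaultdict(lambda: defaultdict(Counter))
    for week, user, location, tweets in stream:
        footprint[week][classify(tweets)][location] += 1
    return footprint
```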

The Entire Physical and Virtual World is a Militarized Battlefield

US intelligence experts assessing the Minerva initiative strongly disagreed with Fitzgerald’s claims that these Pentagon-funded research projects would contribute to minimizing conflict.

In my earlier report on Minerva, I disclosed an internal Minerva staff email related directly to the ASU’s radicalization discourse project, which confirmed that the program is geared toward producing “capabilities that are deliverable quickly” for application to field operations. Senior Pentagon officials had told ASU staff to develop “models and tools that can be integrated with operations.”

The analytical tools developed for the Pentagon by ASU researchers are directly applicable to the extensive data-mining programs of the National Security Agency revealed by whistleblowers Edward Snowden, Russell Tice, William Binney and Thomas Drake, among others. Billions of pieces of data in the form of phone calls, emails, photos and videos from major communication giants like Google, Facebook, Twitter and Microsoft, among others, are collected and then analyzed to identify national security threats.

Technologies like LookingGlass could dramatically advance the NSA’s capacity to track and analyze metadata in the context of the “open source” online data it mines so comprehensively. Former senior NSA executive and whistleblower Thomas Drake told me regarding the Minerva-funded data-mining projects, “We must remember that the entire world, physical and virtual, is considered by the Pentagon as fair game and a militarized battlefield.”

LookingGlass, along with the other data-mining tools being developed by universities with Pentagon funding, fits neatly into the parameters of earlier intelligence structures such as the Pentagon’s “Total Information Awareness” (TIA) program launched by the Bush administration, described by the New York Times as “the most sweeping effort to monitor the activity of Americans since the 1960’s.” TIA’s function was to use data-mining “to create risk profiles for millions of visitors and American citizens in its quest for suspicious patterns of behavior.”

Under Obama, this has evolved into the global “disposition matrix,” a “next-generation targeting list” which “contains the names of terrorism suspects arrayed against an accounting of the resources being marshaled” to kill them, including the ability to map “plans for the ‘disposition’ of suspects beyond the reach of American drones.” The kill lists, reported the Washington Post, are part of “a single, continually evolving database in which biographies, locations, known associates and affiliated organizations are all catalogued. So are strategies for taking targets down, including extradition requests, capture operations and drone patrols.”

It is therefore no coincidence that Lisa Traynor – the Pentagon’s Minerva supervisor for the University of Washington project, which aims to “fingerprint” the configuration of “mass political movements” and gauge the determinants of “social change” – is an employee of Bowhead Systems Management, a leading US defense contractor working on drone warfare technology.

Yet as Harvard security technologist Prof. Bruce Schneier has pointed out, the inherently murky and fluid categories used to profile terrorists and potential terrorists mean that the risk of seeing terrorists where there are none is higher the more data is inputted into the data-mining system.

“Depending on how you ‘tune’ your detection algorithms, you can err on one side or the other,” Schneier writes. “You can increase the number of false positives to ensure that you are less likely to miss an actual terrorist plot, or you can reduce the number of false positives at the expense of missing terrorist plots. To reduce both those numbers, you need a well-defined profile.”

The problem is that for terrorism, he writes, “There is no well-defined profile, and attacks are very rare. Taken together, these facts mean that data mining systems won’t uncover any terrorist plots until they are very accurate, and that even very accurate systems will be so flooded with false alarms that they will be useless.”
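Schneier’s argument is the base-rate problem: when the behavior being hunted is extremely rare, even a very accurate classifier produces overwhelmingly more false alarms than true hits. A quick illustration with made-up but plausible orders of magnitude:

```python
# Base-rate arithmetic behind Schneier's argument, with illustrative numbers only.
population      = 300_000_000   # people under surveillance
actual_plotters = 100           # actual plotters among them (a generous assumption)
true_positive   = 0.99          # chance a real plotter is flagged
false_positive  = 0.001         # chance an innocent person is flagged (0.1%)

flagged_real     = actual_plotters * true_positive
flagged_innocent = (population - actual_plotters) * false_positive

print(f"real plotters flagged:   {flagged_real:,.0f}")      # ~99
print(f"innocent people flagged: {flagged_innocent:,.0f}")  # ~300,000
print(f"share of flags that are real: "
      f"{flagged_real / (flagged_real + flagged_innocent):.4%}")  # ~0.03%
```

Even with a 99 percent detection rate and only one false alarm per thousand people screened, more than 99.9 percent of the resulting flags point at innocent people.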

This is why the NSA’s eavesdropping program “spat out thousands of tips per month,” said Schneier, and “every one of them turned out to be a false alarm.” Although “useless for finding terrorists,” the NSA’s data-mining is “useful for monitoring political opposition and stymieing the activities of those who do not believe the government’s propaganda.”

“What this is really about is furthering the militarization of domestic law enforcement and going further down the rabbit hole into the belief that it is possible to control the 99% if you just have enough surveillance and enough armed force,” said ex-CIA official Robert Steele, speaking about ASU’s LookingGlass platform.

Indeed, although the NCTC’s criteria for generating kill lists remain secret, a leaked NCTC document obtained by The Intercept in July revealed that the criteria for adding individuals to watch lists of “known or suspected terrorists” were vague in the first instance, requiring only “reasonable suspicion,” not “concrete facts.” Supporting evidence could be as thin as a single, uncorroborated social media post – which perhaps explains why 40% of suspects on the main watch list are not linked to any “recognized terrorist group.”

Extrajudicial assassinations known as “signature” strikes, targeting groups of terrorism suspects whose identities are not known, reportedly rest on far more fluid criteria for determining whether targets are plotting or engaged in terrorist activity – so fluid that State Department officials complained to the White House that the CIA’s criteria were “too lax.” They joked that if the CIA sees “three people doing jumping jacks” in a hostile territory, it would be designated a terrorist training camp. Hence, drone warfare in Pakistan and Yemen has frequently targeted “terrorist suspects” about whom the CIA “were not certain beforehand of their presence” and “whose names they do not know.”

“The algorithms being developed at ASU remind me of the algorithms used as the basis for signature strikes with drones,” said former senior NSA executive Thomas Drake. In 2006, Drake leaked information about the NSA’s data-mining project Trailblazer to the press. Although the US government attempted to prosecute him under the Espionage Act in 2010, the case collapsed.

I asked Drake whether the ASU’s algorithms could be applied to fine-tuning the generation of “kill lists” for drone strikes. “Your hunch is right,” he said. “Having the US government and Department of Defense fund this kind of research at the university level will bias the results by default. This is a fall-out of big data research of this type, using algorithms to detect patterns when the patterns themselves are an effect – and mixing up correlation with causality. Under this flawed approach, many false positives are possible and these results can create an ends of profiling justifying the means of data-mining.”

Part IV

Since 2008, the year of the worst financial crisis since the Great Depression of the 1930s, the US Department of Defense has funded a multimillion dollar university research program to probe the complex dynamics of mass social and political movements, anticipate global trends, and ultimately augment the intelligence community’s preparations for civil unrest and insurgencies both abroad and at home. Part of that has involved developing advanced new data mining and analysis tools for the US military intelligence community to pinpoint imminent and potential threats from individuals and groups.

Among its many areas of focus are ongoing projects at Arizona State University (ASU) designed to enhance and automate the algorithms used by intelligence agencies like the NSA to analyze “open source” information from social media in order to track the potential threat-level to US interests. Formal organizations and broad social networks as well as individuals could be identified and closely monitored with such tools to an unprecedented degree of precision.

Loosely defined concepts of political “radicalism,” violence and nonviolence, as well as questionable research methodologies, open the way for widespread suspicion of even peaceful activist groups and their members, and the equation of them with potential terrorists. Civil society organizations in the U.K., including both Muslim religious groups and non-religious anti-war networks, have been prioritized for study to test and improve the effectiveness of these data-mining tools.

Increasingly, though, the automation of threat-detection and terrorism-classification has been accompanied by the automation of killing, in the form of “kill lists” of terrorism suspects to be targeted for extrajudicial assassination by drone strikes. As President Obama, encouraged by powerful lobbies in the defense industries, has paved the way for the systematic integration of drones into domestic law-enforcement and homeland security operations, the prospect of extrajudicial assassinations occurring on US soil is no longer merely hypothetical.

Now, new but little-known Pentagon directives authorize the use of armed drones against American citizens in the homeland in the context of domestic emergencies.

Algorithms of Death

“The algorithms being developed at ASU remind me of the algorithms used as the basis for signature strikes with drones,” said Thomas Drake, a former senior National Security Agency executive who leaked information about the NSA’s data-mining project Trailblazer to the press in 2006.

Drake agreed that the algorithms linked to “LookingGlass,” a new Pentagon-sponsored visual intelligence platform, could in fact be applied to fine-tuning the generation of the CIA’s notorious “kill lists.”

“Having the US government and Department of Defense fund this kind of research at the university level will bias the results by default. This is a fall-out of big data research of this type, using algorithms to detect patterns when the patterns themselves are an effect – and mixing up correlation with causality. Under this flawed approach, many false positives are possible and these results can create an ends of profiling justifying the means of data-mining.”

It is now increasingly recognized that US drone strikes against foreign terrorism targets have systematically killed large numbers of civilians: a 2012 joint Stanford and New York University report suggested that as few as 2% of casualties are “high-level” targets – an analysis consistent with counterinsurgency expert David Kilcullen’s 2009 estimate of a “kill ratio” of 50 civilians to one militant, or, in other words, 98% civilian casualties.

“My colleagues in Special Forces tell me that the men on the front line are furious with the lack of accuracy and integrity at the national level, and no longer trust the targeting data,” said former veteran CIA case officer Robert Steele, who previously served as a Marine Corps infantry officer.

“They have seen for themselves how wrong the system is when they look their man in the eyes. Technical surveillance is the most expensive, least useful, and least accurate form of surveillance. Technology is not a substitute for thinking. We must become deeply and broadly expert at the human factor.”

Drones Come Home

US administration officials including Obama himself have repeatedly refused to confirm whether the alleged legal power to conduct extrajudicial assassinations via drone strikes extends to the US homeland. Last year, prior to becoming CIA director, John Brennan told the Senate Intelligence Committee: “…we do not view our authority to use military force against al-Qaeda and associated forces as being limited to ‘hot’ battlefields like Afghanistan.” He referred to Attorney General Eric Holder’s statement that “neither Congress nor our federal courts has limited the geographic scope of our ability to use force to the current conflict in Afghanistan.”

In February 2012, Obama signed into law a bill directing the Federal Aviation Administration (FAA) to throw American airspace open to drones by as early as September 2015. US Customs and Border Protection (CBP) already deploys Predator drones to spot smugglers and illegal immigrants crossing into US territory, and two dozen US police departments have successfully applied for FAA drone permits. As National Geographic observes, “all 18,000 law enforcement agencies in the US are potential customers.” By 2020, an estimated 30,000 drones could be active across the US homeland.

Documents obtained under Freedom of Information by the Electronic Frontier Foundation (EFF) show that police plan to use drones essentially for surveillance. In Seattle and Miami, drones are already being used during criminal investigations and in “hot pursuit” of suspects, and could be used during natural disasters along with “specific situations with the direct authorization of the Assistant Chief of the Homeland Security Bureau.” Hundreds of “domestic drone missions” have been flown by CBP on behalf of other state and local agencies.

Last year, government documents revealed that the Department of Homeland Security had customized its Predator B drones – built originally for foreign military operations – for domestic surveillance tasks and to “respond to emergency missions across the country,” including “identifying civilians carrying guns and tracking their cell phones.”

These drones are now being used on US soil by the FBI, Secret Service, Texas Rangers and some local police forces. The DHS had also proposed to arm its domestic fleet of border patrol drones with “non-lethal weapons designed to immobilize TOIs [targets of interest]” – an option also being pursued by local police agencies that want to arm drones with rubber bullets, tear gas and other riot control weapons.

According to an unclassified US Air Force document, the deployment of military drones in US airspace will be controlled by the Pentagon, and the drones will be able to monitor unidentified groups, as well as “specifically identified” individuals with the Secretary of Defense’s approval. Military agencies “are allowed to fly drones in public areas and record information on domestic situations,” noted Jennifer Lynch of the Electronic Frontier Foundation.

Executive Decisions

In February 2013, an extraordinary Pentagon directive authorized the deployment of US military resources and personnel to respond to domestic emergencies, quell civil unrest and support civilian law enforcement in a domestic terrorism incident. The new directive builds on an earlier 2010/2012 DoD directive specifically authorizing the use of military surveillance drones on US soil under Pentagon authority.

Although that directive prohibited the use of “armed” drones for “DSCA [Domestic Support to Civil Authorities] operations,” the new 2013 directive for Domestic Support to Civil Law-Enforcement Agencies goes further. It broadly asserts that “the Secretary of Defense may authorize the use of DoD personnel in support of civilian law enforcement officials during a domestic terrorism incident.”

Unlike the older directive, it stipulates that US military commanders, including those at USNORTHCOM, USPACOM, and USSOCOM, would receive blanket authority over “operations, including the employment of armed Federal military forces at the scene of any domestic terrorist incident.” No limit is specified on what kind of “armed military forces” the Pentagon can conceivably deploy.

The “hypothetical” but nevertheless real extension of powers here was confirmed when Republican Senator Rand Paul asked Attorney General Holder to confirm the Obama administration’s position on conducting armed drone strikes on US soil.

Holder wrote back that “the president could conceivably have no choice but to authorize the military to use such force if necessary to protect the homeland in the circumstances like a catastrophic attack.”

While denying any specific “intention” to do so, Holder conceded “it is possible, I suppose, to imagine an extraordinary circumstance, in which it would be necessary and appropriate under the constitution and applicable laws of the United States for the president to authorize the military to use lethal force within the territory of the United States.”

Although Holder’s comments were widely publicized last year, their pseudo-legal parallel in the form of the Pentagon’s 2013 directive was not. The latter demonstrates that Holder’s consideration of the US military’s legal authority to execute drone strikes on US soil is far from “hypothetical.” On the contrary, the US military was determined to ensure that this extraordinary authority was formally adopted.

I asked the US Department of Defense whether it could confirm that the Minerva-funded data-mining research would not be used to support the US intelligence community’s analytical tools to identify terrorism suspects, in particular to identify targets for extrajudicial assassination. I did not receive a direct answer to this question.

“Research in these areas will improve strategic and operational responses to insurgencies,” said Dr. Erin Fitzgerald, chief of the Minerva program. “Perhaps more importantly, these efforts will help analysts faced with a particular political environment that seems ripe for mass mobilization – or a particular movement that appears to be turning violent or destabilizing a government – know where to look to understand a particular movement and its implications for society.”

Global Instability

Prof. Mark Woodward, the anthropologist who leads the ASU projects funded by the DoD’s Minerva Research Initiative, is also affiliated with the CIA-funded Political Instability Task Force (PITF), originally established in 1994 at the request of the US government. Although the PITF boasts of developing a predictive model with a “two-year lead time and over 80% accuracy” based purely on modelling “political institutions, and not economic conditions, demography, or geography,” in practice US intelligence was unable to anticipate the unprecedented wave of instability that has swept across the Middle East and North Africa since 2011.

The Pentagon Minerva program addresses this gap in attempting to account for a complex range of interconnected factors beyond political institutions, including the impacts of environmental, energy and economic crises.

As I reported last year, the NSA’s surveillance programs are linked to extensive Pentagon planning for civil unrest in the context of escalating risks from climate, oil, food and economic shocks. Official documents over the last decade confirm that the intelligence community anticipates a heightened threat of instability, including “domestic insurgencies,” due to social and political collapse triggered by such shocks.

As episodes like the recent conflagration in Ferguson demonstrate, the Pentagon’s fears of a future of imminent domestic civil unrest are already being borne out.
