Attendees are seen on a black-and-white FLIR high-definition camera monitor displayed at the 7th annual Border Security Expo in Phoenix, Arizona, March 12, 2013. (Joshua Lott / Reuters)

The Violence of Algorithms

Why Big Data Is Only as Smart as Those Who Generate It

In December 2010, I attended a training session for an intelligence analytics software program called Palantir. Co-founded by Peter Thiel, a techno-libertarian Silicon Valley billionaire, Palantir is a slick tool kit of data visualization and analytics capabilities marketed to and widely used by the NSA, the FBI, the CIA, and other U.S. national security and policing institutions.

The training session took place in Tysons Corner, Virginia, just outside Washington, D.C., at a Google-esque office space complete with scooters, a foosball table, and a kitchen stocked with energy drinks. I was taking the course to explore the potential uses of the tool for academic research.

The dashboard for the New York Police Department's 'Domain Awareness System' (DAS) is seen in New York, May 29, 2013.

We spent the day conducting a demonstration investigation. We were first given a range of data sets and, one by one, we uploaded them into Palantir. Each data set showed us a new analytic capability of the program: thousands of daily intelligence reports were disaggregated into their core pieces of information and correlated with historical data; satellite images were overlaid with socioeconomic, air-strike, and improvised explosive device (IED) data. In this process, the promise of Palantir was revealed: with more data comes greater clarity. For analysts who spend their days struggling to interpret vast streams of data, the Palantir demo was an easy sell.

In our final exercise, we added surveillance data detailing the planned movements of a suspected insurgent. Palantir correlated the location and time of these movements with the planned movements of a known bomb maker. And there the training ended. It was quite obvious that the next step, in “real life,” would be violent. The United States would send in a drone or Special Forces team. We in the demo, on the other hand, just went home.
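To make that final step concrete, here is a minimal sketch, in Python, of the kind of spatiotemporal correlation the demo performed: flagging moments when two tracked people are expected at roughly the same place at roughly the same time. All names, data points, and thresholds below are invented for illustration; this is not Palantir's actual algorithm, only a toy version of the underlying idea.

```python
# A toy spatiotemporal correlation: find pairs of sightings from two tracks
# that coincide within a distance threshold and a time window. Everything
# here (names, coordinates, thresholds) is hypothetical.
from dataclasses import dataclass
from datetime import datetime, timedelta
from math import radians, sin, cos, asin, sqrt


@dataclass
class Sighting:
    who: str
    when: datetime
    lat: float
    lon: float


def haversine_km(a: Sighting, b: Sighting) -> float:
    """Great-circle distance between two sightings, in kilometers."""
    dlat = radians(b.lat - a.lat)
    dlon = radians(b.lon - a.lon)
    h = sin(dlat / 2) ** 2 + cos(radians(a.lat)) * cos(radians(b.lat)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))


def correlate(track_a, track_b, max_km=0.5, max_gap=timedelta(minutes=30)):
    """Return pairs of sightings that coincide in both space and time."""
    return [
        (a, b)
        for a in track_a
        for b in track_b
        if abs(a.when - b.when) <= max_gap and haversine_km(a, b) <= max_km
    ]


suspect = [Sighting("suspect", datetime(2010, 12, 1, 14, 0), 34.520, 69.180)]
bombmaker = [Sighting("bomb maker", datetime(2010, 12, 1, 14, 10), 34.521, 69.179)]

for a, b in correlate(suspect, bombmaker):
    print(f"{a.who} and {b.who} co-located near ({a.lat}, {a.lon}) at {a.when}")
```

Note that the thresholds here, how close in space and how close in time counts as a "match," are themselves human judgments, which bears directly on the questions below.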

This program raises many challenging questions. Much of the data used had been entered and tagged by humans, meaning it was chock-full of human bias and error. The algorithms on which the system is built are themselves coded by humans, so they too are subjective. Perhaps most consequentially, the system presents these layered human judgments with a veneer of machine objectivity, even as its outputs can inform decisions that end in violence.
