The A.I. Thought Police
Julie Beal, Contributor
Sunday, September 23, 2012
We know we’re being surveilled in the matrix: AI Law, empowered by algorithms, feasts on Twittered hate crimes and the like to try to predict crime. We are leaving virtual trails of data that are fed into simulation models for predictive analytics – but we can still opt out, throw away our phones, disconnect. There is still some control over what they take from us. But the AI Thought Police want more – to climb into our minds, understand our physical make-up, really get to know us.
We are all under suspicion, but AI needs to know which ones to focus on. So the US army is developing methods to covertly identify and track people who plan to do ‘something bad’. Hidden sensors will be used to detect AI’s version of ‘adversarial intent’ by reading and cataloguing our emotions and health.
A report called ‘Remote Detection of Covert Tactical Adversarial Intent of Individuals in Asymmetric Operations’ was published by the US Army Research Laboratory in 2010. It sets out the requirements for researchers seeking US Federal Government funding to develop techniques that home in on individuals in crowds, detecting antagonistic attitudes among the “clutter” of innocents. The prime directive to protect national security, counter ‘insurgency’, and generally ‘keep the peace’, however, means the technology that is developed will spread beyond airports into wider civilian applications, such as “crowd control and in antidrug, anticrime, and immigration enforcement.” In fact, applications in the civilian economy are said to be plentiful, and also include “border security, and ensuring the security of government and private personnel and property”.
The report points out that the fusion of different types of information is improving – for example, combining laser-derived data with photographic or video images. This technique can help reveal “seemingly hidden patterns”; but the army wants the data to be detectable from at least 3m away, and preferably up to 50m away, for use in “asymmetric defense scenarios”.