Lifting a page from Philip K. Dick's "The Minority Report," the Department of Homeland Security is in the early stages of developing something called FAST that, when it's installed at an airport near you, will attempt to catch terrorists before they strike. Maybe.
FAST stands for Future Attribute Screening Technology, and the program will, according to The Atlantic, “remotely monitor physiological and behavioral cues, like elevated heart rate, eye movement, body temperature, facial patterns, and body language, and analyze these cues algorithmically for statistical aberrance in an attempt to identify people with nefarious intentions.”
The Atlantic points out two major problems with FAST. One, the research and development of this tech is being carried out in a laboratory setting, with volunteers brought in and told to carry out a disruptive act. FAST is currently able to pick out the "terrorists" with a 70 percent accuracy rate. The problem is that these aren't actual terrorists; they're people pretending to be terrorists, and real terrorists might well betray different physiological signs.
Then comes the false-positive paradox:
“Let’s assume for a moment that 1 in 1,000,000 people is a terrorist about to commit a crime. Terrorists are actually probably much much more rare, or we would have a whole lot more acts of terrorism, given the daily throughput of the global transportation system. Now let’s imagine the FAST algorithm correctly classifies 99.99 percent of observations—an incredibly high rate of accuracy for any big data-based predictive model. Even with this unbelievable level of accuracy, the system would still falsely accuse 99 people of being terrorists for every one terrorist it finds. Given that none of these people would have actually committed a terrorist act yet, distinguishing the innocent false positives from the guilty might be a non-trivial, and invasive task.”
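The arithmetic behind the quoted scenario is easy to check. Here is a short sketch using the quote's own assumptions (a base rate of 1 terrorist per 1,000,000 travelers, and a classifier that is correct on 99.99 percent of observations); the variable names are illustrative, not part of any real FAST system:

```python
# False-positive paradox: count expected outcomes when screening
# one million travelers with a 99.99%-accurate classifier.

population = 1_000_000        # travelers screened
base_rate = 1 / 1_000_000     # prevalence of actual terrorists (from the quote)
accuracy = 0.9999             # fraction of observations classified correctly

terrorists = population * base_rate          # ~1 actual terrorist
innocents = population - terrorists          # ~999,999 innocent travelers

# Terrorists correctly flagged, and innocents wrongly flagged:
true_positives = terrorists * accuracy
false_positives = innocents * (1 - accuracy)

print(f"True positives:  {true_positives:.2f}")
print(f"False positives: {false_positives:.2f}")
print(f"False accusations per terrorist caught: "
      f"{false_positives / true_positives:.0f}")
```

Even at this implausibly high accuracy, the innocents wrongly flagged outnumber the terrorists caught by roughly 100 to 1, which is the quote's point: with a rare enough target, almost everyone the system accuses is innocent.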
Given that FAST is, as noted, currently operating at 70 percent accuracy in a lab setting, the potential for false positives is too great to ignore. FAST seems like a Shiny New Thing that will suck up lots of money and resources before someone realizes that, perhaps, they could be better spent some other way.