Opinion Politics

Growing National Security Threat From Growing National Security Programs

Written by Vera Wilde

Over ten years ago, researchers representing the National Academy of Sciences told Congress why polygraph screening programs jeopardize the national security they’re intended to advance. Congress failed to act on that information, and programs like this — mass security screenings for low-prevalence problems, such as polygraph screening, insider threat programs, dragnet telecommunications surveillance, and “stop and frisk” policing — have since expanded. A lot.

In 2003, representatives of the National Academy of Sciences testified before Congress about polygraph screening programs. They applied Bayes’ Rule, a mathematical theorem about conditional probability — about how the rarity of what you’re looking for changes what a test result means — to show the programs hurt national security. But the programs, along with others like them, have since expanded. And the Academy scientists — affected by 9/11 during their research and writing of the report — left out six implications of their application of Bayes’ Rule.

The implications suggest that polygraphs really hurt security. And the logic applies not just to federal polygraph screening programs, but to all mass security screenings for low-prevalence problems. Mass surveillance endangers national security.

Bayes’ Rule says weird people are weird. Subgroups that are already different behave differently — in ways that often matter a lot in health and security especially. Pregnant women might respond differently to a drug than elderly men. Spies or terrorists might respond differently to a security screening than you and me. According to the Academy, in the data they made up, that looks like this:

                           Truth
Polygraph interpretation   Non-spy     Spy     Total
Fail                         1,598       8     1,606
Pass                         8,392       2     8,394
Total                        9,990      10    10,000

Table 1: Results of polygraph interpretation assuming better than state-of-the-art accuracy in a hypothetical population of 10,000 people including 10 spies. Modified from National Research Council 2003.

And I mean, they really made it up. The Congressionally commissioned National Academy scientists kept asking for data from the security agencies that use polygraphs — as I later did, in FOIA lawsuits for my dissertation research — and not getting it. So, like me, they had to look at the published, peer-reviewed scientific literature. And like me, they decided to be really generous with the people running polygraph screening programs in the name of national security. They pretended lie detection was 80% accurate — better than the best accuracy the science supports. And they pretended there were 10 spies in a pretend sample of 10,000 National Lab employees — a much higher base rate (of occurrence) than we would expect to see in a population of public servants, unless you put occasionally having a drink with a journalist on the same level as giving nuclear secrets to the (un)Islamic State.
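To make the base-rate math concrete, here is a minimal sketch — purely illustrative Python, using nothing but the counts from the table above — of what Bayes’ Rule says a failed or passed polygraph actually means in that hypothetical population:

```python
# Counts from Table 1 (National Research Council 2003, hypothetical data):
# 10,000 employees, 10 of them spies, screened with assumed 80% accuracy.
fail_spy, fail_nonspy = 8, 1_598      # people who fail the polygraph
pass_spy, pass_nonspy = 2, 8_392      # people who pass the polygraph

p_spy_given_fail = fail_spy / (fail_spy + fail_nonspy)   # ~0.5%
p_spy_given_pass = pass_spy / (pass_spy + pass_nonspy)   # ~0.02%

print(f"P(spy | failed polygraph) = {p_spy_given_fail:.2%}")
print(f"P(spy | passed polygraph) = {p_spy_given_pass:.3%}")
print(f"Innocent people flagged per spy flagged: {fail_nonspy / fail_spy:.0f}")
```

Even with the Academy’s generous assumptions, roughly 200 innocent employees get flagged for every spy caught — and two of the ten spies pass anyway.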

Unpacking this table highlights how information, fairness, and attention problems cause mass security screenings for low-prevalence problems to harm security.

The Sieve Problem

Think of a pasta colander. The holes have to be small to catch orzo or angel hair. But the smaller you make them, the more slowly the water drains. So pasta colanders have looser mesh designs than sieves designed for getting chicken bones out of soup.


Health and security screenings work the same way. You can set screenings to catch more chicken bones — or try to identify more spies. But setting the screening (sieve) to be tougher (tighter) will ding more innocent people, too. Then you will spend a lot of resources trying to sort out the limp noodles (true spies) from the water in the way (innocent people). This is an information problem because it’s about imperfect identification of private information.
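Here is a hedged sketch of that trade-off. The two score distributions below are made-up assumptions for illustration — not the Academy’s model — but they show what happens as you tighten the cut-off on the same 10,000-person, 10-spy population:

```python
from statistics import NormalDist

# Illustrative assumption: spies and non-spies produce overlapping chart
# scores, with spies scoring somewhat higher on average. The means and
# spreads here are invented for this sketch.
non_spy_scores = NormalDist(mu=0.0, sigma=1.0)
spy_scores = NormalDist(mu=1.0, sigma=1.0)
population, spies = 10_000, 10

for cutoff in (2.0, 1.0, 0.0, -1.0):  # lower cut-off = tighter sieve
    spies_flagged = spies * (1 - spy_scores.cdf(cutoff))
    innocents_flagged = (population - spies) * (1 - non_spy_scores.cdf(cutoff))
    print(f"cut-off {cutoff:+.1f}: ~{spies_flagged:.1f} spies flagged, "
          f"~{innocents_flagged:,.0f} innocent people flagged")
```

Every notch you tighten the sieve to flag one more spy flags hundreds or thousands more innocent people — more water in the way that someone then has to sort through.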

According to the Academy, polygraphs identify deception at rates above chance but well below perfection. This information problem has another side — the fact that spies are real, live human beings — that changes the conditional probabilities of identifying spies versus non-spies in a way the Academy didn’t quite spell out. It makes polygraph programs more vulnerable, the more accurately they identify a signal in the physiological noise. So the more polygraph proponents argue their “test” is close to 100% accurate, the more vulnerable it will be to countermeasures because…

The Spies Haz Internets Problem

Widely available information about how to manipulate polygraph results means polygraphs are unlikely to identify dedicated security threats. You can arrest people for talking about it in the wrong way, but you can’t get food coloring out of the ocean or information off the Internet. So polygraph screening efficacy against spies with Internet access is probably less like 80% and more like 0% — if polygraphs are as accurate as their proponents claim.

The Noble Lie Problem

But what if we just devised a really good lie? A noble one? “Hey, I know what you’re thinking and feeling — so just come clean and everything will be ok.”

Social scientists liked that idea a lot when they thought of it decades ago. They called it a bogus pipeline. Some people like the idea of using polygraphs as interrogation props in this way. It’s a long-running trope on cop shows and in comic strips too, with stupid criminals confessing after police hook them up to copy machines: “Beep!” “It says you’re lying.” “Omg, I am lying!!!”

So maybe you can catch really stupid spies this way after all. Especially if you come up with a new bogus pipeline — something better than the polygraph. But only if you use it in investigations with small samples instead of screening programs with large ones. Otherwise, Bayes’ Rule shows you’re still literally wasting your time — spending limited resources going after needles in haystacks while spies have already fallen through the cracks.
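To see why the size of the pool matters, here is an illustrative calculation — the 80% sensitivity and 16% false-positive rate are assumptions chosen to roughly match the Academy’s hypothetical table — comparing the same instrument used to screen a workforce versus to investigate a short list of suspects:

```python
def p_guilty_given_fail(base_rate, sensitivity=0.80, false_positive_rate=0.16):
    # Bayes' Rule: the probability that someone who fails is actually guilty.
    fails_guilty = base_rate * sensitivity
    fails_innocent = (1 - base_rate) * false_positive_rate
    return fails_guilty / (fails_guilty + fails_innocent)

print(f"Screening 10,000 employees (10 spies): {p_guilty_given_fail(10 / 10_000):.1%}")
print(f"Investigating 5 suspects (1 guilty):   {p_guilty_given_fail(1 / 5):.1%}")
```

With ten spies in ten thousand, a failed chart means about a half-percent chance of guilt; with one guilty suspect among five, the same instrument puts a failure above fifty percent. Bogus pipelines belong in small-pool investigations, not mass screenings.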

But under recent legal precedent, if I say that in the wrong way to the wrong person, I can get thrown in jail. That’s a security problem in itself. It has two sides.


The American Problem

Government of the people, by the people, and for the people has limited powers. Its powers are limited to protecting life, liberty, and the pursuit of happiness. This is the spirit of the laws of the United States of America. When governmental officials or organizations — at a high or low level, on an agency or government-wide basis — degrade this spirit, the system is designed to check that abuse of power, and balance it out. Which is good, because without that continuing check and balance, limitless government power implodes the very values it’s meant to protect.

Security doesn’t outweigh liberty as a value in a tug-of-war. It depends on liberty, because security implies protection. Protection implies a protected thing. The protected thing in the U.S. is liberty. And when people don’t perceive the government is protecting liberty, the second fairness problem arises.

The Fairness Perceptions Problem

Procedural justice research shows what the popular kids in high school who were secretly as scared and lonely as the rest of us already knew — perception is reality. People who believe that other people like them are more likely to act happy and confident, making other people feel liked, happy, and confident, making other people feel… And thus creating a trust spiral.

Similarly, people who believe they have rule of law are more likely to behave in ways that contribute to rule of law — sharing information with authorities and following rules. Belief in rule of law helps make rule of law. Trust makes security. Perceived unfairness degrades it.

Some of the documents my FOIA lawyer Kel McClanahan, Executive Director of National Security Counselors, got released for my dissertation research from several federal agencies — after years of requests and lawsuits — show a lot of reasons to suspect polygraph screening programs create perceived unfairness. Lack of oversight is endemic. Polygraphers themselves complained about the misuse of polygraph tests on torture victims in the Middle East. More than one sexual assault survivor reported polygraph questioning about her assault in connection with her security clearance. And before my lawyer got those documents released — that I in turn released to McClatchy reporter Marisa Taylor for her national investigative series on polygraphs — I also interviewed some people who were exonerated from death row after failing polygraphs, who lost security clearances from failed polygraphs after raising ethical concerns with security agency operations, and who said their polygraph interrogations had touched on personal stuff like religious beliefs.

Security policies that degrade trust probably degrade security according to procedural justice research. Polygraph screening programs are examples of such policies. But they’re really just one form of mass security screenings for low-prevalence problems that do this. And they have another big problem with two facets:


The attention-to problem. Paying attention to large numbers of false positives (innocent people dinged by mass security screenings) distracts from looking for false negatives (guilty people not so dinged).

The attention-from problem. Paying attention to using old tools that don’t work that well distracts attention from developing new tools that might work better.

All other resources — money, machines, manpower — might seem effectively limitless for the U.S. Government. But they’re not. Manpower in particular is limited, because people’s attention is a scarce resource.


Exploding Security Programs

Mass security screenings for low-prevalence problems hurt security according to the fuller implications of Bayes’ Rule. This means that federal polygraph screening programs, insider threat programs, mass surveillance, and stop and frisk policies should stop, because they backfire. When National Academy scientists explained the basic logic here to Congress in 2003, the Congresspersons present were alarmed, and pressured the feds to cut polygraph programs. But since then the feds have massively expanded polygraph and other mass security screenings for low-prevalence problems.

This is how massive those expansions have been: recruit and employee security screening polygraphs rose nearly 750% in the FBI from 2002 to 2005. Polygraph programs are required now as part of U.S.-sponsored anti-corruption programs such as Plan Colombia, the Mérida Initiative in Mexico, and others in the Bahamas, Bolivia, Guatemala, Honduras, and Iraq. Meanwhile, similar “credibility assessment” systems extend polygraph screening tools wirelessly to transportation and border security screening contexts. These programs include the Department of Homeland Security’s Transportation Security Administration’s SPOT (Screening of Passengers by Observation Techniques), FAST (Future Attribute Screening Technologies), and AVATAR (the Automated Virtual Agent for Truth Assessments in Real-time).

“Deception detection” is a $3-4 billion/year domestic industry. That doesn’t touch mass surveillance or stop and frisk. Mass security screenings for low-prevalence problems probably have an annual taxpayer price tag an order of magnitude larger.

Mass security screenings for low-prevalence problems threaten national security according to Bayes’ Rule. Congress should hold hearings to address the problem. Such hearings might free up trillions in federal funding for other programs. They might open the door for the national truth and reconciliation commission some suggest is integral for healing the wounds of violence and prejudice in the era of Guantanamo and Ferguson.

And we know from open-source documents that the CIA lied to Congress about at least one aspect of these programs — their relationship to equal opportunity law. That alone should have touched off a Congressional inquiry in 2012. But that’s the next part of this series.

 

About the author

Vera Wilde

Reformed Harvard Kennedy Fellow, wondering artist, wandering artist. www.wildethinks.com

