On September 11, 2001, fourteen years ago today, an al-Qaeda terror attack hit the U.S. homeland, killing 2,996 people, injuring thousands more, and terrorizing millions. At the time, Congress had recently commissioned the National Academy of Sciences to summarize the scientific evidence on polygraphs (“lie detectors”) ahead of the Department of Energy’s proposed mass expansion of polygraph screening for spies at the national labs in the wake of the Wen Ho Lee scandal. The national trauma shaped the committee’s work, arguably contributing to the systematic cycle of self-destructive behavior into which we have since fallen: a pattern of mass security screening program expansions that threatens, rather than advances, our national security.
Committee co-chair Stephen E. Fienberg reflected: “9/11 happened in the middle of our deliberations. And that had actually a very substantial impact on the committee… At work, we had very serious scientists get even more engaged in what the implications of science were in national security because national security was now on the table. Now in a different way. We couldn’t get into this building in the way we used to,” he said from the National Academy Keck Center in Washington, DC. The nation was under threat, and the committee lived that same sense of threat.
Threat degrades cognition in some ways even while it can make us sharper in others. This might explain why National Academy scientists and Congress missed the full implications of the Academy’s application of Bayes’ Rule to mass security screenings like polygraphs. The scientists were among the nation’s best. They were disinterested world experts in other subjects, such as statistics. And their report recommending against polygraph screenings in no uncertain terms was so persuasive the Department of Energy called co-chair Fienberg the night before his scheduled Congressional testimony to announce it was drawing down its proposed polygraph program expansion by half.
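The Bayesian point at stake is worth spelling out: when the condition being screened for is very rare, even a fairly accurate test flags mostly innocent people. A minimal sketch of that base-rate arithmetic, using purely hypothetical numbers chosen for illustration (not figures from the Academy’s report):

```python
# Illustrative base-rate calculation with Bayes' Rule.
# All numbers below are hypothetical, chosen only to show the effect.

def posterior_spy(prior, sensitivity, false_positive_rate):
    """P(spy | test flags), computed via Bayes' Rule."""
    p_flag = sensitivity * prior + false_positive_rate * (1 - prior)
    return sensitivity * prior / p_flag

# Suppose 10 spies among 10,000 employees (prior = 0.001), and a
# polygraph that catches 80% of spies but also flags 10% of innocents.
p = posterior_spy(prior=0.001, sensitivity=0.80, false_positive_rate=0.10)
print(f"P(spy | flagged) = {p:.4f}")  # under 1%: almost everyone flagged is innocent
```

Under these assumptions the screen flags roughly a thousand people to catch eight spies, which is the core of the committee’s statistical objection to mass polygraph screening.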
For its part, Congress was persuaded that the programs posed a serious security problem, pressuring DoE, unsuccessfully, to draw down further. Yet the next few years saw massive expansions of polygraph screening and similar programs instead. What happened?
The disconnect might be partly explained by security agencies lying to Congress about these programs, as I’ll be documenting in part 3 of this series. It might also reflect the power of fear, and the appeal of using any tool at our disposal, even one that doesn’t work and does harm, to exert control in response to that fear. Or maybe Congress just got busy and forgot to address the problem of these security-threatening security programs. Congresspersons are swimming in information overload like the rest of us.
Whatever the case, with over a decade of perspective now on the national trauma of 9/11, one thing is clear. We can make best practices better to make people safer. We have the science to know how. We have the political system of checks and balances to get it done. And as we go forward continuing to heal from prejudice and violence — from our own mistakes as a country and others’ — we have more than enough political will to overhaul the way we do security as a country, to foster trust and promote truth and justice for all.