The Banality of Systemic Evil
In recent months there has been a visible struggle in the media to come to grips with the leaking, whistle-blowing and hacktivism that has vexed the United States military and the private and government intelligence communities. This response has run the gamut. It has involved attempts to condemn, support, demonize, psychoanalyze and in some cases canonize figures like Aaron Swartz, Jeremy Hammond, Chelsea Manning and Edward Snowden.
In broad terms, commentators in the mainstream and corporate media have tended to assume that all of these actors needed to be brought to justice, while independent players on the Internet and elsewhere have been much more supportive. Tellingly, a recent Time magazine cover story has pointed out a marked generational difference in how people view these matters: 70 percent of those age 18 to 34 sampled in a poll said they believed that Snowden "did a good thing" in leaking the news of the National Security Agency's surveillance program.
So has the younger generation lost its moral compass?
No. In my view, just the opposite.
Clearly, there is a moral principle at work in the actions of the leakers, whistle-blowers and hacktivists and those who support them. I would also argue that that moral principle has been clearly articulated, and it may just save us from a dystopian future.
In "Eichmann in Jerusalem," one of the most poignant and important works of 20th-century philosophy, Hannah Arendt made an observation about what she called "the banality of evil." One interpretation of this holds that it was not an observation about what a regular guy Adolf Eichmann seemed to be, but rather a statement about what happens when people play their "proper" roles within a system, following prescribed conduct with respect to that system, while remaining blind to the moral consequences of what the system was doing -- or at least compartmentalizing and ignoring those consequences.
A good illustration of this phenomenon appears in "Moral Mazes," a book by the sociologist Robert Jackall that explored the ethics of decision making within several corporate bureaucracies. In it, Jackall made several observations that dovetailed with those of Arendt. The mid-level managers that he spoke with were not "evil" people in their everyday lives, but in the context of their jobs, they had a separate moral code altogether, what Jackall calls the "fundamental rules of corporate life":
(1) You never go around your boss.
(2) You tell your boss what he wants to hear, even when your boss claims that he wants dissenting views.
(3) If your boss wants something dropped, you drop it.
(4) You are sensitive to your boss's wishes so that you anticipate what he wants; you don't force him, in other words, to act as a boss.
(5) Your job is not to report something that your boss does not want reported, but rather to cover it up. You do your job and you keep your mouth shut.
Jackall went through case after case in which managers violated this code and were drummed out of a business (for example, for reporting wrongdoing in the cleanup at the Three Mile Island nuclear power plant).
Aaron Swartz counted "Moral Mazes" among his "very favorite books." Swartz was the Internet wunderkind who was hounded by a government prosecution threatening him with 35 years in jail for illicitly downloading academic journal articles from behind a pay wall. Swartz, who committed suicide in January at age 26 (many believe because of his prosecution), said that "Moral Mazes" did an excellent job of "explaining how so many well-intentioned people can end up committing so much evil."
Swartz argued that it was sometimes necessary to break the rules that required obedience to the system in order to avoid systemic evil. In Swartz's case the system was not a corporation but a system that kept knowledge bottled up when it should have been available to all. Swartz engaged in an act of civil disobedience to liberate that knowledge, arguing that "there is no justice in following unjust laws. It's time to come into the light and, in the grand tradition of civil disobedience, declare our opposition to this private theft of public culture."
Chelsea Manning, the United States Army private incarcerated for leaking classified documents from the Departments of Defense and State, felt a similar pull to resist the internal rules of the bureaucracy. In a statement at her trial she described a case where she felt this was necessary. In February 2010, she received a report of an event in which the Iraqi Federal Police had detained 15 people for printing "anti-Iraqi" literature. Upon investigating the matter, Manning discovered that none of the 15 had previous ties to anti-Iraqi actions or suspected terrorist organizations. Manning had the allegedly anti-Iraqi literature translated and found that, contrary to what the federal police had said, the published literature in question "detailed corruption within the cabinet of Prime Minister Nuri Kamal al-Maliki's government and the financial impact of his corruption on the Iraqi people."
When Manning reported this discrepancy to the officer in charge (OIC), she was told to "drop it," she recounted.
Manning could not play along. As she put it, she knew if she "continued to assist the Baghdad Federal Police in identifying the political opponents of Prime Minister al-Maliki, those people would be arrested and in the custody of the Special Unit of the Baghdad Federal Police and very likely tortured and not seen again for a very long time -- if ever." When her superiors would not address the problem, she was compelled to pass this information on to WikiLeaks.
Snowden too felt that, confronting what was clearly wrong, he could not play his proper role within the bureaucracy of the intelligence community. As he put it,
[W]hen you talk to people about [abuses] in a place like this where this is the normal state of business, people tend not to take them very seriously and move on from them. But over time that awareness of wrongdoing sort of builds up and you feel compelled to talk about [them]. And the more you talk about [them], the more you're ignored, the more you're told it's not a problem, until eventually you realize that these things need to be determined by the public and not by somebody who was simply hired by the government.
The bureaucracy was telling him to shut up and move on (in accord with the five rules in "Moral Mazes"), but Snowden felt that doing so was morally wrong.
In a June Op-Ed in The Times, David Brooks made a case for why he thought Snowden was wrong to leak information about the Prism surveillance program. His reasoning cleanly framed the alternative to the moral code endorsed by Swartz, Manning and Snowden. "For society to function well," he wrote, "there have to be basic levels of trust and cooperation, a respect for institutions and deference to common procedures. By deciding to unilaterally leak secret N.S.A. documents, Snowden has betrayed all of these things."
The complaint is eerily parallel to one from a case discussed in "Moral Mazes," where an accountant was dismissed because he insisted on reporting "irregular payments, doctored invoices, and shuffling numbers." The complaint against the accountant by the other managers of his company was that "by insisting on his own moral purity ... he eroded the fundamental trust and understanding that makes cooperative managerial work possible."
But wasn't there arrogance or hubris in Snowden's and Manning's decisions to leak the documents? After all, weren't there established procedures determining what was right further up the organizational chart? Weren't these ethical decisions better left to someone with a higher pay grade? The former United States ambassador to the United Nations, John Bolton, argued that Snowden "thinks he's smarter and has a higher morality than the rest of us ... that he can see clearer than the other 299,999,999 of us, and therefore he can do what he wants. I say that is the worst form of treason."
For the leaker and whistle-blower, the answer to Bolton is that there can be no expectation that the system will act morally of its own accord. Systems are optimized for their own survival, and preventing the system from doing evil may well require breaking with organizational niceties, protocols or laws. It requires stepping outside of one's assigned organizational role. The chief executive is not in a better position to recognize systemic evil than is a mid-level manager or, for that matter, an IT contractor. Recognizing systemic evil does not require rank or intelligence, just honesty of vision.
Persons of conscience who step outside their assigned organizational roles are not new. There are many famous earlier examples, including Daniel Ellsberg (the Pentagon Papers), John Kiriakou (of the Central Intelligence Agency) and several former N.S.A. employees, who blew the whistle on what they saw as an unconstitutional and immoral surveillance program (William Binney, Russ Tice and Thomas Drake, for example). But it seems that we are witnessing a new generation of whistle-blowers and leakers, which we might call generation W (for the generation that came of age in the era of WikiLeaks, and now the war on whistle-blowing).
The media's desire to psychoanalyze members of generation W is natural enough. They want to know why these people are acting in a way that they, members of the corporate media, would not. But sauce for the goose is sauce for the gander; if there are psychological motivations for whistle-blowing, leaking and hacktivism, there are likewise psychological motivations for closing ranks with the power structure within a system -- in this case a system in which the corporate media plays an important role. Similarly, it is possible that the system itself is sick, even though the actors within the organization are behaving in accord with organizational etiquette and respecting the internal bonds of trust.
Just as Hannah Arendt saw that the combined action of loyal managers can give rise to unspeakable systemic evil, so too generation W has seen that complicity within the surveillance state can give rise to evil as well -- not the horrific evil that Eichmann's bureaucratic efficiency brought us, but still an Orwellian future that must be avoided at all costs.
Peter Ludlow is a professor of philosophy at Northwestern University and writes frequently on digital culture, hacktivism and the surveillance state.