Tuesday, April 30, 2013

Heather Roff Blog: How Automated Wars Rob Us Of Humanity





Jack Kirby's The Destroyer, from the Mighty Thor.
Hannah Arendt once used the phrase "the banality of evil" to describe the character of Adolf Eichmann's acquiescence in committing atrocities for the Nazi regime.

What this phrase means, in Eichmann's case, is that it was his "sheer thoughtlessness -- something by no means identical with stupidity -- that predisposed him to become one of the greatest criminals of that period."

Indeed, it is "that such remoteness from reality and such thoughtlessness can wreak more havoc than all the evil instincts taken together." To say that evil is in this sense banal means that there is no thought -- no decision -- to be (or to act) evil. It is commonplace, and it is a lack of thinking that results in the most horrific of actions.

Thus Eichmann's most dangerous element was that he threw away what it meant to be human -- he threw away his capacity for rational thought and reflection on right and wrong, good and evil.

We are at a similar juncture with regard to a "lack of thinking." In our case, however, it concerns the delegation of thinking to a machine, and to a lethal machine in particular. What I mean here is that militaries, and the U.S. military in particular, envision a future where weapons do the thinking -- that is, the planning, target selection and engagement.

Already the U.S. military services have capabilities that enable weapons to seek out and cue targets, such as the F-35 Joint Strike Fighter and some targeting software platforms on tanks like the M1 Abrams, as well as systems that seek out targets and engage them automatically, like the Phalanx or Counter Rocket, Artillery and Mortar (C-RAM) systems.

The U.S. decision to rely on unmanned aerial vehicles, or "drones," attests to the appeal of fighting at a distance with automated technology. The drones currently in combat operations, such as the Predator and Reaper, show the ease with which killing by remote control can be accomplished.

While drones are certainly problematic from a legal and moral standpoint with regard to targeted killings, human beings still ultimately control this type of technology. Human pilots are in the "cockpit," and for better or worse there are human beings making targeting decisions.

The worry, however, is that militaries are planning to push autonomy further than the F-35 Joint Strike Fighter (which is far more autonomous than the Predator or Reaper) toward "fully autonomous" weapons.

Moreover, while we might try to push this worry aside and claim that it is a long way off, or too futuristic, we cannot deny the middle term between now and "fully autonomous" weapons. In this middle term, the warfighter will become increasingly dependent upon such technologies to fight.

Indeed, we already see this in "automation bias" -- the over-reliance on information generated by an automated process as a replacement for vigilant information seeking and processing. With increased dependence on the technology, this automation bias will only grow, and it will lead to a degeneration not only of strategic thinking in the services but, as in the case of Eichmann, of thinking more generally.

The evil here is that, through the banality of autonomy, we risk not only creating a class of unthinking warfighters but also letting the entire business of making war become so removed from human judgment and critical thinking that it too becomes commonplace.

In fact, it might become so banal, so removed from human agency, that even the word "war" starts to lose meaning. For what would we call a conflict where one side, or both, hands over the "thinking" to a machine, doesn't risk its soldiers' lives, and perhaps doesn't even place human beings outside of its own borders to fight? "War" does not really seem to capture what is going on here.

The danger, of course, is that conflicts of this type might not only perpetuate asymmetric violence but also further erode the very foundations of humanity. In other words, if we are not careful about the increasing push toward autonomous weapons, we risk vitiating the thinking, judging and thus rational capacity of humanity.

What was once merely automation bias becomes the banality of autonomy, and in an ironic twist, humans lose their own ability to be "autonomous."

The human warfighter is now the drone.



No one could draw mind-blowing technology quite like Jack Kirby.

"The release of atom power has changed everything except our way of thinking. The solution to this problem lies in the heart of mankind. If only I had known, I would have become a watchmaker." -- Albert Einstein





