Military Review English Edition September-October 2014 - Page 80

[Figure: Atomic cloud over Nagasaki from Koyagi-jima, 9 August 1945. Photo by Hiromichi Matsuda.]

Ignoring several indicators of likely negative second- and third-order effects, such as the proliferation of atomic weapons, the dehumanization of the Japanese people, and the expansion of the Soviet Union into East Asia, the Truman Administration pushed a complex situation over the cliff into chaos.

Contemporary Macro-ethical Analysis: Drones

Are we falling into a similar trap as we prosecute the war on terrorism? Are we attempting to answer complex questions with simple answers? In our just endeavor to defeat global terrorism, are we failing to see adverse second- and third-order effects of our tactical actions? Consider our use of drones.

Use of drones is an example of a morally permissible tactical action that is producing a morally undesirable strategic outcome. Once again, it seems as if we are attempting to make decisions in an ordered-obvious domain while not grasping the complexity of the operational environment. The logic is deceptively simple, but seriously flawed: killing a legitimate target during war is a morally permissible act; killing a legitimate target while safeguarding a nation’s forces is morally permissible and fulfills a leader’s obligation to care for the troops; known terrorists are legitimate targets. It is simple: so, what is the problem?

In response, if the object is to reduce the number of terrorists, what if the use of drones as a tactic is actually producing more terrorists while also delegitimizing our global narrative with regard to holding the moral high ground? More terrorists would mean a longer war and more killing. Delegitimizing our narrative would go against strategic counterinsurgency goals by producing international and domestic outrage. Consequently, an action we might consider morally permissible at the tactical level would be producing results that ran counter to our overall strategic goals.
If such are the actual results, the outcome would not be considered acceptable. Moreover, when taking into account the perspectives of others, the action would be considered morally dubious.

During the 2012 Fort Leavenworth Ethics Symposium, Dr. Daniel M. Bell addressed such issues in what he called the problems of distance as related to drones.32 He expressed concern that use of drones dehumanizes our enemies in the minds of our soldiers by creating what he termed “a PlayStation mentality.”33 He also said that drones may convey an impression of cowardice to those sympathizing with our enemies.34 Therefore, if killing is no more than a video game, we find ourselves on a slippery ethical slope.35

Bell discussed a topic he called “character and the profession of arms.”36 His thought-provoking conclusion was that by using drones, we are in danger of “technology replacing character.”37 According to Bell, technology is only as good as the people employing it.38 Furthermore, he said we (U.S. military leaders) stand in danger of becoming mere button pushers in a military led by “tactical generals and presidents.”39 He asked, “Who is thinking strategically?”40 Bell also questioned whether technology has “economized our virtues.”41 He said that drones create “less room for profession, for judgment and virtues of professional soldiers”42 and that we are at “risk of becoming mere technicians.”43

With these ideas in mind, does the use of drones atrophy our strategic judgment? What is the long-term strategic goal behind the long-distance killing of what are considered legitimate targets? Does this type of tactical action lead to achieving the strategic goal? Are we becoming complacent, using techniques suited for an obvious context while ignoring the complexity of the situation? Are we in danger of “falling over the cliff” into chaos?
Understanding this problem within a complex domain, we need to return to the Cynefin Framework’s