The Ethics of Unmanned Warfare and the Truman Conundrum
“It is important for everybody to understand that this thing is kept on a very tight leash. It’s not a bunch of folks in a room somewhere just making decisions.” – President Barack Obama
For many, the first glimpse of computer-age weaponry came in the grainy nosecone footage of American smart bombs striking Iraqi targets in 1991. The images left many amazed and asking, "do we [mankind] really have the technology to do that?" Lurch forward another ten years to the use of drone aircraft in the Afghan War. The euphoria is over. The technology is an established part of the U.S. arsenal, and the use of drones in Afghanistan and Pakistan has been reduced to a scroll at the bottom of the news screen.
American military technology continues to advance, as all technology does. The rise of these machines is accompanied by a multitude of questions that are important to consider as the weapons are developed and change warfare. The American public has reacted indifferently to news of an unmanned drone campaign in Pakistan; very little is being said in opposition to it.
It appears that the idea of military action without risking lives is a palatable dish for the American public. There has been only scant domestic criticism. During the week of May 27th, it was disclosed that the drone attacks were guided by a "kill list" overseen by the Obama administration. Yet even this will probably yield very little reaction.
Americans see little difference between this and the Ten Most Wanted List published by the FBI (Federal Bureau of Investigation), or the decks of playing cards circulated during the first Gulf War, each card featuring the picture of a different high-profile member of Saddam Hussein's regime. These instances, like the drone attacks in Pakistan, are viewed as preemptive strikes against a known terrorist group. This nonchalance brings the subject of unmanned warfare to the center of the debate. Unmanned aerial drones have proven to be effective weapons, but what would the reaction be to fielding other unmanned combat machines, perhaps unmanned ground fighting vehicles?
Technology has jumped to the forefront of military development. Images of the MQ-1 "Predator" and MQ-9 "Reaper" have become among the most iconic of American military operations. These two platforms are now mainstays of the American military, accounting for 30% of its aircraft fleet. Though procurement of the earlier "Predator" has ended, the more lethal "Reaper" is still produced and purchased by the U.S. armed forces.
The expansion of advanced unmanned technology is not confined to the skies. Several unmanned ground weapons are in development. The Modular Advanced Armed Robotic System (MAARS), produced by QinetiQ North America, is a multi-use machine that can serve as an armed early-warning system, equipped with a full complement of cameras and motion detectors. The "Gladiator" TUGV was an Unmanned Ground Vehicle (UGV) developed by the U.S. Marine Corps and the Carnegie Mellon University National Robotics Engineering Center. The program was defunded in 2007, but not before producing a machine that can carry a variety of weapons into combat. Far more advanced is the "Black Knight," a 12-ton combat vehicle armed with a 30mm automatic cannon. All of these machines are unmanned combat platforms that present the U.S. and allied countries with serious questions.
The development of these weapons presents military policymakers with a solution and a new set of problems all at once. Critics of war, especially in the United States, bandy about the "blood and treasure" that has been sacrificed. Unmanned combat machines remove the blood and are, in reality, a financial windfall. The economics hold even greater relevance at a time when many Western countries face tight budgets and cuts to their militaries. These machines are expensive, yet less so when the full cost of the programs is compared to the cost of training and maintaining a standing or even reserve force. The typical soldier is paid a salary and receives healthcare, food, clothing, and a housing allowance, just to start. Additionally, soldiers are human and fatigue; they need sleep and time away from forward combat areas. Unmanned fighting platforms require little of this. Many will ask: what of the human element of judgment and morality?
It is important to note that these unmanned machines are still controlled by a human, but a human that is not in harm’s way. Would a nation be more willing to go to war knowing the casualties would be minimal by all standards?
Unfortunately, the answer is yes. This is evident in the lack of criticism concerning U.S. drone sorties over Pakistan. Pivot from that question to one of responsibility. Doesn’t a country like the U.S. have an obligation to show restraint, especially against countries without such technology? Thus, we face what I refer to as the “Truman Conundrum.”
The Truman Conundrum is the choice countries and leaders face over whether to use massive force and advanced technology to save the lives of their own combatants. U.S. President Truman understood the staggering casualties American forces would have suffered in an assault on mainland Japan at the end of World War Two. Hence, Truman made the decision to use nuclear devices and defeat the Japanese empire. The nuclear devices killed many Japanese, including civilians, but saved American lives.
The use of U.S. atomic weapons is comparable to the use of drones in Pakistan. The decision to use advanced weapons is historically simple and academic for leaders. It is important to point out to ethicists and others who would debate the use of unmanned military technology that this new generation of weapons is already in use. The debate about the fairness and humanity in using them is over. The decision has already been made.