Killer Robots In Warfare

They have no fear, they never tire, they are not upset when the soldier next to them gets blown to pieces. Their morale doesn't suffer by having to do, again and again, the jobs known in the military as the Three Ds - dull, dirty and dangerous.

They are military robots and their rapidly increasing numbers and growing sophistication may herald the end of thousands of years of human monopoly on fighting war. "Science fiction is moving to the battlefield. The future is upon us," as Brookings scholar Peter Singer put it to a conference of experts at the U.S. Army War College in Pennsylvania this month.

Singer has just published Wired for War: The Robotics Revolution and Conflict in the 21st Century, a book that traces the rise of the machines and predicts that in future wars they will play greater roles not only in executing missions but also in planning them.

Numbers reflect the explosive growth of robotic systems. The U.S. forces that stormed into Iraq in 2003 had no robots on the ground. There were none in Afghanistan either. Now those two wars are fought with the help of an estimated 12,000 ground-based robots and 7,000 unmanned aerial vehicles (UAVs), the technical term for drones, or robotic aircraft.

Ground-based robots have saved hundreds of lives in Iraq, defusing the improvised explosive devices that account for more than 40 percent of U.S. casualties. The first armed robot was deployed in Iraq in 2007, and it is as lethal as its acronym is long: the Special Weapons Observation Remote Reconnaissance Direct Action System (SWORDS). Its mounted M249 machine gun can hit a target more than 3,000 feet away with pinpoint precision.

From the air, the best-known UAV, the Predator, has killed dozens of insurgent leaders - as well as scores of civilians whose deaths have prompted protests from both Afghanistan and Pakistan.

The Predators are flown by operators sitting in front of television monitors in cubicles at Creech Air Force Base in Nevada, some 8,000 miles from Afghanistan and the Taliban sanctuaries on the Pakistani side of the border. The cubicle pilots in Nevada run no physical risk whatever, a novelty for men engaged in war.

TECHNOLOGY RUNS AHEAD OF ETHICS

Reducing risk, and casualties, is at the heart of the drive for more and better robots. Ultimately, that means "fully autonomous engagement without human intervention," according to an Army communication to robot designers. In other words, computer programs, not a remote human operator, would decide when to open fire. What worries some experts is that the technology is running ahead of deliberation on the ethical and legal questions it raises.
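To make that distinction concrete, here is a deliberately simplified sketch in Python - the names are hypothetical and bear no relation to any real weapon system - of the gap between a human-in-the-loop engagement decision and a fully autonomous one.

```python
# Deliberately simplified, hypothetical sketch -- not any real system's code.
from dataclasses import dataclass

@dataclass
class Track:
    label: str
    hostile_confidence: float  # classifier output between 0.0 and 1.0

def operator_approves(track: Track) -> bool:
    """Stand-in for the remote human operator in today's Predator cubicles."""
    answer = input(f"Engage {track.label} "
                   f"(confidence {track.hostile_confidence:.2f})? [y/N] ")
    return answer.strip().lower() == "y"

def engage_human_in_the_loop(track: Track) -> bool:
    """Current practice: software proposes a target, a person decides."""
    return operator_approves(track)

def engage_autonomously(track: Track, threshold: float = 0.95) -> bool:
    """'Fully autonomous engagement': the program itself makes the fire decision."""
    return track.hostile_confidence >= threshold

if __name__ == "__main__":
    track = Track(label="unidentified vehicle", hostile_confidence=0.97)
    print("autonomous fire decision:", engage_autonomously(track))
```

In the first function a person still pulls the trigger; in the second, a confidence threshold does.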

Robotics research and development in the U.S. received a big push from Congress in 2001, when it set two ambitious goals: by 2010, a third of the country's long-range attack aircraft should be unmanned; by 2015, a third of its ground combat vehicles. Neither goal is likely to be met, but the deadlines pushed non-technological considerations to the sidelines.

A recent study prepared for the Office of Naval Research by a team from the California Polytechnic State University said that robot ethics had not received the attention it deserved because of a "rush to market" mentality and the "common misconception" that robots will do only what they have been programmed to do.

"Unfortunately, such a belief is sorely outdated, harking back to the time when computers were simpler and their programs could be written and understood by a single person," the study says. "Now programs with millions of lines of code are written by teams of programmers, none of whom knows the entire program; hence, no individual can predict the effect of a given command with absolute certainty since portions of programs may interact in unexpected, untested ways."

That's what might have happened during an exercise in South Africa in 2007, when a robot anti-aircraft gun sprayed hundreds of rounds of cannon shell around its position, killing nine soldiers and injuring 14.

Beyond isolated accidents, there are deeper problems that have yet to be solved. How do you get a robot to tell an insurgent from an innocent? Can you program the Laws of War and the Rules of Engagement into a robot? Can you imbue a robot with its country's culture? If something goes wrong, resulting in the death of civilians, who will be held responsible?

The robot's manufacturer? The designers? Software programmers? The commanding officer in whose unit the robot operates? Or the U.S. president who in some cases authorizes attacks? (Barack Obama has given the green light to a string of Predator strikes into Pakistan.)

While the United States has deployed more military robots - on land, in the air and at sea - than any other country, it is not alone in building them. More than 40 countries, including potential adversaries such as China, are working on robotics technology. That leaves one to wonder how the ability to send large numbers of robots, and fewer soldiers, to war will affect political decisions on force versus diplomacy.

You need to be an optimist to think that political leaders will opt for negotiation over war once combat casualties come home not in flag-decked coffins but in packing crates destined for the robot repair shop.

JAS

Inventor, Technologist, Futurist.

http://www.evilrobot.com