Saturday, 24 November 2012

The Future of Global Warfare: Killer Robots

Human Rights Watch.

Despite a lack of public awareness and public debate, a number of governments, including European states, are pushing forward with the development of fully autonomous weapons - also known as killer robots. These are weapon systems that would function without any human intervention: the armed robot itself would select its target and determine when to fire. This is a frighteningly dangerous path to follow in terms of the need to protect civilians during armed conflict.

Killer robots would be unable to distinguish adequately between combatants and civilians in the increasingly complex circumstances of modern battlefields, and would be unable to make proper proportionality determinations - that is, judgments about whether the expected military advantage of an attack outweighs the likely harm to civilians. Giving machines the power to decide who lives and dies on the battlefield would take technology too far. Killer robots would lack the human qualities necessary to protect civilians and comply with international humanitarian law. They would lack the ability to relate to humans and to apply human judgment.

They would also create an accountability gap, as it would be unclear who should be held responsible for the violations of the law that would inevitably occur. Fully autonomous weapons do not yet exist, though precursors do, and those precursors demonstrate the rapid movement toward autonomy and toward replacing humans on the battlefield. The United States is the most active in developing these technologies, but other countries, including China, Germany, Israel, Russia, South Korea and the United Kingdom, are doing so as well.

Sophisticated fully autonomous weapons may be fielded within 20 or 30 years, according to many experts, some of whom indicate that cruder versions could be available much sooner - in a matter of years, not decades. Armed drones are not fully autonomous weapons and are not part of the call for a ban. Human Rights Watch has extensively criticised the way drones have been used - for extrajudicial killings, for example - but the key issue with drones is not the nature of the weapon, as it is with fully autonomous weapons. Drones have a 'man in the loop': a human remotely selects the target and decides when to fire. With killer robots, the human is out of the loop and the machine determines what to attack and when.

We have just released the first in-depth report by a non-governmental organisation looking at this issue, Losing Humanity: The Case Against Killer Robots. We conclude that these weapons would not be able to comply with international humanitarian law standards and would pose unacceptable dangers to civilians. We are therefore calling for a pre-emptive ban on the development, production and use of fully autonomous weapons. Governments should enact such a ban at the national level, as a stepping stone to an international treaty with a comprehensive prohibition.
