Thursday, February 7, 2013

How Far Is Too Far In Automated Weaponry?

This article reports on the issue of technological advancement in warfare: how weapons develop and how we fight. Seemingly flying under the radar, the move toward totally autonomous weapons that would make life-or-death decisions independent of any human input is a frightening but not farfetched reality. Although there are certainly positive consequences of these advances, including avoiding the loss of life among the attacking nation's military personnel, the potential negative consequences are too significant to ignore.

The idea of remote and automated weaponry and equipment is not a new one; advances in technology have inevitably led to a decreased dependence on human infantry and presence in many different situations. But the leap to fully autonomous drones, weaponry, or equipment is one that must be seriously considered from all angles. One of the most significant arguments against autonomous weapons concerns human consciousness and the capacity for reasoned, guided decision making. This is not only a significant point but, I think, the pivotal and most important one. Because humans have a mind and a conscience that can weigh not only tactical situations but also questions of human rights, this is a point that cannot be compromised. If drones are able to make battlefield decisions with the ultimate impact falling on civilians, groups of people, and potentially innocent humans, I don't think we can comfortably pass this decision and judgment to a machine designed with only mission success in "mind."

In addition, autonomy would make it easier to pass off blame or responsibility for any potential atrocities against humanity. As with many of the human rights agreements in place today, because there is no real personal accountability in this realm, there will be an unclear boundary between what is right and wrong, and when a wrong occurs, it will be unclear who should be held accountable.
This unclear boundary will most certainly be stretched to its limit, as virtually any boundary is, and the consequences should be weighed carefully by all parties involved in making this technology a worldwide reality. Although I certainly support eliminating the potential loss of human life through remote weaponry, I believe we must weigh the gravity of this situation and its potential effect on humanity and human rights.

1 comment:

  1. I agree with your opinions on this article. Autonomous weapon systems are very appealing in that deploying them would not put allied lives in danger. These robots would be able to judge a situation, detect targets, and make battlefield decisions without risking soldiers' lives. However, as the article states, these robots “would be unable to distinguish adequately between combatants and civilians.” The potential loss of civilian lives as robots try to maneuver through complicated battle situations is not worth it in the end. I believe that while militaries can place robots in battle situations, human oversight should always be in place to minimize errors.
    As you stated, the loss of accountability is also a major issue that should be raised in the discussion. Since robots aren't human, they can't recognize potential violations of established human rights agreements. If a robot makes a miscalculation and accidentally kills a civilian, who is held accountable? The key difference between drones and robots is that with drones, people are behind the decisions and can decide when to fire. When people are in control of the drones, they are more liable to uphold human rights statutes. As the author suggests, a ban on the “killer robots” would put a stop to the rise in the use of these weapons and, potentially, to the rise in unnecessary civilian deaths. I believe this issue should be discussed now, before it becomes too late. If a decision is made now to either ban or limit the use of these robots, it could shape the way the military and future wars develop.