Danlina wrote:
KNOWING that one of the advancements a nation can make in technology is Artificial Intelligence, and that technologies such as this are used in Lethal Autonomous Weapon Systems (LAWSs).
OOC: This needs to be in the active clauses as a definition, not in the preamble. Also, artificial intelligence refers to human-grade-or-higher intelligence, so think of a different wording, given you're specifically not talking about those.
RECOGNIZING the immorality embedded in giving a machine full responsibility over life and death situations on the battlefield.
This is also irrelevant, given you're not doing that.
BELIEVING that there should always be some degree of sapient intervention in offensive weapons systems.
Hence mentioning AIs in the first preamble clause is a bad idea and confuses everyone.
RESOLVING that there should be international monitoring and restrictions set on LAWSs.
Why?
On this matter, the GA thus declares that, except as limited by earlier resolutions that are still in place:
Start the whole proposal with "The World Assembly" or "The General Assembly", and replace this line with "Hereby, [exemption goes here]".
LAWSs are defined as weapons systems that operate independently from their sapient users while being used and are programmed to engage in lethal tactics.
Spell out the full words here; basically, move the definition from the preamble to this clause. Also, if they're fully automated, they don't really have "users" as such, now do they? Deployers, maybe?
The World Assembly mandates boycotting the manufacturing of and use of offensive LAWSs on the battlefield to all of its members.
You still haven't explained what makes them so bad that they would need to be banned. Also, if you're allowing their use defensively, why are you requiring "boycotting the manufacturing of"? And why boycotting? Why not just banning?
The World Assembly mandates that to be manufactured, sold, bought, transferred and used by its members, a LAWS must have the following qualities:
- Can explain their reasoning and decisions to sapient operators in transparent and understandable ways.
- Have responsible sapient operators who are clearly identifiable.
- Have autonomous functions that are predictable to their operators.
- Be manufactured for defensive purposes and used solely by defensive units.
- Fit into previous laws of humane weaponry.
But their manufacture must still be boycotted? Also, if you've got a battlefield, and you put an "automated machine gun" (your choice of example!) there to defend your own people, and move it forward as you advance, is it still defensive use? And do aerial drones carrying missiles count? You say in the thread that guided missiles don't count, but I'm not seeing anything here that would make them not count. Especially as missiles are hardly either humane or defensive use items.