[DRAFT] A Resolution to Ban the Use of Autonomous Weapons

Where WA members debate how to improve the world, one resolution at a time.


Separatist Peoples
GA Secretariat
 
Posts: 16989
Founded: Feb 17, 2011
Left-Leaning College State

Postby Separatist Peoples » Sun May 10, 2015 5:09 am

Imperium Anglorum wrote:
Separatist Peoples wrote:Ending life is exactly what those weapons are for. Perhaps the lack of efficiency is an acceptable casualty to you, but not to us.

Man! You should've said something like that during all those discussions about regulating war!

"Not once did my resolutions on war attempt to regulate types of weapons or what amounts to mistakes. I sought to regulate deliberate targeting of those who should not be targeted. There is a large difference between the two."

His Worshipfulness, the Most Unscrupulous, Plainly Deceitful, Dissembling, Strategically Calculating Lord GA Secretariat, Authority on All Existence, Arbiter of Right, Toxic Globalist Dog, Dark Psychic Vampire, and Chief Populist Elitist!
Separatist Peoples should RESIGN!

Athretvari
Diplomat
 
Posts: 574
Founded: Apr 29, 2012
New York Times Democracy

Postby Athretvari » Sun May 10, 2015 7:26 am

The Athretvar Ai view this proposal as a direct existential threat to the security and defense of all non-human and advanced AI civilizations.

Our government and cyberbeings humbly request that all enlightened Organic civilizations reject this assault upon the inalienable rights of all species—human, non-human and AI—to self-defense.
Athretvari
The Realms Banner (flag)
Yeah… I know. It’s a tough one. You can skip

Terricon
Spokesperson
 
Posts: 176
Founded: Mar 01, 2015
Ex-Nation

Postby Terricon » Sun May 10, 2015 7:28 am

For, I wish to see all military and non-military drones banned.
Robespierre FTW

Senkaku
Postmaster of the Fleet
 
Posts: 26717
Founded: Sep 01, 2012
Corrupt Dictatorship

Postby Senkaku » Sun May 10, 2015 8:25 am

As drones, robots, and AI systems form the backbone of the Shanhe Kurun's military and government, we wish to voice our strong disapproval. Such a bill would cripple the established and functioning governments of many states that use AI systems for governance purposes, and pose a serious threat to the national security of all states that use automated weapons systems for their defense.
Biden-Santos Thought cadre

Excidium Planetis
Powerbroker
 
Posts: 8067
Founded: May 01, 2014
Ex-Nation

Postby Excidium Planetis » Sun May 10, 2015 9:11 am

Terricon wrote:For, I wish to see all military and non-military drones banned.


Do you realize this proposal would completely destroy the capability of thousands of nations to wage war? How many non-human civilizations would not be able to defend themselves under these rules? Not even the pony nations would be able to defend themselves, as they'd need a human's decision to fire a weapon! And that is only one example of the many types of countries left defenseless under these unnecessary regulations.
Current Ambassador: Adelia Meritt
Ex-Ambassador: Cornelia Schultz, author of GA#355 and GA#368.
#MakeLegislationFunnyAgain
Singaporean Transhumans wrote:You didn't know about Excidium? The greatest space nomads in the NS multiverse with a healthy dose (read: over 9000 percent) of realism?
Saveyou Island wrote:"Warmest welcomes to the Assembly, ambassador. You'll soon learn to hate everyone here."
Imperium Anglorum wrote:Digital Network Defence is pretty meh
Tier 9 nation, according to my index. Made of nomadic fleets.


News: AI wins Dawn Fleet election for High Counselor.

Dooom35796821595
Powerbroker
 
Posts: 9309
Founded: Sep 11, 2011
Father Knows Best State

Postby Dooom35796821595 » Sun May 10, 2015 10:56 am

There's a lot of "whereas" in this resolution, and the only actionable clause is "Prohibiting the use and development of autonomous weapons," without defining what it would and would not cover or what an autonomous weapon is. Not to mention that the mention of humans in such a context suggests that any non-human would need human authorisation to carry out any level of military operations.

Not to mention the fact that there are lots of scenarios where a cold, calculating machine guided only by simple programming can be more advantageous than the hesitations and uncertainties of higher-thinking beings, like an automated turret that targets armed hostiles attempting to cross a DMZ, or point guards in urban warfare.
When life gives you lemons, you BURN THEIR HOUSE DOWN!
Anything can be justified if it is cool. If at first you don't succeed, destroy all in your way.
"Your methods are stupid! Your progress has been stupid! Your intelligence is stupid! For the sake of the mission, you must be terminated!”

Separatist Peoples
GA Secretariat
 
Posts: 16989
Founded: Feb 17, 2011
Left-Leaning College State

Postby Separatist Peoples » Sun May 10, 2015 11:15 am

Dooom35796821595 wrote:There's a lot of "whereas" in this resolution, and the only actionable clause is "Prohibiting the use and development of autonomous weapons," without defining what it would and would not cover or what an autonomous weapon is. Not to mention that the mention of humans in such a context suggests that any non-human would need human authorisation to carry out any level of military operations.

Not to mention the fact that there are lots of scenarios where a cold, calculating machine guided only by simple programming can be more advantageous than the hesitations and uncertainties of higher-thinking beings, like an automated turret that targets armed hostiles attempting to cross a DMZ, or point guards in urban warfare.

"Indeed. Machines are also immune to the high-stress mental breakdowns that cause most serious war crimes (OOC: like the My Lai massacre) and to which non-machines are often susceptible. And we sure wouldn't want to lose our chemical weapons disposal droids, what with their ability to decide the best time to detonate disposal ordnance based on immediate and projected micro-meteorological data collection. As the explosive disposal charges are the same compounds we use in some conventional weapons, we'd be forced to risk lives to achieve the same ends."

His Worshipfulness, the Most Unscrupulous, Plainly Deceitful, Dissembling, Strategically Calculating Lord GA Secretariat, Authority on All Existence, Arbiter of Right, Toxic Globalist Dog, Dark Psychic Vampire, and Chief Populist Elitist!
Separatist Peoples should RESIGN!

Blaccakre
Attaché
 
Posts: 87
Founded: Apr 14, 2015
Ex-Nation

Postby Blaccakre » Sun May 10, 2015 11:40 am

Separatist Peoples wrote:
Dooom35796821595 wrote:There's a lot of "whereas" in this resolution, and the only actionable clause is "Prohibiting the use and development of autonomous weapons," without defining what it would and would not cover or what an autonomous weapon is. Not to mention that the mention of humans in such a context suggests that any non-human would need human authorisation to carry out any level of military operations.

Not to mention the fact that there are lots of scenarios where a cold, calculating machine guided only by simple programming can be more advantageous than the hesitations and uncertainties of higher-thinking beings, like an automated turret that targets armed hostiles attempting to cross a DMZ, or point guards in urban warfare.

"Indeed. Machines are also immune to the high-stress mental breakdowns that cause most serious war crimes (OOC: like the My Lai massacre) and to which non-machines are often susceptible. And we sure wouldn't want to lose our chemical weapons disposal droids, what with their ability to decide the best time to detonate disposal ordnance based on immediate and projected micro-meteorological data collection. As the explosive disposal charges are the same compounds we use in some conventional weapons, we'd be forced to risk lives to achieve the same ends."

Yes, murderous death machines are all well and good until that unflinching kill bot liquidates a bus full of children that fits its "threat" parameters. But it's all good, because at least we'll know the machine was doing what it was designed to do and not having a "high-stress mental breakdown."
Last edited by Blaccakre on Sun May 10, 2015 11:41 am, edited 1 time in total.
The Glorious, Unparalleled, Doubleplusgood Kingdom of Blaccakre
"There is no justice, only the Law."

Any effort by World Assembly Census experts to label our glorious nation as "corrupt," or to claim that we have "short average lifespans" and "ignorant citizens," shall be treated as belligerent propaganda and will result in severe reprisal.

United Soviet Soc American Republics
Civil Servant
 
Posts: 7
Founded: Apr 29, 2015
Ex-Nation

Postby United Soviet Soc American Republics » Sun May 10, 2015 11:44 am

I have no say in this as a non-member, but in my view, more arms control = bad. I won't go any further than that, because I have no say.

Separatist Peoples
GA Secretariat
 
Posts: 16989
Founded: Feb 17, 2011
Left-Leaning College State

Postby Separatist Peoples » Sun May 10, 2015 11:53 am

Blaccakre wrote:Yes, murderous death machines are all well and good until that unflinching kill bot liquidates a bus full of children that fits its "threat" parameters. But it's all good, because at least we'll know the machine was doing what it was designed to do and not having a "high-stress mental breakdown."

"The appeal to emotion is a time-honored but ineffective approach, ambassador. I thought you better than that. You've completely misconstrued autonomous devices and drones in one fell swoop. A machine that, upon finding a coding error, goes into deathmode would be as dangerous to its owners as to its enemies. No reasonable individual would program it thusly, or without an override or shutdown mode as an error protocol. Anything deliberately designed as such would fall afoul of Wartime Looting and Pillaging, and any violations are covered by a legal process. Care to try again with an informed argument, ambassador, or would you like to keep trying to play our heartstrings like an off-key bard?"

His Worshipfulness, the Most Unscrupulous, Plainly Deceitful, Dissembling, Strategically Calculating Lord GA Secretariat, Authority on All Existence, Arbiter of Right, Toxic Globalist Dog, Dark Psychic Vampire, and Chief Populist Elitist!
Separatist Peoples should RESIGN!

Defwa
Minister
 
Posts: 2598
Founded: Feb 11, 2014
Ex-Nation

Postby Defwa » Sun May 10, 2015 12:00 pm

Blaccakre wrote:Yes, murderous death machines are all well and good until that unflinching kill bot liquidates a bus full of children that fits its "threat" parameters. But it's all good, because at least we'll know the machine was doing what it was designed to do and not having a "high-stress mental breakdown."

My problem with the author and yourself is I keep hearing things that apply to humans.

Except, while a human can be a psychopath and blow up a school bus and be absolutely undetectable until the last moment, a flaw in computer programming is a little more easily hammered out.
__________Federated City States of ____________________Defwa__________
Federation Head High Wizard of Dal Angela Landfree
Ambassadorial Delegate Maestre Wizard Mikyal la Vert

President and World Assembly Delegate of the Democratic Socialist Assembly
Defwa offers assistance with humanitarian aid, civilian evacuation, arbitration, negotiation, and human rights violation monitoring.

Blaccakre
Attaché
 
Posts: 87
Founded: Apr 14, 2015
Ex-Nation

Postby Blaccakre » Sun May 10, 2015 12:01 pm

Separatist Peoples wrote:
Blaccakre wrote:Yes, murderous death machines are all well and good until that unflinching kill bot liquidates a bus full of children that fits its "threat" parameters. But it's all good, because at least we'll know the machine was doing what it was designed to do and not having a "high-stress mental breakdown."

"The appeal to emotion is a time-honored but ineffective approach, ambassador. I thought you better than that. You've completely misconstrued autonomous devices and drones in one fell swoop. A machine that, upon finding a coding error, goes into deathmode would be as dangerous to its owners as to its enemies. No reasonable individual would program it thusly, or without an override or shutdown mode as an error protocol. Anything deliberately designed as such would fall afoul of Wartime Looting and Pillaging, and any violations are covered by a legal process. Care to try again with an informed argument, ambassador, or would you like to keep trying to play our heartstrings like an off-key bard?"

I'm not talking about what happens when the kill bot malfunctions, I'm talking about what happens when it does its job according to programming. Machines lack the capacity to make moral judgments; they just follow their programming. So if something fits all of its parameters as a threat, it's going to act on that regardless of whether a person would recognize that what looks like a threat really isn't. For example, a bus full of children hurtling through the DMZ might just be the product of an injured or confused driver. A machine programmed to destroy any vehicle moving quickly into the DMZ is not going to recognize that and is just going to blow shit up. By its nature, it paints with a broad brush: does this fit my programming as a threat? If so, destroy.

This isn't an appeal to emotion, it's an appeal to common sense. We shouldn't let a robot decide who to kill, because even the best programming isn't going to allow for the kind of complex judgements a moral actor can make.
The Glorious, Unparalleled, Doubleplusgood Kingdom of Blaccakre
"There is no justice, only the Law."

Any effort by World Assembly Census experts to label our glorious nation as "corrupt," or to claim that we have "short average lifespans" and "ignorant citizens," shall be treated as belligerent propaganda and will result in severe reprisal.

Blaccakre
Attaché
 
Posts: 87
Founded: Apr 14, 2015
Ex-Nation

Postby Blaccakre » Sun May 10, 2015 12:04 pm

Defwa wrote:
Blaccakre wrote:Yes, murderous death machines are all well and good until that unflinching kill bot liquidates a bus full of children that fits its "threat" parameters. But it's all good, because at least we'll know the machine was doing what it was designed to do and not having a "high-stress mental breakdown."

My problem with the author and yourself is I keep hearing things that apply to humans.

Except, while a human can be a psychopath and blow up a school bus and be absolutely undetectable until the last moment, a flaw in computer programming is a little more easily hammered out.

Maybe we don't put psychopaths in charge of life and death situations either? I'm obviously not talking about replacing kill bots with psychopaths. But at least with the psychopath there's someone who can be brought to justice for the wrong they do.

And if it's so easy to find computer glitches before they cause problems, why are there still problems associated with computer glitches? Also, militaries do tend to vet their soldiers for psychological illness.
Last edited by Blaccakre on Sun May 10, 2015 12:05 pm, edited 1 time in total.
The Glorious, Unparalleled, Doubleplusgood Kingdom of Blaccakre
"There is no justice, only the Law."

Any effort by World Assembly Census experts to label our glorious nation as "corrupt," or to claim that we have "short average lifespans" and "ignorant citizens," shall be treated as belligerent propaganda and will result in severe reprisal.

Separatist Peoples
GA Secretariat
 
Posts: 16989
Founded: Feb 17, 2011
Left-Leaning College State

Postby Separatist Peoples » Sun May 10, 2015 12:13 pm

Blaccakre wrote:I'm not talking about what happens when the kill bot malfunctions, I'm talking about what happens when it does its job according to programming. Machines lack the capacity to make moral judgments; they just follow their programming. So if something fits all of its parameters as a threat, it's going to act on that regardless of whether a person would recognize that what looks like a threat really isn't. For example, a bus full of children hurtling through the DMZ might just be the product of an injured or confused driver. A machine programmed to destroy any vehicle moving quickly into the DMZ is not going to recognize that and is just going to blow shit up. By its nature, it paints with a broad brush: does this fit my programming as a threat? If so, destroy.

This isn't an appeal to emotion, it's an appeal to common sense. We shouldn't let a robot decide who to kill, because even the best programming isn't going to allow for the kind of complex judgements a moral actor can make.

"Tune those strings. As I mentioned, though clearly the subtlety of my reference was lost on you, anything with a threat parameter so loose would be in violation of WL&P. And there is nothing to say that a soldier wouldn't do the same, since many guards are ordered to shoot on sight, and a school bus full of children has no business entering a restricted DMZ. It's an unrealistic example that boils down to "THINK OF THE CHILDREN!!!""

Blaccakre wrote:Maybe we don't put psychopaths in charge of life and death situations either?

"Nations routinely put that decision into the hands of nervous 19 year olds. One doesn't have to be a psychopath to make a bad call. A robot cannot make a bad call the same way at all."

I'm obviously not talking about replacing kill bots with psychopaths. But at least with the psychopath there's someone who can be held accountable to justice for the wrong they do.

"Command responsibility allows us to do exactly that with automatons."

And if it's so easy to find computer glitches prior to them causing problems, why are there still problems associated with computer gliches. Also, militaries do tend to vet their soldiers for psychological illness.

"Why are there still problems if militaries vet their troops for psychological stability? Flaws exist. Machines can be more easily corrected, and their benefits outweigh any cons."

His Worshipfulness, the Most Unscrupulous, Plainly Deceitful, Dissembling, Strategically Calculating Lord GA Secretariat, Authority on All Existence, Arbiter of Right, Toxic Globalist Dog, Dark Psychic Vampire, and Chief Populist Elitist!
Separatist Peoples should RESIGN!

Defwa
Minister
 
Posts: 2598
Founded: Feb 11, 2014
Ex-Nation

Postby Defwa » Sun May 10, 2015 12:16 pm

Blaccakre wrote:
Defwa wrote:My problem with the author and yourself is I keep hearing things that apply to humans.

Except, while a human can be a psychopath and blow up a school bus and be absolutely undetectable until the last moment, a flaw in computer programming is a little more easily hammered out.

Maybe we don't put psychopaths in charge of life and death situations either? I'm obviously not talking about replacing kill bots with psychopaths. But at least with the psychopath there's someone who can be brought to justice for the wrong they do.

And if it's so easy to find computer glitches before they cause problems, why are there still problems associated with computer glitches? Also, militaries do tend to vet their soldiers for psychological illness.

The man who perpetrated that shooting in that South Bigtopian mall last week wasn't in a position of power. He still took the power to determine life or death upon himself and used it to the tune of a dozen people.

And as I've said before, the fact that some nations are for some reason incapable of determining liability on their own is not a reason to ban autonomous weapons. But since that seems to be such an issue for you, I invite you to legislate on the matter.
In Defwa, should an autonomous weapons system take unexpected action with the effect of innocent death, a formal investigation is held. The manufacturers of the device, the creators of the program, the person who bought it, the team maintaining it, the group that provided the intelligence, and the people monitoring that fleet are all looked at. Any weapons running on the same software are shut down until the cause can be determined. Fines, arrests, and discharges are doled out as needed.

In Defwa though, the number of errors has only decreased in the past twenty years due to these advances. So when you start making this sound like bands of rogue androids laying waste to society, I don't know what the hell you're talking about.

And psychological evaluations still only go so far. Here is a report of a Lilliputian military base shooting inspired by religious fanaticism.
Last edited by Defwa on Sun May 10, 2015 12:19 pm, edited 1 time in total.
__________Federated City States of ____________________Defwa__________
Federation Head High Wizard of Dal Angela Landfree
Ambassadorial Delegate Maestre Wizard Mikyal la Vert

President and World Assembly Delegate of the Democratic Socialist Assembly
Defwa offers assistance with humanitarian aid, civilian evacuation, arbitration, negotiation, and human rights violation monitoring.

The United Federation of Galaxies
Secretary
 
Posts: 33
Founded: Apr 07, 2014
Ex-Nation

Postby The United Federation of Galaxies » Sun May 10, 2015 12:25 pm

Excidium Planetis wrote:
Terricon wrote:For, I wish to see all military and non military drones banned.


Do you realize this proposal would completely destroy the capability of thousands of nations to wage war? How many non-human civilizations would not be able to defend themselves under these rules? Not even the pony nations would be able to defend themselves, as they'd need a human's decision to fire a weapon! And that is only one example of the many types of countries left defenseless under these unnecessary regulations.


I have rewritten the proposal to allow SENTIENT beings to wage war. My proposal is directed towards so-called "dumb" robots, or non-sentient robots.
Bernie 2016!
Policy debate is love, policy debate is life.
Pro: Climate change action, 1 world government, progressive taxation, aggressive foreign policy, right to choose, income equality.
Anti: Koch brothers, Fox News, right wing nutheads, Ted Cruz, climate deniers, agricultural subsidies.

Separatist Peoples
GA Secretariat
 
Posts: 16989
Founded: Feb 17, 2011
Left-Leaning College State

Postby Separatist Peoples » Sun May 10, 2015 1:00 pm

The United Federation of Galaxies wrote:
Excidium Planetis wrote:
Do you realize this proposal would completely destroy the capability of thousands of nations to wage war? How many non-human civilizations would not be able to defend themselves under these rules? Not even the pony nations would be able to defend themselves, as they'd need a human's decision to fire a weapon! And that is only one example of the many types of countries left defenseless under these unnecessary regulations.


I have rewritten the proposal to allow SENTIENT beings to wage war. My proposal is directed towards so-called "dumb" robots, or non-sentient robots.

"That will address the worst of the claims of speciesism, certainly."

His Worshipfulness, the Most Unscrupulous, Plainly Deceitful, Dissembling, Strategically Calculating Lord GA Secretariat, Authority on All Existence, Arbiter of Right, Toxic Globalist Dog, Dark Psychic Vampire, and Chief Populist Elitist!
Separatist Peoples should RESIGN!

Flibbleites
Retired Moderator
 
Posts: 6569
Founded: Jan 02, 2004
Ex-Nation

Postby Flibbleites » Sun May 10, 2015 2:12 pm

Blaccakre wrote:
Separatist Peoples wrote:"The appeal to emotion is a time-honored but ineffective approach, ambassador. I thought you better than that. You've completely misconstrued autonomous devices and drones in one fell swoop. A machine that, upon finding a coding error, goes into deathmode would be as dangerous to its owners as to its enemies. No reasonable individual would program it thusly, or without an override or shutdown mode as an error protocol. Anything deliberately designed as such would fall afoul of Wartime Looting and Pillaging, and any violations are covered by a legal process. Care to try again with an informed argument, ambassador, or would you like to keep trying to play our heartstrings like an off-key bard?"

I'm not talking about what happens when the kill bot malfunctions, I'm talking about what happens when it does its job according to programming. Machines lack the capacity to make moral judgments; they just follow their programming. So if something fits all of its parameters as a threat, it's going to act on that regardless of whether a person would recognize that what looks like a threat really isn't. For example, a bus full of children hurtling through the DMZ might just be the product of an injured or confused driver. A machine programmed to destroy any vehicle moving quickly into the DMZ is not going to recognize that and is just going to blow shit up. By its nature, it paints with a broad brush: does this fit my programming as a threat? If so, destroy.
You wouldn't have to blow up the bus to stop it; for example, the machine could shoot the driver, take out the tires, take out the engine, etc. And it's also possible that said bus is in actuality a car bomb on its way to kill a bunch of people. The fact is that a machine is going to be making its judgements based on the intel it's provided. If you give it garbage intel, you're going to get garbage results. That's a basic computing adage: garbage in, garbage out.

Blaccakre wrote:This isn't an appeal to emotion, it's an appeal to common sense. We shouldn't let an robot decide who to kill because even the best programing isn't going to allow for the kind of complex judgements a moral actor can make.

I'm sorry, but you're using "Think of the children" as your argument, that's a textbook example of an appeal to emotion.

Bob Flibble
WA Representative

Excidium Planetis
Powerbroker
 
Posts: 8067
Founded: May 01, 2014
Ex-Nation

Postby Excidium Planetis » Sun May 10, 2015 5:32 pm

The United Federation of Galaxies wrote:
Excidium Planetis wrote:
Do you realize this proposal would completely destroy the capability of thousands of nations to wage war? How many non-human civilizations would not be able to defend themselves under these rules? Not even the pony nations would be able to defend themselves, as they'd need a human's decision to fire a weapon! And that is only one example of the many types of countries left defenseless under these unnecessary regulations.


I have rewritten the proposal to allow SENTIENT beings to wage war. My proposal is directed towards so-called "dumb" robots, or non-sentient robots.


That goes a long way toward fixing your proposal; however, there are still flaws that would prevent me from voting for it:
b. The lack of accountability if innocents are harmed or killed, and
c. The inability to feel empathy, compassion or mercy, or to factor those things into decision-making;
WHEREAS the potential harm posed by autonomous weapon systems to individuals and populations presents an unjustifiable risk to world-wide safety and security; and
WHEREAS developing autonomous weapon systems could spur an international arms race of unprecedented proportions, creating political turmoil and international strife;

The accountability is that of the programmers: if their weapons screw up, it is on them.

Humans can also be incapable of feeling emotion, but you do not object to them firing missiles. Additionally, mercy can be a bad thing in many cases (OOC: An American soldier had an opportunity to kill an Austrian Corporal who was limping off a battlefield in World War I. He showed the man mercy, and didn't pull the trigger. That man was Adolf Hitler, and was responsible for the deaths of millions of innocent lives).

The potential harm is less than giving guns to human operators capable of making mistakes.

It would not spur an arms race, since many nations (such as our own) already possess such weapons. It is tantamount to trying to ban assault rifles because they would spur an arms race.

Additionally, there is the problem I pointed out to you in my Telegram, that significant delays over the long distances of space would render most drone weapons obsolete, and we would have no choice but to resume manning our Bolt Class Starfighters, a move which would likely result in many more lives lost.
Current Ambassador: Adelia Meritt
Ex-Ambassador: Cornelia Schultz, author of GA#355 and GA#368.
#MakeLegislationFunnyAgain
Singaporean Transhumans wrote:You didn't know about Excidium? The greatest space nomads in the NS multiverse with a healthy dose (read: over 9000 percent) of realism?
Saveyou Island wrote:"Warmest welcomes to the Assembly, ambassador. You'll soon learn to hate everyone here."
Imperium Anglorum wrote:Digital Network Defence is pretty meh
Tier 9 nation, according to my index. Made of nomadic fleets.


News: AI wins Dawn Fleet election for High Counselor.

Blaccakre
Attaché
 
Posts: 87
Founded: Apr 14, 2015
Ex-Nation

Postby Blaccakre » Sun May 10, 2015 6:42 pm

Flibbleites wrote:You wouldn't have to blow up the bus to stop it; for example, the machine could shoot the driver, take out the tires, take out the engine, etc. And it's also possible that said bus is in actuality a car bomb on its way to kill a bunch of people. The fact is that a machine is going to be making its judgements based on the intel it's provided. If you give it garbage intel, you're going to get garbage results. That's a basic computing adage: garbage in, garbage out.

I think the new draft incorporates your concerns. I wouldn't have a problem with a purely defensive machine that autonomously shot out the tires or disabled the bus, but I have serious concerns over a machine deciding to kill folks. And if we have someone feeding the machine intel, why can't we also have that person making the ultimate decisions? Frankly, I'd be much happier with a machine feeding the person intel and the person making the decision than the other way around.

Flibbleites wrote:I'm sorry, but you're using "Think of the children" as your argument, that's a textbook example of an appeal to emotion.

I'm not using "think of the children" as my argument. My argument is we should not have machines making decisions about life and death, because they are not accountable and they lack moral perspective. My example involved a busload of children, but it could just as easily have been a busload of sheep, or a busload of banana cream pies, or a traveling band of young adults and their dog who solve mysteries. The point of my example is that an autonomous machine cannot be trusted with making the decision of whether to kill or injure people, for the reasons previously mentioned. I'm sorry you didn't like the example and felt that it was overly emotional, but that doesn't make my argument fallacious.

I do think the new draft addresses many of the concerns so far by significantly narrowing the focus of this law.
The Glorious, Unparalleled, Doubleplusgood Kingdom of Blaccakre
"There is no justice, only the Law."

Any effort by World Assembly Census experts to label our glorious nation as "corrupt," or to claim that we have "short average lifespans" and "ignorant citizens," shall be treated as belligerent propaganda and will result in severe reprisal.

The United Federation of Galaxies
Secretary
 
Posts: 33
Founded: Apr 07, 2014
Ex-Nation

Postby The United Federation of Galaxies » Sun May 10, 2015 7:05 pm

Main arguments.

Protection of civilians. Bearing in mind that most of today's armed conflicts are intra-state conflicts without clear boundaries between a variety of armed groups and civilians, it is questionable how a robot can be effectively programmed to avoid civilian casualties when humans themselves lack the ability to make such distinctions in these conflict settings and face difficulties in overcoming these dilemmas.

Proportionality. In certain situations, military attacks are not conducted because of the risk of causing disproportionately high civilian damage. It is doubtful that a robotic system is capable of making such judgments.

Accountability. With an autonomous weapon system, no individual human can be held accountable for the system's actions in an armed conflict. Instead, responsibility is distributed among a larger, possibly unidentifiable group of persons, perhaps including the programmer or the manufacturer of the robot.

Increasing the risk of war. Such technology increases the risk that states will be more likely to engage in armed conflicts because of the reduced possibility of military casualties. Fully autonomous weapons could lower the threshold of war.

Cool calculators or tools of repression? Supporters of fully autonomous weapons argue that these systems would help overcome human emotions such as panic, fear, or anger, which lead to misjudgment and incorrect choices in stressful situations. However, opponents of the development of these weapon systems point out that this so-called advantage can turn into a massive risk to people who live in repressive state systems. Fully autonomous weapons could be used to oppress opponents without fear of protest, conscientious objection, or insurgency within state security forces.

Proliferation. There are concerns that fully autonomous weapon systems could fall into the hands of unauthorized persons.
Bernie 2016!
Policy debate is love, policy debate is life.
Pro: Climate change action, 1 world government, progressive taxation, aggressive foreign policy, right to choose, income equality.
Anti: Koch brothers, Fox News, right wing nutheads, Ted Cruz, climate deniers, agricultural subsidies.

User avatar
The United Federation of Galaxies
Secretary
 
Posts: 33
Founded: Apr 07, 2014
Ex-Nation

Postby The United Federation of Galaxies » Sun May 10, 2015 7:10 pm

Excidium Planetis wrote:
The United Federation of Galaxies wrote:
I have rewritten the proposal to allow SENTIENT beings to wage war. My proposal is directed towards so called "dumb" robots, or non-sentient robots.


That goes a long way to fixing your proposal, however there are still flaws that would prevent me from voting for it:
b. The lack of accountability if innocents are harmed or killed, and
c. The inability to feel empathy, compassion or mercy, or to factor those things into decision-making;
WHEREAS the potential harm posed by autonomous weapon systems to individuals and populations presents an unjustifiable risk to world-wide safety and security; and
WHEREAS developing autonomous weapon systems could spur an international arms race of unprecedented proportions, creating political turmoil and international strife;

The accountability is the accountability of the programmers. If their weapons screwed up, it is on them.

Humans can also be incapable of feeling emotion, but you do not object to them firing missiles. Additionally, mercy can be a bad thing in many cases (OOC: An American soldier had an opportunity to kill an Austrian corporal who was limping off a battlefield in World War I. He showed the man mercy and didn't pull the trigger. That man was Adolf Hitler, who went on to be responsible for the deaths of millions of innocents).

The potential harm is less than giving guns to human operators capable of making mistakes.

It would not spur an arms race, since many nations (such as our own) already possess such weapons. It is tantamount to trying to ban assault rifles because they would spur an arms race.

Additionally, there is the problem I pointed out to you in my Telegram, that significant delays over the long distances of space would render most drone weapons obsolete, and we would have no choice but to resume manning our Bolt Class Starfighters, a move which would likely result in many more lives lost.



About accountability: programming is obviously no easy task, considering the billions or even trillions of lines of code needed for such a complex system. With so many people involved in the design and programming of an AWS, how would you enforce accountability when responsibility is spread across such a large group? By contrast, if a soldier goes on a rampage in a civilian population, the soldier alone is responsible.

True, humans and other sentient beings can sometimes fail to feel emotion, but my main point is that robots INHERENTLY CANNOT feel emotion.

I suppose I underestimated how many nations have these AWS.

About long-distance communication: shouldn't faster-than-light devices like the ansible mitigate that delay by bypassing the normal space-time continuum?
Bernie 2016!
Policy debate is love, policy debate is life.
Pro: Climate change action, 1 world government, progressive taxation, aggressive foreign policy, right to choose, income equality.
Anti: Koch brothers, Fox News, right wing nutheads, Ted Cruz, climate deniers, agricultural subsidies.

User avatar
Shazbotdom
Postmaster-General
 
Posts: 11127
Founded: Sep 28, 2004
Anarchy

Postby Shazbotdom » Sun May 10, 2015 7:21 pm

The United Federation of Galaxies wrote:
Excidium Planetis wrote:
That goes a long way to fixing your proposal, however there are still flaws that would prevent me from voting for it:

The accountability is the accountability of the programmers. If their weapons screwed up, it is on them.

Humans can also be incapable of feeling emotion, but you do not object to them firing missiles. Additionally, mercy can be a bad thing in many cases (OOC: An American soldier had an opportunity to kill an Austrian corporal who was limping off a battlefield in World War I. He showed the man mercy and didn't pull the trigger. That man was Adolf Hitler, who went on to be responsible for the deaths of millions of innocents).

The potential harm is less than giving guns to human operators capable of making mistakes.

It would not spur an arms race, since many nations (such as our own) already possess such weapons. It is tantamount to trying to ban assault rifles because they would spur an arms race.

Additionally, there is the problem I pointed out to you in my Telegram, that significant delays over the long distances of space would render most drone weapons obsolete, and we would have no choice but to resume manning our Bolt Class Starfighters, a move which would likely result in many more lives lost.



About accountability: programming is obviously no easy task, considering the billions or even trillions of lines of code needed for such a complex system. With so many people involved in the design and programming of an AWS, how would you enforce accountability when responsibility is spread across such a large group? By contrast, if a soldier goes on a rampage in a civilian population, the soldier alone is responsible.

True, humans and other sentient beings can sometimes fail to feel emotion, but my main point is that robots INHERENTLY CANNOT feel emotion.

I suppose I underestimated how many nations have these AWS.

About long-distance communication: shouldn't faster-than-light devices like the ansible mitigate that delay by bypassing the normal space-time continuum?


"Ambassador, programmers each have their own style in which they code. One need only figure out who wrote the block of code that caused the malfunction, then deal with them in whatever manner is appropriate, either through the Military Hierarchy or through whatever corporation was contracted to do the work. Logging who did which part of the programming also assists in determining who is responsible for a malfunction. These should all be basic principles that one learns as early as their Freshman year of Secondary School. I have also yet to see you prove to this August Body that this is a problem that needs to be addressed."
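(OOC: The ambassador's authorship-logging argument can be sketched in a few lines. This is purely a hypothetical illustration, not any real military system; the module paths and programmer names are invented. In practice the log could be populated from version-control metadata such as `git blame`.)

```python
# Hypothetical authorship log: each module of a weapon system's code is
# recorded against the programmer who wrote it (the entries here are
# invented for illustration).
authorship_log = {
    "targeting/identify_friend_or_foe.c": "Programmer A",
    "targeting/engagement_rules.c": "Programmer B",
    "nav/pathfinding.c": "Programmer C",
}

def responsible_for(malfunctioning_module: str) -> str:
    """Trace a malfunction back to the recorded author of that module."""
    try:
        return authorship_log[malfunctioning_module]
    except KeyError:
        # If no individual author was logged, accountability falls back
        # on the contracting organisation as a whole.
        return "contractor (no individual author logged)"

print(responsible_for("targeting/engagement_rules.c"))  # prints Programmer B
```

The design choice mirrors the ambassador's point: attribution only works if authorship is recorded at write time, which is why the fallback case (no logged author) collapses back into exactly the diffuse, group-level accountability the proposal's supporters object to.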
Image
Mr. Antuan D. Flaberghast
Shazbotdom Ambassador to the WA
Last edited by Shazbotdom on Sun May 10, 2015 7:23 pm, edited 2 times in total.
ShazWeb || IIWiki || Discord: shazbertbot || 1 x NFL Picks League Champion (2021)
CosmoCast || SISA || CCD || CrawDaddy || SCIA || COPEC || Boudreaux's || CLS || SNC || ShazAir || BHC || TWO
NHL: NYR 1 - 0 WSH | COL 0 - 1 WPG | VGK 0 - 0 DAL || NBA: NOLA (8) 0 - 1 OKC (1)
NCAA MBB: Tulane 22-18 | LSU 25-16 || NCAA WSB: LSU 35-10

User avatar
The Northern Kingdoms
Diplomat
 
Posts: 634
Founded: Jan 26, 2015
Ex-Nation

Postby The Northern Kingdoms » Sun May 10, 2015 7:24 pm

I support this, but only on fully autonomous weapons.

Semi-autonomous weapons are needed for the defense of the Northern Kingdoms.
The Northern Kingdoms
De Nordliga Riken
La Nordaj Regnoj

I use Monster Girl Encyclopedia (although set on modern time) as a medium for roleplay (my nation is not limited to it, though). I am an MT nation (set in today), with experimental and a few functioning PMT technology. My nation is when Sweden smokes much weed, takes much LSD, takes up more arms than normal, and dates a monster girl (mamono).

User avatar
The United Federation of Galaxies
Secretary
 
Posts: 33
Founded: Apr 07, 2014
Ex-Nation

Postby The United Federation of Galaxies » Sun May 10, 2015 7:27 pm

The Northern Kingdoms wrote:I support this, but only on fully autonomous weapons.

Semi-autonomous weapons are needed for the defense of the Northern Kingdoms.


I wholeheartedly agree with your right to semi-autonomous weapons as long as there is sentient oversight over those weapons. I'll probably revise the proposal soon to stress that.
Bernie 2016!
Policy debate is love, policy debate is life.
Pro: Climate change action, 1 world government, progressive taxation, aggressive foreign policy, right to choose, income equality.
Anti: Koch brothers, Fox News, right wing nutheads, Ted Cruz, climate deniers, agricultural subsidies.
