"Not once did my resolutions on war attempt to regulate types of weapons or what amounts to mistakes. I sought to regulate deliberate targeting of those who should not be targeted. There is a large difference between the two."
by Separatist Peoples » Sun May 10, 2015 5:09 am
by Athretvari » Sun May 10, 2015 7:26 am
by Senkaku » Sun May 10, 2015 8:25 am
by Excidium Planetis » Sun May 10, 2015 9:11 am
Terricon wrote:For, I wish to see all military and non-military drones banned.
Singaporean Transhumans wrote:You didn't know about Excidium? The greatest space nomads in the NS multiverse with a healthy dose (read: over 9000 percent) of realism?
Saveyou Island wrote:"Warmest welcomes to the Assembly, ambassador. You'll soon learn to hate everyone here."
Imperium Anglorum wrote:Digital Network Defence is pretty meh
News: AI wins Dawn Fleet election for High Counselor.
by Dooom35796821595 » Sun May 10, 2015 10:56 am
by Separatist Peoples » Sun May 10, 2015 11:15 am
Dooom35796821595 wrote:There's a lot of "whereas" in this resolution, and the only actionable clause is prohibiting the use and development of autonomous weapons, without defining what it would and would not cover or what an autonomous weapon is. Not to mention that the mention of humans in such a context suggests that any non-human would need human authorisation to carry out any level of military operations.
Not to mention the fact that there are lots of scenarios where a cold, calculating machine guided only by simple programming can be more advantageous than the hesitations and uncertainties of higher-thinking beings, like an automated turret that targets armed hostiles attempting to cross a DMZ, or point guards in urban warfare.
by Blaccakre » Sun May 10, 2015 11:40 am
Separatist Peoples wrote:Dooom35796821595 wrote:There's a lot of "whereas" in this resolution, and the only actionable clause is prohibiting the use and development of autonomous weapons, without defining what it would and would not cover or what an autonomous weapon is. Not to mention that the mention of humans in such a context suggests that any non-human would need human authorisation to carry out any level of military operations.
Not to mention the fact that there are lots of scenarios where a cold, calculating machine guided only by simple programming can be more advantageous than the hesitations and uncertainties of higher-thinking beings, like an automated turret that targets armed hostiles attempting to cross a DMZ, or point guards in urban warfare.
"Indeed. Machines are also immune to the high-stress mental breakdowns that cause most serious war crimes (OOC: like the My Lai massacre) and that non-machines are often susceptible to. And we sure wouldn't want to lose our chemical weapons disposal droids, what with their ability to decide the best time to detonate disposal ordnance based on immediate and projected micro-meteorological data collection. As the explosive disposal charges are the same compounds we use in some conventional weapons, we'd be forced to risk lives to achieve the same ends."
by United Soviet Soc American Republics » Sun May 10, 2015 11:44 am
by Separatist Peoples » Sun May 10, 2015 11:53 am
Blaccakre wrote:Yes, murderous death machines are all well and good until that unflinching kill bot liquidates a bus full of children that fits its "threat" parameters. But it's all good, because at least we'll know the machine was doing what it was designed to do and not having a "high-stress mental breakdown."
by Defwa » Sun May 10, 2015 12:00 pm
Blaccakre wrote:Yes, murderous death machines are all well and good until that unflinching kill bot liquidates a bus full of children that fits its "threat" parameters. But it's all good, because at least we'll know the machine was doing what it was designed to do and not having a "high-stress mental breakdown."
by Blaccakre » Sun May 10, 2015 12:01 pm
Separatist Peoples wrote:Blaccakre wrote:Yes, murderous death machines are all well and good until that unflinching kill bot liquidates a bus full of children that fits its "threat" parameters. But it's all good, because at least we'll know the machine was doing what it was designed to do and not having a "high-stress mental breakdown."
"The appeal to emotion is a time-honored, but ineffective, approach, ambassador. I thought you better than that. You've completely misconstrued autonomous devices and drones in one fell swoop. A machine that, upon finding a coding error, goes into death mode would be as dangerous to its owners as to its enemies. No reasonable individual would program it thusly, or without an override or shutdown mode as an error protocol. Anything deliberately designed as such would fall afoul of Wartime Looting and Pillaging, and any violations are covered by a legal process. Care to try again with an informed argument, ambassador, or would you like to keep trying to play our heartstrings like an off-key bard?"
by Blaccakre » Sun May 10, 2015 12:04 pm
Defwa wrote:Blaccakre wrote:Yes, murderous death machines are all well and good until that unflinching kill bot liquidates a bus full of children that fits its "threat" parameters. But it's all good, because at least we'll know the machine was doing what it was designed to do and not having a "high-stress mental breakdown."
My problem with the author and yourself is that I keep hearing objections that apply equally to humans.
Except, while a human can be a psychopath, blow up a school bus, and be absolutely undetectable until the last moment, a flaw in computer programming is a little more easily hammered out.
by Separatist Peoples » Sun May 10, 2015 12:13 pm
Blaccakre wrote:I'm not talking about what happens when the kill bot malfunctions; I'm talking about what happens when it does its job according to programming. Machines lack the capacity to make moral judgments; they just follow their programming. So if something fits all of its parameters as a threat, it's going to act on that regardless of whether a person would recognize that what looks like a threat really isn't. For example, a bus full of children hurtling through the DMZ might just be the product of an injured or confused driver. A machine programmed to destroy any vehicle moving quickly into the DMZ is not going to recognize that and is just going to blow shit up. By its nature, it paints with a broad brush: does this fit my programming as a threat? If so, destroy.
This isn't an appeal to emotion; it's an appeal to common sense. We shouldn't let a robot decide who to kill, because even the best programming isn't going to allow for the kind of complex judgments a moral actor can make.
Blaccakre wrote:Maybe we don't put psychopaths in charge of life-and-death situations either?
I'm obviously not talking about replacing kill bots with psychopaths. But at least with the psychopath there's someone who can be held accountable to justice for the wrong they do.
And if it's so easy to find computer glitches before they cause problems, why are there still problems associated with computer glitches? Also, militaries do tend to vet their soldiers for psychological illness.
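(OOC: the brittleness being argued over here is easy to demonstrate. Below is a minimal Python sketch of the kind of parameter-matching filter Blaccakre describes; the rule, field names, and thresholds are all invented for illustration and are not anyone's actual targeting logic.)

```python
# A deliberately naive rule-based "threat" filter. It matches surface
# parameters only, so a school bus speeding across the line (confused or
# injured driver) is classified exactly like a hostile vehicle with the
# same observable parameters.

def is_threat(vehicle: dict) -> bool:
    """Flag any fast-moving vehicle inside the exclusion zone as hostile."""
    return vehicle["speed_kph"] > 60 and vehicle["in_dmz"]

hostile_truck = {"speed_kph": 80, "in_dmz": True}
school_bus = {"speed_kph": 75, "in_dmz": True}  # same parameters, innocent cause

print(is_threat(hostile_truck))  # True — the intended behaviour
print(is_threat(school_bus))     # True — the failure mode under debate
```

The rule is correct by its own lights in both cases; the disagreement in the thread is over whether any fixed parameter set can capture the judgment a moral actor would make.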
by Defwa » Sun May 10, 2015 12:16 pm
Blaccakre wrote:Defwa wrote:My problem with the author and yourself is that I keep hearing objections that apply equally to humans.
Except, while a human can be a psychopath, blow up a school bus, and be absolutely undetectable until the last moment, a flaw in computer programming is a little more easily hammered out.
Maybe we don't put psychopaths in charge of life-and-death situations either? I'm obviously not talking about replacing kill bots with psychopaths. But at least with the psychopath there's someone who can be held accountable to justice for the wrong they do.
And if it's so easy to find computer glitches before they cause problems, why are there still problems associated with computer glitches? Also, militaries do tend to vet their soldiers for psychological illness.
by The United Federation of Galaxies » Sun May 10, 2015 12:25 pm
Excidium Planetis wrote:Terricon wrote:For, I wish to see all military and non-military drones banned.
Do you realize this proposal would completely destroy the ability of thousands of nations to wage war? How many non-human civilizations would be unable to defend themselves under these rules? Not even the pony nations would be able to defend themselves, as they'd need a human's decision to fire a weapon! And that is only one example of the many types of countries left defenseless under these unnecessary regulations.
by Separatist Peoples » Sun May 10, 2015 1:00 pm
The United Federation of Galaxies wrote:Excidium Planetis wrote:
Do you realize this proposal would completely destroy the ability of thousands of nations to wage war? How many non-human civilizations would be unable to defend themselves under these rules? Not even the pony nations would be able to defend themselves, as they'd need a human's decision to fire a weapon! And that is only one example of the many types of countries left defenseless under these unnecessary regulations.
I have rewritten the proposal to allow SENTIENT beings to wage war. My proposal is directed towards so-called "dumb," non-sentient robots.
by Flibbleites » Sun May 10, 2015 2:12 pm
Blaccakre wrote:Separatist Peoples wrote:"The appeal to emotion is a time-honored, but ineffective, approach, ambassador. I thought you better than that. You've completely misconstrued autonomous devices and drones in one fell swoop. A machine that, upon finding a coding error, goes into death mode would be as dangerous to its owners as to its enemies. No reasonable individual would program it thusly, or without an override or shutdown mode as an error protocol. Anything deliberately designed as such would fall afoul of Wartime Looting and Pillaging, and any violations are covered by a legal process. Care to try again with an informed argument, ambassador, or would you like to keep trying to play our heartstrings like an off-key bard?"
I'm not talking about what happens when the kill bot malfunctions; I'm talking about what happens when it does its job according to programming. Machines lack the capacity to make moral judgments; they just follow their programming. So if something fits all of its parameters as a threat, it's going to act on that regardless of whether a person would recognize that what looks like a threat really isn't. For example, a bus full of children hurtling through the DMZ might just be the product of an injured or confused driver. A machine programmed to destroy any vehicle moving quickly into the DMZ is not going to recognize that and is just going to blow shit up. By its nature, it paints with a broad brush: does this fit my programming as a threat? If so, destroy.
You wouldn't have to blow up the bus to stop it; for example, the machine could shoot the driver, take out the tires, take out the engine, etc. And it's also possible that said bus is in actuality a car bomb on its way to kill a bunch of people. The fact is that a machine is going to be making its judgments based on the intel it's provided. If you give it garbage intel, you're going to get garbage results. That's a basic computing adage: garbage in, garbage out.
Blaccakre wrote:This isn't an appeal to emotion; it's an appeal to common sense. We shouldn't let a robot decide who to kill, because even the best programming isn't going to allow for the kind of complex judgments a moral actor can make.
I'm sorry, but you're using "Think of the children" as your argument; that's a textbook example of an appeal to emotion.
by Excidium Planetis » Sun May 10, 2015 5:32 pm
The United Federation of Galaxies wrote:Excidium Planetis wrote:
Do you realize this proposal would completely destroy the ability of thousands of nations to wage war? How many non-human civilizations would be unable to defend themselves under these rules? Not even the pony nations would be able to defend themselves, as they'd need a human's decision to fire a weapon! And that is only one example of the many types of countries left defenseless under these unnecessary regulations.
I have rewritten the proposal to allow SENTIENT beings to wage war. My proposal is directed towards so-called "dumb," non-sentient robots.
b. The lack of accountability if innocents are harmed or killed, and
c. The inability to feel empathy, compassion or mercy, or to factor those things into decision-making;
WHEREAS the potential harm posed by autonomous weapon systems to individuals and populations presents an unjustifiable risk to world-wide safety and security; and
WHEREAS developing autonomous weapon systems could spur an international arms race of unprecedented proportions, creating political turmoil and international strife;
by Blaccakre » Sun May 10, 2015 6:42 pm
Flibbleites wrote:You wouldn't have to blow up the bus to stop it; for example, the machine could shoot the driver, take out the tires, take out the engine, etc. And it's also possible that said bus is in actuality a car bomb on its way to kill a bunch of people. The fact is that a machine is going to be making its judgments based on the intel it's provided. If you give it garbage intel, you're going to get garbage results. That's a basic computing adage: garbage in, garbage out.
Flibbleites wrote:I'm sorry, but you're using "Think of the children" as your argument; that's a textbook example of an appeal to emotion.
by The United Federation of Galaxies » Sun May 10, 2015 7:05 pm
by The United Federation of Galaxies » Sun May 10, 2015 7:10 pm
Excidium Planetis wrote:The United Federation of Galaxies wrote:
I have rewritten the proposal to allow SENTIENT beings to wage war. My proposal is directed towards so-called "dumb," non-sentient robots.
That goes a long way towards fixing your proposal; however, there are still flaws that would prevent me from voting for it:
b. The lack of accountability if innocents are harmed or killed, and
c. The inability to feel empathy, compassion or mercy, or to factor those things into decision-making;
WHEREAS the potential harm posed by autonomous weapon systems to individuals and populations presents an unjustifiable risk to world-wide safety and security; and
WHEREAS developing autonomous weapon systems could spur an international arms race of unprecedented proportions, creating political turmoil and international strife;
The accountability lies with the programmers. If their weapons screw up, it is on them.
Humans can also be incapable of feeling emotion, but you do not object to them firing missiles. Additionally, mercy can be a bad thing in many cases (OOC: A British soldier reportedly had an opportunity to kill an Austrian corporal who was limping off a battlefield in World War I. He showed the man mercy and didn't pull the trigger. That man was Adolf Hitler, who went on to be responsible for the deaths of millions of innocents).
The potential harm is less than giving guns to human operators capable of making mistakes.
It would not spur an arms race, since many nations (such as our own) already possess such weapons. It is tantamount to trying to ban assault rifles because they would spur an arms race.
Additionally, there is the problem I pointed out to you in my Telegram, that significant delays over the long distances of space would render most drone weapons obsolete, and we would have no choice but to resume manning our Bolt Class Starfighters, a move which would likely result in many more lives lost.
by Shazbotdom » Sun May 10, 2015 7:21 pm
The United Federation of Galaxies wrote:Excidium Planetis wrote:
That goes a long way towards fixing your proposal; however, there are still flaws that would prevent me from voting for it:
The accountability lies with the programmers. If their weapons screw up, it is on them.
Humans can also be incapable of feeling emotion, but you do not object to them firing missiles. Additionally, mercy can be a bad thing in many cases (OOC: A British soldier reportedly had an opportunity to kill an Austrian corporal who was limping off a battlefield in World War I. He showed the man mercy and didn't pull the trigger. That man was Adolf Hitler, who went on to be responsible for the deaths of millions of innocents).
The potential harm is less than giving guns to human operators capable of making mistakes.
It would not spur an arms race, since many nations (such as our own) already possess such weapons. It is tantamount to trying to ban assault rifles because they would spur an arms race.
Additionally, there is the problem I pointed out to you in my Telegram, that significant delays over the long distances of space would render most drone weapons obsolete, and we would have no choice but to resume manning our Bolt Class Starfighters, a move which would likely result in many more lives lost.
About accountability: programming is obviously no easy task, considering the billions or even trillions of lines of code needed for such a complex system. With so many people involved in the design and programming of an AWS, how would you enforce accountability when responsibility is spread over such a large group? If a soldier goes on a rampage in a civilian population, by contrast, the soldier alone is responsible.
True, humans and other sentient beings can sometimes not feel emotion but my main point is that robots INHERENTLY CANNOT feel emotion.
I suppose I underestimated how many nations have these AWS.
About the long-distance communication: shouldn't faster-than-light devices like the ansible mitigate that by bypassing the normal space-time continuum?
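(OOC: the signal-lag problem raised here is straightforward to quantify without any lore. A minimal Python sketch using standard physical constants; the function name is illustrative only.)

```python
# Round-trip command delay for a remotely operated drone at light speed.
# Even at one astronomical unit (roughly Earth-Sun distance), an operator's
# command-and-response loop takes over a quarter of an hour, which is why
# either on-board autonomy or FTL links like the ansible come up.

C = 299_792_458        # speed of light, m/s
AU = 149_597_870_700   # one astronomical unit, m

def round_trip_minutes(distance_m: float) -> float:
    """Round-trip signal delay in minutes at light speed."""
    return 2 * distance_m / C / 60

delay = round_trip_minutes(AU)
print(f"{delay:.1f} minutes")  # prints "16.6 minutes"
```

At interstellar distances the delay grows to years, so without FTL communication a remote operator is effectively out of the loop entirely.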
by The Northern Kingdoms » Sun May 10, 2015 7:24 pm
by The United Federation of Galaxies » Sun May 10, 2015 7:27 pm
The Northern Kingdoms wrote:I support this, but only for fully autonomous weapons.
Semi-autonomous weapons are needed for the defense of the Northern Kingdoms.