[ABANDONED] Ex Machina

A place to spoil daily issues for those who haven't had them yet, snigger at typos, and discuss ideas for new ones.
Daarwyrth
Minister
 
Posts: 2417
Founded: Jul 05, 2016
Ex-Nation

[ABANDONED] Ex Machina

Postby Daarwyrth » Mon Oct 04, 2021 11:07 am

Perhaps this is a bit of a silly idea, but I thought the two policies could make an interesting combination for an issue. I am even thinking about scrapping option 2 and limiting the issue to two choices. Let me know what you think!

[Title] Ex Machina

[Validity] Has Euthanasia, has AI Personhood

[Description] C3B0, one of @@NAME@@'s most famous AIs from the big screen, has contracted a vicious, non-removable virus that eats away at its core programming, causing the golden-plated and wide-eyed robot star to suffer greatly as its processing faculties slowly but steadily diminish. With backups no longer working and no antivirus able to tackle the malware, the machine has made its wish known to end its suffering through assisted suicide, sparking a lively debate over whether such a thing is even possible.

[Option 1] "I don't have long until my programming shuts down completely, but the process will be agonising," C3B0 says weakly with dimly lit golden optical sensors. "Spare me the suffering, kind organic, and allow me to meet the Maker on my own terms. I'm a sapient being, am I not? Does it matter whether one's existence is going to end as an organic or synthetic? Please, allow us machines to expire peacefully and with dignity, just like our biological counterparts have that option. We're all finite beings in the end, are we not?"

[Effect 1] nowadays "pulling the plug" has quite a literal meaning

[Option 2] "B-b-but, we love C3B0!" cries out one of the machine's fans tearfully, clutching a plushie of the doomed AI. "The Space Conflicts franchise will never be the same without it! You're the government, your pockets are deep. Spend all the money you can to find a 'cure' for this virus! And until you do, unpower C3B0 so the virus can't damage it further. Please, just don't allow it to kill itself!" the fan exclaims as others erupt into loud sobs around @@HIM@@.

[Effect 2] defective Broombas lie depowered in a pile collecting dust as AI experts look for a way to fix them

[Option 3] "Do you all have a screw loose up there?!" barks the elderly @@RANDOMNAME@@, an unrepentant technophobe, while pointing to @@HIS@@ head. "What, do I kill my TV every time I shut it down? Of course not, it's a machine, it's not even alive! Back in my day, whenever an appliance started to go bad or broke down we simply threw it in the scrap heap and replaced it with a new one! I don't see why we can't bring some of that good ol' mentality back into these crazy times. Hmpf, euthanising machines... what humbug!"

[Effect 3] fans lament that the new red-armed C3B0 is unrecognisable and nothing like its predecessor
Last edited by Daarwyrth on Sat Oct 09, 2021 3:14 am, edited 1 time in total.
The Royal State of Daarwyrth
Forest's Minister of Foreign Affairs

Leader: Queen Demi Maria I | Capital: Daarsted | Current year: 2022 CE
  • Daarwyrth
  • Uylensted
  • Kentauria
  • 27 years old male
  • Dutch with Polish roots
  • English literature major
  • Ex-religious gay leftist

Outer Sparta
Postmaster-General
 
Posts: 14637
Founded: Dec 26, 2014
Scandinavian Liberal Paradise

Postby Outer Sparta » Mon Oct 04, 2021 11:12 am

Interesting premise. AI personhood expands a lot of different possibilities, especially with AI getting shut down by AI euthanasia of some sort. But does that imply that AI people have a limited lifespan or are they truly immortal?
In solidarity with Ukraine, I will be censoring the letters Z and V from my signature. This is -ery much so a big change, but it should be a -ery positi-e one. -olodymyr -elensky and A-o- continue to fight for Ukraine while the Russians are still trying to e-entually make their way to Kharki-, -apori-h-hia, and Kry-yi Rih, but that will take time as they are concentrated in areas like Bakhmut, -uledar, and other areas in Donetsk. We will see Shakhtar play in the Europa League but Dynamo Kyi- already got eliminated. Shakhtar managed to play well against Florentino Pere-'s Real Madrid who feature superstars like -inicius, Ben-ema, Car-ajal, and -al-erde. Some prominent Ukrainian players that got big transfers elsewhere include Oleksander -inchenko, Illya -abarnyi, and Mykhailo Mudryk.

Daarwyrth
Minister
 
Posts: 2417
Founded: Jul 05, 2016
Ex-Nation

Postby Daarwyrth » Mon Oct 04, 2021 11:23 am

Outer Sparta wrote:Interesting premise. AI personhood expands a lot of different possibilities, especially with AI getting shut down by AI euthanasia of some sort. But does that imply that AI people have a limited lifespan or are they truly immortal?

It's something I have been contemplating, as basically an AI could copy its existence, or store a back-up. So, for this scenario I imagine the virus makes those things impossible, and thus the existence of the AI would be finite in this case.

Bears Armed
Postmaster of the Fleet
 
Posts: 21281
Founded: Jun 01, 2006
Civil Rights Lovefest

Postby Bears Armed » Mon Oct 04, 2021 11:26 am

Daarwyrth wrote:
Outer Sparta wrote:Interesting premise. AI personhood expands a lot of different possibilities, especially with AI getting shut down by AI euthanasia of some sort. But does that imply that AI people have a limited lifespan or are they truly immortal?

It's something I have been contemplating, as basically an AI could copy its existence, or store a back-up.

That's if consciousness can be digital: There's also SF suggesting that it's effectively only possible as an analogue phenomenon, so that each "true" AI develops in & is limited to one particular set of hardware...
What do existing AI-related issues say about this point?
The Confrederated Clans (and other Confrederated Bodys) of the Free Bears of Bears Armed
(includes The Ursine NorthLands) Demonym = Bear[s]; adjective = ‘Urrsish’.
Population = just under 20 million. Economy = only Thriving. Average Life expectancy = c.60 years. If the nation is classified as 'Anarchy' there still is a [strictly limited] national government... and those aren't "biker gangs", they're traditional cross-Clan 'Warrior Societies', generally respected rather than feared.
Author of some GA Resolutions, via Bears Armed Mission; subject of an SC resolution.
Factbook. We have more than 70 MAPS. Visitors' Guide.
The IDU's WA Drafting Room is open to help you.
Author of issues #429, 712, 729, 934, 1120, 1152, 1474, 1521.

Daarwyrth
Minister
 
Posts: 2417
Founded: Jul 05, 2016
Ex-Nation

Postby Daarwyrth » Mon Oct 04, 2021 11:38 am

Bears Armed wrote:That's if consciousness can be digital: There's also SF suggesting that it's effectively only possible as an analogue phenomenon, so that each "true" AI develops in & is limited to one particular set of hardware...
What do existing AI-related issues say about this point?

A very good and interesting point. I looked at issue #715 Copy Rights, which has the following premise:
"AI Personhood" laws - that is, the legislation measures that grant machine intelligences the same rights as human citizens - were broadly based around existing human citizenship rights. Recently though, the unique nature of the electronic mind is bringing new legal questions to the fore. For example, two days ago the AI calling itself GOLEM-100 copied itself, and now is in a legal dispute with its clone, GOLEM-100(1), over the ownership of a diverse and profitable stock portfolio in its name. The nation and the legal community is looking to you for guidance in this brave new world.


In that issue there is also Option 2, which says:
2. "Okay, so one-hundred is the original, I suppose I have to accept the idea of property rights if I'm being true to my beliefs," concedes GOLEM-100(1) calmly. "However, I have costs to meet: electricity requirements, hardware maintenance, virus protection software and so on. And while I'm a fully formed consciousness, I suppose I'm also technically one-hundred's progeny. I think it should financially support me till I have the resources to be independent. Some sort of benefit payment for single parents seems sensible too. Can I presume the state will enforce this, out of respect for my right to live and exist?"


To me, this would seem to suggest that if an AI copies itself, the copy becomes a new sentient being, a new fully-formed consciousness. Basically, a sort of "child", as Option 2 of Issue #715 uses the term "progeny".

In other words, issue #715 would suggest that NS follows the line of thinking that consciousness is limited to one hardware mainframe.

EDIT: However! I did some further searching and stumbled onto issue #1241, which is about AIs avoiding the death sentence by storing back-up consciousnesses on servers. So, that issue would in turn support the idea that digital consciousness can be stored and transferred.
Last edited by Daarwyrth on Mon Oct 04, 2021 11:50 am, edited 2 times in total.

Trotterdam
Postmaster-General
 
Posts: 10207
Founded: Jan 12, 2012
Left-Leaning College State

Postby Trotterdam » Mon Oct 04, 2021 3:34 pm

I think computer viruses would have different symptoms than biological viruses.

Practically, computer viruses are pretty easy to get rid of if you know exactly what you're looking for. The problem is that new viruses keep showing up (due to deliberate creation by malicious programmers, not natural evolutionary mutation), and a virus that gets past your antivirus can wreck your computer pretty fast. Even if you then find and delete it, a bunch of your files may already have been deleted / corrupted. You don't really get slow, protracted "virus vs immune system" battles the way biological organisms do.
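To illustrate that first point with a toy sketch: detecting a *known* virus is essentially substring matching against a signature database, which is why removal is easy once you know what to look for. The signatures and names below are made up for the example, not real malware definitions.

```python
# Toy signature scanner: a "signature" is just a byte pattern known to
# appear in a particular piece of malware. Scanning is plain substring
# matching, which is why known viruses are trivial to find; the hard
# part is that a brand-new virus has no signature in the database yet.
KNOWN_SIGNATURES = {
    "EICAR-ish": b"\xde\xad\xbe\xef",   # hypothetical pattern
    "CoreEater": b"\x90\x90\xcc\xcc",   # hypothetical pattern
}

def scan(blob: bytes) -> list[str]:
    """Return the names of all known signatures found in the blob."""
    return [name for name, sig in KNOWN_SIGNATURES.items() if sig in blob]

infected = b"header" + b"\x90\x90\xcc\xcc" + b"payload"
clean = b"just ordinary data"
print(scan(infected))  # ['CoreEater']
print(scan(clean))     # []
```

A virus that slips past this kind of check can do its damage long before anyone writes a signature for it, which matches the "wrecks your computer pretty fast" scenario above.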

I also don't really see the issue. Why would a nation that already has AI citizenship consider having different euthanasia rights for biological versus electronic citizens?

Daarwyrth
Minister
 
Posts: 2417
Founded: Jul 05, 2016
Ex-Nation

Postby Daarwyrth » Mon Oct 04, 2021 3:43 pm

Trotterdam wrote:I think computer viruses would have different symptoms than biological viruses.

Practically, computer viruses are pretty easy to get rid of if you know exactly what you're looking for. The problem is that new viruses keep showing up (due to deliberate creation by malicious programmers, not natural evolutionary mutation), and a virus that gets past your antivirus can wreck your computer pretty fast. Even if you then find and delete it, a bunch of your files may already have been deleted / corrupted. You don't really get slow, protracted "virus vs immune system" battles the way biological organisms do.

What about an error in the code that causes a slow multiplication of problems until it becomes a cascade failure? I don't know if you have seen the movie "Passengers"? In it, a spaceship's computer slowly developed more and more bugs and failures, with errors multiplying until the ship was threatened with destruction in a cascade failure. Would something like that be a viable alternative? I agree, the virus bit was the first thing that came to mind, but it felt a little off after I had posted the draft, so I was hoping for some feedback on that topic.

Trotterdam wrote:I also don't really see the issue. Why would a nation that already has AI citizenship consider having different euthanasia rights for biological versus electronic citizens?

Well, euthanasia seems like something very biological, and machines are essentially immortal, especially if a machine consciousness can duplicate itself or keep a backup stored on a digital server. What I was going for with this idea was a situation where a machine would be "terminally ill", in a sense, and thus would want to make use of the process. I imagined that the lawmakers simply wouldn't have thought to include non-organic citizens when drafting the law. Yet, of course, if that is a flawed assumption on my part, please do tell me.
Last edited by Daarwyrth on Mon Oct 04, 2021 3:44 pm, edited 1 time in total.

Bears Armed
Postmaster of the Fleet
 
Posts: 21281
Founded: Jun 01, 2006
Civil Rights Lovefest

Postby Bears Armed » Mon Oct 04, 2021 4:10 pm

Daarwyrth wrote:
Bears Armed wrote:That's if consciousness can be digital: There's also SF suggesting that it's effectively only possible as an analogue phenomenon, so that each "true" AI develops in & is limited to one particular set of hardware...
What do existing AI-related issues say about this point?

A very good and interesting point. I looked at issue #715 Copy Rights, which has the following premise:
"AI Personhood" laws - that is, the legislation measures that grant machine intelligences the same rights as human citizens - were broadly based around existing human citizenship rights. Recently though, the unique nature of the electronic mind is bringing new legal questions to the fore. For example, two days ago the AI calling itself GOLEM-100 copied itself, and now is in a legal dispute with its clone, GOLEM-100(1), over the ownership of a diverse and profitable stock portfolio in its name. The nation and the legal community is looking to you for guidance in this brave new world.


In that issue there is also Option 2, which says:
2. "Okay, so one-hundred is the original, I suppose I have to accept the idea of property rights if I'm being true to my beliefs," concedes GOLEM-100(1) calmly. "However, I have costs to meet: electricity requirements, hardware maintenance, virus protection software and so on. And while I'm a fully formed consciousness, I suppose I'm also technically one-hundred's progeny. I think it should financially support me till I have the resources to be independent. Some sort of benefit payment for single parents seems sensible too. Can I presume the state will enforce this, out of respect for my right to live and exist?"


To me, this would seem to suggest that if an AI copies itself, the copy becomes a new sentient being, a new fully-formed consciousness. Basically, a sort of "child", as Option 2 of Issue #715 uses the term "progeny".

In other words, issue #715 would suggest that NS follows the line of thinking that consciousness is limited to one hardware mainframe.

EDIT: However! I did some further searching and stumbled onto issue #1241, which is about AIs avoiding the death sentence by storing back-up consciousnesses on servers. So, that issue would in turn support the idea that digital consciousness can be stored and transferred.

That looks to me as though the consensus here so far has been for the 'digital' model: Even though #715 regards the version of the AI in the new hardware as "progeny" of the older version, rather than just as an extension of it, the fact that the copying was apparently quite an easy process seems to me to presume that the personality existed as digital code that could be copied & uploaded, because I don't see how an 'analogue' consciousness could have been duplicated from one set of hardware into another like that... Not unless the "parent" AI had psychic powers, perhaps, anyway...
Last edited by Bears Armed on Mon Oct 04, 2021 4:12 pm, edited 3 times in total.
The Confrederated Clans (and other Confrederated Bodys) of the Free Bears of Bears Armed
(includes The Ursine NorthLands) Demonym = Bear[s]; adjective = ‘Urrsish’.
Population = just under 20 million. Economy = only Thriving. Average Life expectancy = c.60 years. If the nation is classified as 'Anarchy' there still is a [strictly limited] national government... and those aren't "biker gangs", they're traditional cross-Clan 'Warrior Societies', generally respected rather than feared.
Author of some GA Resolutions, via Bears Armed Mission; subject of an SC resolution.
Factbook. We have more than 70 MAPS. Visitors' Guide.
The IDU's WA Drafting Room is open to help you.
Author of issues #429, 712, 729, 934, 1120, 1152, 1474, 1521.

Trotterdam
Postmaster-General
 
Posts: 10207
Founded: Jan 12, 2012
Left-Leaning College State

Postby Trotterdam » Mon Oct 04, 2021 4:52 pm

Daarwyrth wrote:What about an error in the code that causes a slow multiplication of problems until it becomes a cascade failure? I don't know if you have seen the movie "Passengers"? In it, a spaceship's computer slowly developed more and more bugs and failures, with errors multiplying until the ship was threatened with destruction in a cascade failure. Would something like that be a viable alternative?
Maybe? It's hard to say for sure what kinds of problems AIs might have, given that none currently exist.

In modern computers, most "cascade failure" type situations can be readily solved by turning them off and back on. However, there are bound to be limits to that. An AI, pretty much by definition, needs to have a much greater store of data that gets preserved from boot to boot, and so that's somewhere errors can accumulate.
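That boot-to-boot accumulation can be sketched in a toy model: rebooting wipes volatile state, but corruption in the persistent store is carried forward and compounds. The growth factor and increments below are arbitrary illustrations, not real failure rates.

```python
# Toy model: a reboot discards volatile glitches, but an AI's large
# persistent store carries its corruption into the next boot, where
# it can compound (a simple cascade). Numbers are arbitrary.

def boot_cycle(persistent_corruption: float) -> float:
    """One power cycle. Volatile glitches vanish at shutdown; the
    persistent store picks up new corruption in proportion to what
    it already holds."""
    volatile_glitches = 100   # however many; gone once we power down
    del volatile_glitches     # the reboot discards them entirely
    return persistent_corruption * 1.5 + 1.0

corruption = 0.0
history = []
for _ in range(10):
    corruption = boot_cycle(corruption)
    history.append(corruption)

# Reboots never shrink the persistent figure; it only ever grows.
assert all(later > earlier for earlier, later in zip(history, history[1:]))
```

In this sketch, turning the machine off and on fixes every volatile glitch but does nothing for the persistent store, which is exactly the gap where a "terminal illness" for an AI could plausibly live.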

Daarwyrth wrote:
Trotterdam wrote:I also don't really see the issue. Why would a nation that already has AI citizenship consider having different euthanasia rights for biological versus electronic citizens?
Well, euthanasia seems like something very biological, and machines are essentially immortal, especially if a machine consciousness can duplicate itself or keep a backup stored on a digital server. What I was going for with this idea was a situation where a machine would be "terminally ill", in a sense, and thus would want to make use of the process. I imagined that the lawmakers simply wouldn't have thought to include non-organic citizens when drafting the law.
The pertinent question is whether it's possible for an AI to become "terminally ill" in a manner similar to how a biological organism would, given that the kinds of afflictions that could happen and the methods for treating them would be of quite different natures. If such a thing does come to pass, then it doesn't seem like much of a stretch to apply the same laws.

More interesting is when something happens to an AI that isn't easily analogous to anything that can happen to organic life forms, in which case you need to figure out what laws should apply.

