Would an AI have a right to live?

For discussion and debate about anything. (Not a roleplay related forum; out-of-character commentary only.)


Tlaceceyaya
Powerbroker
Posts: 9932
Founded: Oct 17, 2011
Left-wing Utopia

Postby Tlaceceyaya » Mon Apr 08, 2013 4:57 pm

Saubre wrote:Unless an AI suddenly feels emotions without outside influence or asks questions about its existence, then one could argue for its sentience.

Though, if the AI does manage to show emotion of any sort, then it's most likely because that emotion was given artificially.

I can see no possibility of a machine of any sort one day suddenly gaining sentience by itself.

Only humans can have emotions (don't be a smart ass and say animals have emotions too; what I'm implying should be clear), hence emotions are an alien concept to AI unless programmed or given by their creators.

Therefore, they can't achieve true sentience, since most likely their very emotions are fake.

You act as though there is a fundamental difference between programming a robot to feel emotions and a human feeling emotions.
Last edited by Tlaceceyaya on Mon Apr 08, 2013 4:57 pm, edited 1 time in total.
Economic Left/Right -9.75, Social Libertarian/Authoritarian -8.87
Also, Bonobos.
I am a market socialist, atheist, more to come maybe at some point
Dimitri Tsafendas wrote:You are guilty not only when you commit a crime, but also when you do nothing to prevent it when you have the chance.

Aghny
Diplomat
Posts: 949
Founded: Mar 22, 2013
Ex-Nation

Postby Aghny » Mon Apr 08, 2013 6:10 pm

Sociobiology wrote:
Aghny wrote:
I'd disagree. If you don't understand something, then you could be just as wrong as right.

so the earth continuing to orbit the sun tomorrow is exactly as likely as it stopping tomorrow, because we don't fully understand gravity?


Why so many strawmen?

Salandriagado wrote:You are, in fact, exactly wrong. Scientific claims cannot be proven. Some philosophical ones (those of the more formal side of philosophy) can be proven.


Prove both your claims.

Grad Duchy of Luxembourg wrote:Aghny, are you going around pretending to be a professor again? :roll: tsk* tsk* tsk* Am I going to have to put you in your place again?


Any evidence to back up your claim that I am not a professor? And if you can get me to be a veterinarian and a successful investor, I would seriously love you, though not sexually.

New Rogernomics
Powerbroker
Posts: 9423
Founded: Aug 22, 2006
Left-wing Utopia

Postby New Rogernomics » Mon Apr 08, 2013 6:27 pm

It depends on what you define as possible with AI; it is technically non-life (being artificial) unless it has biological components. An AI capable of sentient thought could be grounds for it to have rights over its continued existence; otherwise it is a computer simulation and cannot lose a life it never had.
Last edited by New Rogernomics on Mon Apr 08, 2013 6:27 pm, edited 1 time in total.
Herald (Vice-Delegate) of Lazarus
First Citizen (PM) of Lazarus
Chocolate & Italian ice addict
"Ooh, we don't talk about Bruno, no, no, no..."
  • Former Proedroi (Minister) of Foreign Affairs of Lazarus
  • Former Lazarus Delegate (Humane Republic of Lazarus, 2015)
  • Minister of Culture & Media (Humane Republic of Lazarus)
  • Foreign Minister of The Ascendancy (RIP, and purged)
  • Senator of The Ascendancy (RIP, and purged)
  • Interior Commissioner of Lazarus (Pre-People's Republic of Lazarus)
  • At some point a member of the Grey family...then father vanished...
  • Foreign Minister of The Last Kingdom (RIP)
  • ADN:DSA Rep for Eastern Roman Empire
  • Honoratus Servant of the Holy Land (Eastern Roman Empire)
  • UN/WA Delegate of Trans Atlantice (RIP)

Zimmer Twins
Diplomat
Posts: 538
Founded: Dec 06, 2012
Ex-Nation

Postby Zimmer Twins » Tue Apr 09, 2013 9:33 am

Conscentia wrote:
Zimmer Twins wrote:Hmm, that's a tough question...
I think the AI would have to score at least 65% on the Turing test in order to be considered a creature separate from being a robot.

Thy expectations are a bit high. Actual humans don't even score above 65%.
http://www.cleverbot.com/human

One sec, that's the site I checked for the stats...
OK then, 63%!!!!
Left: 3.23
Libertarian: 2.43
Non Interventionalist: -1.6
Cultural Liberal: -3.87

Pro-Choice, Same Sex Marriage, Renewable Power, Space Travel, Tests on nuclear power.

SOPA/PIPA/CISPA, Pro-Life, Homophobes, Fossil Fuels.

OOC: I'm just a guy who likes video games and knows nothing about politics. Wow I am addicted to this game. 500 posts in about 4 months.

Wamitoria
Post Marshal
Posts: 18852
Founded: Jun 28, 2010
Ex-Nation

Postby Wamitoria » Tue Apr 09, 2013 9:59 am

The minute it achieves sapience, it should be treated like any other sapient creature.
Wonder where all the good posters went? Look no further!

Hurry, before the Summer Nazis show up again!

Zimmer Twins
Diplomat
Posts: 538
Founded: Dec 06, 2012
Ex-Nation

Postby Zimmer Twins » Tue Apr 09, 2013 9:59 am

Wamitoria wrote:The minute it achieves sapience, it should be treated like any other sapient creature.

Until it starts to kill us with neurotoxin.
Left: 3.23
Libertarian: 2.43
Non Interventionalist: -1.6
Cultural Liberal: -3.87

Pro-Choice, Same Sex Marriage, Renewable Power, Space Travel, Tests on nuclear power.

SOPA/PIPA/CISPA, Pro-Life, Homophobes, Fossil Fuels.

OOC: I'm just a guy who likes video games and knows nothing about politics. Wow I am addicted to this game. 500 posts in about 4 months.

The Rich Port
Post Czar
Posts: 38094
Founded: Jul 29, 2008
Ex-Nation

Postby The Rich Port » Tue Apr 09, 2013 10:17 am

Morganutopia wrote:
The Rich Port wrote:
... Dumb post, or THE DUMBEST post?

Why is it dumb? Well, we kill all the time.
We do not give the "right to live" to all humans, so why give it to an AI?


There's a reason murder is illegal in most countries.

SOME people may not respect a person's right to life, but that's because said person is most likely a criminal.

Zimmer Twins wrote:Until it starts to kill us with neurotoxin


If we don't treat it like a person, IMO, it would be more likely to try to kill us.
Last edited by The Rich Port on Tue Apr 09, 2013 10:26 am, edited 2 times in total.

Zimmer Twins
Diplomat
Posts: 538
Founded: Dec 06, 2012
Ex-Nation

Postby Zimmer Twins » Tue Apr 09, 2013 10:18 am

The Rich Port wrote:
Morganutopia wrote:Why is it dumb? Well, we kill all the time.
We do not give the "right to live" to all humans, so why give it to an AI?


There's a reason murder is illegal in most countries.

SOME people may not respect a person's right to life, but that's because said person is most likely a criminal.

Zimmer Twins wrote:Until it starts to kill us with neurotoxin.

If we don't treat it like a person, IMO, it would be more likely to try to kill us.

I was referencing a game.
Left: 3.23
Libertarian: 2.43
Non Interventionalist: -1.6
Cultural Liberal: -3.87

Pro-Choice, Same Sex Marriage, Renewable Power, Space Travel, Tests on nuclear power.

SOPA/PIPA/CISPA, Pro-Life, Homophobes, Fossil Fuels.

OOC: I'm just a guy who likes video games and knows nothing about politics. Wow I am addicted to this game. 500 posts in about 4 months.

Conscentia
Postmaster of the Fleet
Posts: 26681
Founded: Feb 04, 2011
Ex-Nation

Postby Conscentia » Tue Apr 09, 2013 11:06 am

Zimmer Twins wrote:
Wamitoria wrote:The minute it achieves sapience, it should be treated like any other sapient creature.

Until it starts to kill us with neurotoxin.

Er, why should it be treated differently to any other sapient if it tries to kill us with neurotoxin?
If a sapient entity decides to kill us with neurotoxin, we try to neutralise that entity.
Rights for GLaDOS! :p
Last edited by Conscentia on Tue Apr 09, 2013 11:07 am, edited 1 time in total.

Camicon
Postmaster-General
Posts: 14377
Founded: Aug 26, 2010
Ex-Nation

Postby Camicon » Tue Apr 09, 2013 11:06 am

Zimmer Twins wrote:
Wamitoria wrote:The minute it achieves sapience, it should be treated like any other sapient creature.

Until it starts to kill us with neurotoxin.

If it starts to kill us with neurotoxin, we should still treat it like we would any other sapient being, i.e. killing it.
Hey/They
Active since May, 2009
Country of glowing hearts, and patrons of the arts
Help me out
Star spangled madness, united sadness
Count me out
The Trews, Under The Sun
No human is more human than any other. - Lieutenant-General Roméo Antonius Dallaire
Don't shine for swine. - Metric, Soft Rock Star
Love is hell. Hell is love. Hell is asking to be loved. - Emily Haines and the Soft Skeleton, Detective Daughter

Why (Male) Rape Is Hilarious [because it has to be]

The Emerald Legion
Postmaster-General
Posts: 10695
Founded: Mar 18, 2011
Father Knows Best State

Postby The Emerald Legion » Tue Apr 09, 2013 1:06 pm

Phocidaea wrote:I hope not.

I'd pull the plug on the thing the millisecond it decided to do something stupid. Don't want any GLaDOSes or HAL 9000s.


And you're going to get past the angry mobs of Singularity believers how? Be a tough guy and kill them too? I doubt it.

Saubre wrote:Unless an AI suddenly feels emotions without outside influence or asks questions about its existence, then one could argue for its sentience.

Though, if the AI does manage to show emotion of any sort, then it's most likely because that emotion was given artificially.

I can see no possibility of a machine of any sort one day suddenly gaining sentience by itself.

Only humans can have emotions (don't be a smart ass and say animals have emotions too; what I'm implying should be clear), hence emotions are an alien concept to AI unless programmed or given by their creators.

Therefore, they can't achieve true sentience, since most likely their very emotions are fake.


And what weight do emotions have? They are nothing more than a primitive system to assign priorities. Eating feels good, so you do it. Jumping into a thornbush feels bad... so you don't.

In any case, I just love how the Luddites and Bio-Chauvinists just assume they'd stroll in and destroy it with no resistance. For the most part, the people working on this sort of thing are the types who believe it would be of benefit to have it around.

Say you built the best car ever, and someone walks in and starts trying to smash it with a hammer. When you object, they say, "It's a threat to humanity!" Are you going to just go, "Oh... well, that's OK then"? No. You'd call security and have the loonie hauled away. The same would happen here.

Or even more likely... they wouldn't make it in to begin with.
Last edited by The Emerald Legion on Tue Apr 09, 2013 1:12 pm, edited 1 time in total.
"23.The unwise man is awake all night, and ponders everything over; when morning comes he is weary in mind, and all is a burden as ever." - Havamal

Genivaria
Khan of Spam
Posts: 69790
Founded: Mar 29, 2011
Iron Fist Consumerists

Postby Genivaria » Tue Apr 09, 2013 1:12 pm

Freedom is the right of all Sentient Beings. -Optimus Prime.

Aghny
Diplomat
Posts: 949
Founded: Mar 22, 2013
Ex-Nation

Postby Aghny » Tue Apr 09, 2013 6:31 pm

The Emerald Legion wrote:
Phocidaea wrote:I hope not.

I'd pull the plug on the thing the millisecond it decided to do something stupid. Don't want any GLaDOSes or HAL 9000s.


And you're going to get past the angry mobs of Singularity believers how? Be a tough guy and kill them too? I doubt it.

Saubre wrote:Unless an AI suddenly feels emotions without outside influence or asks questions about its existence, then one could argue for its sentience.

Though, if the AI does manage to show emotion of any sort, then it's most likely because that emotion was given artificially.

I can see no possibility of a machine of any sort one day suddenly gaining sentience by itself.

Only humans can have emotions (don't be a smart ass and say animals have emotions too; what I'm implying should be clear), hence emotions are an alien concept to AI unless programmed or given by their creators.

Therefore, they can't achieve true sentience, since most likely their very emotions are fake.


And what weight do emotions have? They are nothing more than a primitive system to assign priorities. Eating feels good, so you do it. Jumping into a thornbush feels bad... so you don't.

In any case, I just love how the Luddites and Bio-Chauvinists just assume they'd stroll in and destroy it with no resistance. For the most part, the people working on this sort of thing are the types who believe it would be of benefit to have it around.


Say you built the best car ever, and someone walks in and starts trying to smash it with a hammer. When you object, they say, "It's a threat to humanity!" Are you going to just go, "Oh... well, that's OK then"? No. You'd call security and have the loonie hauled away. The same would happen here.

Or even more likely... they wouldn't make it in to begin with.


Well, one can say that the majority of people aren't the ones working on something like this, nor do they support it. So unless the government and the army itself are providing protection, it is not going to be too hard. Then again, there is the matter of other countries.

Take the case of nukes, for instance.

There might be collateral damage, of course, but sometimes a few have to die for the benefit of millions of others.

The Zeonic States
Postmaster-General
Posts: 12078
Founded: Jul 29, 2012
Ex-Nation

Postby The Zeonic States » Tue Apr 09, 2013 6:47 pm

Genivaria wrote:Freedom is the right of all Sentient Beings. -Optimus Prime.


"Humanity is the only pure expression of sentience" -Emperor Palpatine
National Imperialist-Freedom Party

Proud member of the stone wall alliance

Agent Maine: of NSG's Official Project Freelancer

[Fires of the Old Republic Role Play]http://forum.nationstates.net/viewtopic.php?f=31&t=239203

Sociobiology
Post Marshal
Posts: 18396
Founded: Aug 18, 2010
Ex-Nation

Postby Sociobiology » Tue Apr 09, 2013 7:01 pm

Aghny wrote:
Sociobiology wrote:so the earth continuing to orbit the sun tomorrow is exactly as likely as it stopping tomorrow, because we don't fully understand gravity?


Why so many strawmen?

No strawman; you are confusing not fully understanding every aspect of something with not understanding anything about it.

Salandriagado wrote:You are, in fact, exactly wrong. Scientific claims cannot be proven. Some philosophical ones (those of the more formal side of philosophy) can be proven.


Prove both your claims.


Better yet, prove inductive reasoning using induction.
I think we risk becoming the best informed society that has ever died of ignorance. ~Reuben Blades

I got quite annoyed after the Haiti earthquake. A baby was taken from the wreckage and people said it was a miracle. It would have been a miracle had God stopped the earthquake. More wonderful was that a load of evolved monkeys got together to save the life of a child that wasn't theirs. ~Terry Pratchett

Aghny
Diplomat
Posts: 949
Founded: Mar 22, 2013
Ex-Nation

Postby Aghny » Tue Apr 09, 2013 7:08 pm

Sociobiology wrote:
Aghny wrote:
Why so many strawmen?

No strawman; you are confusing not fully understanding every aspect of something with not understanding anything about it.


Prove both your claims.


Better yet, prove inductive reasoning using induction.


And more and more strawmen...

http://en.wikipedia.org/wiki/Straw_man
Last edited by Aghny on Tue Apr 09, 2013 7:09 pm, edited 1 time in total.

Sociobiology
Post Marshal
Posts: 18396
Founded: Aug 18, 2010
Ex-Nation

Postby Sociobiology » Tue Apr 09, 2013 7:18 pm

Akitawnui wrote:
Sociobiology wrote:but novel reasoning cannot, and highly variable reasoning cannot be at a functional level, because said machine will run out of time in the universe before it runs out of possibilities.


I agree but that machine can refer to the human brain as well

But human brains don't do that; in fact, they rarely produce more than seven contingencies, as the brain can't handle more than that. The brain works because it is imprecise and has a lot of assumptions built in. Violate too many of those and it does much worse than a computer.
In fact, the thing the brain is worst at is novel situations, but it still works better than any computer, at least for medium-sized things on planets moving at medium speeds.


There is a really odd brain injury that produces that effect, where the person will sit at an intersection until their car runs out of gas because they are trying to compare every variable about two different routes; they can't just go "fuck it" and pick one route based on one or two variables.

And that last step is the one an AI will need.
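That "fuck it" step, committing on the basis of one or two variables instead of comparing everything, is essentially what decision theorists call satisficing. A minimal sketch in Python; the routes and all their attribute values are invented purely for illustration:

```python
# Satisficing route choice: rather than weighing every variable about
# every route, rank the options on one or two salient criteria and commit.
# The routes and attribute values below are made up for illustration.

routes = {
    "highway":  {"time_min": 25, "distance_km": 30, "tolls": 2},
    "backroad": {"time_min": 32, "distance_km": 22, "tolls": 0},
}

def satisfice(options, keys=("time_min",)):
    """Pick the option with the lowest total on a few chosen keys,
    ignoring every other variable -- the 'fuck it, just go' step."""
    return min(options, key=lambda name: sum(options[name][k] for k in keys))

print(satisfice(routes))              # highway: fastest
print(satisfice(routes, ("tolls",)))  # backroad: no tolls
```

Add more keys and the comparison grows; leave it at one or two and the decision always terminates, which is the point.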
I think we risk becoming the best informed society that has ever died of ignorance. ~Reuben Blades

I got quite annoyed after the Haiti earthquake. A baby was taken from the wreckage and people said it was a miracle. It would have been a miracle had God stopped the earthquake. More wonderful was that a load of evolved monkeys got together to save the life of a child that wasn't theirs. ~Terry Pratchett

Sociobiology
Post Marshal
Posts: 18396
Founded: Aug 18, 2010
Ex-Nation

Postby Sociobiology » Tue Apr 09, 2013 7:21 pm

Aghny wrote:
Sociobiology wrote:No strawman; you are confusing not fully understanding every aspect of something with not understanding anything about it.



Better yet, prove inductive reasoning using induction.


And more and more strawmen...

http://en.wikipedia.org/wiki/Straw_man

Yes, read your source, then read your own statement:

If you don't understand something, then you could be just as wrong as right.


This is a false statement, based on a false dichotomy and an ignorance of probability.
Pointing that out by applying the exact same logic to a different situation IS NOT a strawman.
I think we risk becoming the best informed society that has ever died of ignorance. ~Reuben Blades

I got quite annoyed after the Haiti earthquake. A baby was taken from the wreckage and people said it was a miracle. It would have been a miracle had God stopped the earthquake. More wonderful was that a load of evolved monkeys got together to save the life of a child that wasn't theirs. ~Terry Pratchett

Bralia
Post Czar
Posts: 31250
Founded: Mar 07, 2010
Democratic Socialists

Postby Bralia » Tue Apr 09, 2013 7:23 pm

The Emerald Legion wrote:
Saubre wrote:Unless an AI suddenly feels emotions without outside influence or asks questions about its existence, then one could argue for its sentience.

Though, if the AI does manage to show emotion of any sort, then it's most likely because that emotion was given artificially.

I can see no possibility of a machine of any sort one day suddenly gaining sentience by itself.

Only humans can have emotions (don't be a smart ass and say animals have emotions too; what I'm implying should be clear), hence emotions are an alien concept to AI unless programmed or given by their creators.

Therefore, they can't achieve true sentience, since most likely their very emotions are fake.


And what weight do emotions have? They are nothing more than a primitive system to assign priorities. Eating feels good, so you do it. Jumping into a thornbush feels bad... so you don't.

In any case, I just love how the Luddites and Bio-Chauvinists just assume they'd stroll in and destroy it with no resistance. For the most part, the people working on this sort of thing are the types who believe it would be of benefit to have it around.

Say you built the best car ever, and someone walks in and starts trying to smash it with a hammer. When you object, they say, "It's a threat to humanity!" Are you going to just go, "Oh... well, that's OK then"? No. You'd call security and have the loonie hauled away. The same would happen here.

Or even more likely... they wouldn't make it in to begin with.

A minor correction: we don't eat because it "feels good"; we eat because our biological platforms require energy.
Romantic slut. Self-deprecating egotist. Benevolent communist.

Aghny
Diplomat
Posts: 949
Founded: Mar 22, 2013
Ex-Nation

Postby Aghny » Tue Apr 09, 2013 7:25 pm

Sociobiology wrote:This is a false statement, based on a false dichotomy and an ignorance of probability.
Pointing that out by applying the exact same logic to a different situation IS NOT a strawman.


1) Probability is a chance percentage. Just because you have a 99.9999...% chance of being right doesn't mean that you can't be wrong. If, from the wording of my post, you deduced that I was implying there was a 50-50 chance, then to clear it up: I wasn't.

2) Read the definition of strawman again. If you still don't get it, any argument against it that I make won't be of much use.
Last edited by Aghny on Tue Apr 09, 2013 7:27 pm, edited 1 time in total.

Occupied Deutschland
Post Marshal
Posts: 18796
Founded: Oct 01, 2010
Ex-Nation

Postby Occupied Deutschland » Tue Apr 09, 2013 7:26 pm

Genivaria wrote:Freedom is the right of all Sentient Beings. -Optimus Prime.

“It’s just such ignorance which forever relegates you to the ranks of underling.” -Megatron :p

Sure a truly sentient/self-aware/etc. AI would have a right to live.
Now we just need to figure out where that point is. Hmmm...Perhaps if we built a computer to calculate it...
I'm General Patton.
Even those who are gone are with us as we go on.

Been busy lately--not around much.

Aghny
Diplomat
Posts: 949
Founded: Mar 22, 2013
Ex-Nation

Postby Aghny » Tue Apr 09, 2013 7:28 pm

Occupied Deutschland wrote:
Genivaria wrote:Freedom is the right of all Sentient Beings. -Optimus Prime.

“It’s just such ignorance which forever relegates you to the ranks of underling.” -Megatron :p

Sure a truly sentient/self-aware/etc. AI would have a right to live.
Now we just need to figure out where that point is. Hmmm...Perhaps if we built a computer to calculate it...


It would give the answer 42 and mock us for lacking the intelligence to ask the right question.

Occupied Deutschland
Post Marshal
Posts: 18796
Founded: Oct 01, 2010
Ex-Nation

Postby Occupied Deutschland » Tue Apr 09, 2013 7:31 pm

Aghny wrote:
Occupied Deutschland wrote:“It’s just such ignorance which forever relegates you to the ranks of underling.” -Megatron :p

Sure a truly sentient/self-aware/etc. AI would have a right to live.
Now we just need to figure out where that point is. Hmmm...Perhaps if we built a computer to calculate it...


It would give the answer 42 and mock us for lacking the intelligence to ask the right question.

Talk to any computer engineer and they'll probably say that second part is already happening :lol: .
I'm General Patton.
Even those who are gone are with us as we go on.

Been busy lately--not around much.

Rainbows and Rivers
Diplomat
Posts: 803
Founded: Mar 29, 2011
Ex-Nation

Postby Rainbows and Rivers » Tue Apr 09, 2013 7:32 pm

Sociobiology wrote:there is a really odd brain injury that produces that effect, where the person will sit at an intersection until their car runs out of gas because they are trying to compare every variable about two different routes; they can't just go "fuck it" and pick one route based on one or two variables.

and that last step is the one an AI will need.


Yeah, this is already a staple of AI programming. I myself programmed a very simple application for coloring the map of the United States which could dynamically decide on the best state to color next and what color to make it.

Weighing the potential values of different outcomes isn't the last step of AI programming; it's the first.
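That "decide on the best state to color next" idea can be sketched as a greedy, most-constrained-first coloring heuristic. This is an illustrative reconstruction, not the actual application described above; the four-state adjacency list is a small real fragment of the US map:

```python
# Greedy map coloring: repeatedly pick the uncolored state whose already-
# colored neighbors use the most distinct colors (most constrained first),
# then assign it the lowest color not used by any neighbor.
# Tiny real fragment of US state adjacencies (Four Corners region).

borders = {
    "UT": ["CO", "AZ"],
    "CO": ["UT", "NM"],
    "AZ": ["UT", "NM"],
    "NM": ["CO", "AZ"],
}

def color_map(adj):
    colors = {}
    while len(colors) < len(adj):
        # "Best state to color next": the most constrained one.
        state = max(
            (s for s in adj if s not in colors),
            key=lambda s: len({colors[n] for n in adj[s] if n in colors}),
        )
        used = {colors[n] for n in adj[state] if n in colors}
        # "What color to make it": the lowest color no neighbor has.
        colors[state] = next(c for c in range(len(adj)) if c not in used)
    return colors

colors = color_map(borders)
# No two bordering states share a color:
assert all(colors[a] != colors[b] for a in borders for b in borders[a])
```

Greedy ordering doesn't guarantee the minimum number of colors in general, but on this little four-state cycle it settles on two.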

Aghny
Diplomat
Posts: 949
Founded: Mar 22, 2013
Ex-Nation

Postby Aghny » Tue Apr 09, 2013 7:32 pm

Occupied Deutschland wrote:
Aghny wrote:
It would result in the answer being 42 and mock our lack of intelligence to ask the right question.

Talk to any computer engineer and they'll probably say that second part is already happening :lol: .


True :lol2:
