Would an AI have a right to live?

For discussion and debate about anything. (Not a roleplay related forum; out-of-character commentary only.)

User avatar
Sociobiology
Post Marshal
 
Posts: 18396
Founded: Aug 18, 2010
Ex-Nation

Postby Sociobiology » Mon Apr 08, 2013 3:00 pm

Aghny wrote:
Sociobiology wrote:
rights are not objective
nor is law or justice


Neither did I say they were.

But nor are they pure opinion; they are consensus.
I think we risk becoming the best informed society that has ever died of ignorance. ~Reuben Blades

I got quite annoyed after the Haiti earthquake. A baby was taken from the wreckage and people said it was a miracle. It would have been a miracle had God stopped the earthquake. More wonderful was that a load of evolved monkeys got together to save the life of a child that wasn't theirs. ~Terry Pratchett

User avatar
Yankee Empire
Senator
 
Posts: 4186
Founded: Aug 01, 2012
Ex-Nation

Postby Yankee Empire » Mon Apr 08, 2013 3:01 pm

Sociobiology wrote:
Aghny wrote:
Neither did i say they were.

but nor are they pure opinion, they are consensus.

Consensus of opinion. That said, I think philosophically some opinions can be shown to be more valid than others.
Economic Left/Right: -6.50
Social Libertarian/Authoritarian: 2.05


Pro: U.S., Diplomatic Militarism, Imperialism, Patriotism/Civic Nationalism, Cosmopolitanism, Stoicism, Authoritarianism, Classical Liberalism, Unionism, Centralization (usually), Federalism, Corporatism.
Anti: Tribalism, Secessionism (usually), Decentralization, Pure Capitalism/State-controlled economics, Misanthropy, Cruelty, Cowardice, Pacifism, Hedonism, Corporatocracy.
Vice-Chairman of the National-Imperialist-FreedomParty
"My country, right or wrong; if right, to be kept right; and if wrong, to be set right."-Carl Schurz

User avatar
Crumlark
Ambassador
 
Posts: 1809
Founded: Jul 08, 2011
Ex-Nation

Postby Crumlark » Mon Apr 08, 2013 3:02 pm

Saubre wrote:Unless an AI suddenly feels emotions without outside influence or asks questions about its existence, then one could argue for its sentience.

Though, If the AI does manage to show emotion of any sort, then it's most likely due to the fact it was given artificially.

I can see no possibility as to how a machine of any sort would suddenly gain one day sentience by itself.

Only humans can have emotions, (Don't be a smart ass and say animals have emotions too. What I'm implying should be clear.) hence emotions are an alien concept to AI unless programmed or given by their creators.

Therefore, they can't achieve true sentience since most likely their very emotions are fake.

From when I was little, I did not know what sadness was. I noticed negative energy, and did not understand it. Eventually, my parents took me aside and explained it to me. Did that make me less human, that I had to be 'programmed' to feel sadness? Does that make every time I feel sadness fake? And the argument that emotions equal sapience is erroneous. Emotion is a key part of humanity, but not of self-awareness.
Anarchist. I'm dating TotallyNotEvilLand, and I love him. I am made whole.

Melly, merely living, surviving, is to suffer. You must fill your life with more to be happy.
Liberate Mallorea and Riva!

User avatar
Aghny
Diplomat
 
Posts: 949
Founded: Mar 22, 2013
Ex-Nation

Postby Aghny » Mon Apr 08, 2013 3:02 pm

Sociobiology wrote:
Aghny wrote:
Neither did i say they were.

but nor are they pure opinion, they are consensus.


Nope. Considering most are created and passed by a select few.

User avatar
Sociobiology
Post Marshal
 
Posts: 18396
Founded: Aug 18, 2010
Ex-Nation

Postby Sociobiology » Mon Apr 08, 2013 3:08 pm

Yankee Empire wrote:
Sociobiology wrote:but nor are they pure opinion, they are consensus.

Consensus of Opinion, that said I think philosophically some opinions can be show to be more valid than others.

Sure, but don't confuse valid with sound. Philosophy cannot determine whether an opinion is sound.

User avatar
Sociobiology
Post Marshal
 
Posts: 18396
Founded: Aug 18, 2010
Ex-Nation

Postby Sociobiology » Mon Apr 08, 2013 3:09 pm

Aghny wrote:
Sociobiology wrote:but nor are they pure opinion, they are consensus.


Nope. Considering most are created and passed by a select few.

So you don't know what consensus means?

User avatar
Aghny
Diplomat
 
Posts: 949
Founded: Mar 22, 2013
Ex-Nation

Postby Aghny » Mon Apr 08, 2013 3:11 pm

Sociobiology wrote:
Aghny wrote:
Nope. Considering most are created and passed by a select few.

so you don't know what consensus means?


Sure, if you mean the consensus of a handful of individuals while there are more than 6 billion people on Earth, then I get exactly what you mean.

User avatar
Trotskylvania
Post Marshal
 
Posts: 17217
Founded: Jul 07, 2006
Ex-Nation

Postby Trotskylvania » Mon Apr 08, 2013 3:11 pm

Sociobiology wrote:
Yankee Empire wrote:Consensus of Opinion, that said I think philosophically some opinions can be show to be more valid than others.

sure but don't confuse valid with sound. Philosophy cannot determine if an opinion is sound.

Only if you think that the various methods of empirical inquiry are distinct from philosophy.
Your Friendly Neighborhood Ultra - The Left Wing of the Impossible
Putting the '-sadism' in Posadism


"The hell of capitalism is the firm, not the fact that the firm has a boss."- Bordiga

User avatar
Akitawnui
Lobbyist
 
Posts: 16
Founded: Apr 06, 2013
Ex-Nation

Postby Akitawnui » Mon Apr 08, 2013 3:12 pm

Dark Luna wrote:We shouldn't create artificial intelligence capable of being conscious in the first place.


And there the danger lies.

Frankly, what you think to be science is wrong. Incomplete.


Thank you for that.
There's only one instant, and it's right now. And it's eternity.

The idea is to remain in a state of constant departure while always arriving.

User avatar
Trotskylvania
Post Marshal
 
Posts: 17217
Founded: Jul 07, 2006
Ex-Nation

Postby Trotskylvania » Mon Apr 08, 2013 3:12 pm

Dark Luna wrote:
Sociobiology wrote:any machine capable of complex variable/novel tasks will have to be conscious.
you can't have one without the other.


Not necessarily.

No, that's pretty much the definition of consciousness as far as we can tell.

User avatar
Aghny
Diplomat
 
Posts: 949
Founded: Mar 22, 2013
Ex-Nation

Postby Aghny » Mon Apr 08, 2013 3:13 pm

Trotskylvania wrote:
Dark Luna wrote:
Not necessarily.

No, that's pretty much the definition of consciousness as far as we can tell.


We don't really understand consciousness fully though.

User avatar
Trotskylvania
Post Marshal
 
Posts: 17217
Founded: Jul 07, 2006
Ex-Nation

Postby Trotskylvania » Mon Apr 08, 2013 3:15 pm

Aghny wrote:
Trotskylvania wrote:No, that's pretty much the definition of consciousness as far as we can tell.


We don't really understand consciousness fully though.

We don't have to in order to make this determination as far as the actual facts of complex computational/decision making systems go.

User avatar
Aghny
Diplomat
 
Posts: 949
Founded: Mar 22, 2013
Ex-Nation

Postby Aghny » Mon Apr 08, 2013 3:17 pm

Trotskylvania wrote:
Aghny wrote:
We don't really understand consciousness fully though.

We don't have to in order to make this determination as far as the actual facts of complex computational/decision making systems go.


I'd disagree. If you don't understand something, then you could be just as wrong as right.

User avatar
Akitawnui
Lobbyist
 
Posts: 16
Founded: Apr 06, 2013
Ex-Nation

Postby Akitawnui » Mon Apr 08, 2013 3:20 pm

Trotskylvania wrote:
Aghny wrote:
We don't really understand consciousness fully though.

We don't have to in order to make this determination as far as the actual facts of complex computational/decision making systems go.


Complex decision-making can be strictly deductive, or programmed as such. There are no parallels with complex abstract thinking.

User avatar
Trotskylvania
Post Marshal
 
Posts: 17217
Founded: Jul 07, 2006
Ex-Nation

Postby Trotskylvania » Mon Apr 08, 2013 3:26 pm

Aghny wrote:
Trotskylvania wrote:We don't have to in order to make this determination as far as the actual facts of complex computational/decision making systems go.


I'd disagree. If you don't understand something, then you could be just as wrong as right.

No, there are many degrees of wrong, just as there are many degrees of understanding.

And what we do understand about consciousness makes it highly improbable that any future change in our understanding will mean we can have conscious entities that are not capable of complex, variable and novel tasks, or vice versa. Just as, even though gravitation is poorly understood, there is never going to be a scientific discovery that leads us to conclude that massive bodies don't attract.

User avatar
Aghny
Diplomat
 
Posts: 949
Founded: Mar 22, 2013
Ex-Nation

Postby Aghny » Mon Apr 08, 2013 3:29 pm

Trotskylvania wrote:
Aghny wrote:
I'd disagree. If you don't understand something, then you could be just as wrong as right.

No, there are many degrees of wrong, just as there are many degrees of understanding.

And from what we do understand about consciousness means that it will be highly improbable that any future change in our understanding will mean that we'll be able to have conscious entities that are not capable of complex, variable and novel tasks, and vice versa. Just like even though gravitation is poorly understood, there is really never going to be a scientific discovery that would lead us to conclude that massive bodies don't attract.


I was going more along the lines that those factors alone might not entirely and accurately define consciousness.
Last edited by Aghny on Mon Apr 08, 2013 3:30 pm, edited 1 time in total.

User avatar
Phocidaea
Negotiator
 
Posts: 5316
Founded: Jul 21, 2012
Ex-Nation

Postby Phocidaea » Mon Apr 08, 2013 3:37 pm

I hope not.

I'd pull the plug on the thing the millisecond it decided to do something stupid. Don't want any GLaDOSes or HAL 9000s.
Call me Phoca.
Senator [Unknown] of the Liberal Democrats in NSG Senate.
Je suis Charlie: Because your feels don't justify murder.

User avatar
Sociobiology
Post Marshal
 
Posts: 18396
Founded: Aug 18, 2010
Ex-Nation

Postby Sociobiology » Mon Apr 08, 2013 3:43 pm

Trotskylvania wrote:
Sociobiology wrote:sure but don't confuse valid with sound. Philosophy cannot determine if an opinion is sound.

Only if you think that the various methods of empirical inquiry are distinct from philosophy.

They are generally defined as such, being neither inductive nor deductive.

User avatar
Sociobiology
Post Marshal
 
Posts: 18396
Founded: Aug 18, 2010
Ex-Nation

Postby Sociobiology » Mon Apr 08, 2013 3:45 pm

Aghny wrote:
Trotskylvania wrote:We don't have to in order to make this determination as far as the actual facts of complex computational/decision making systems go.


I'd disagree. If you don't understand something, then you could be just as wrong as right.

So the Earth continuing to orbit the Sun tomorrow is exactly as likely as it stopping tomorrow, because we don't fully understand gravity?
Last edited by Sociobiology on Mon Apr 08, 2013 3:45 pm, edited 1 time in total.

User avatar
Sociobiology
Post Marshal
 
Posts: 18396
Founded: Aug 18, 2010
Ex-Nation

Postby Sociobiology » Mon Apr 08, 2013 3:48 pm

Akitawnui wrote:
Trotskylvania wrote:We don't have to in order to make this determination as far as the actual facts of complex computational/decision making systems go.


Complex decision making can be strictly deductive, or programmed as such.

But novel reasoning cannot, and highly variable reasoning cannot at any functional level, because such a machine would run out of time in the universe before it ran out of possibilities.

User avatar
Akitawnui
Lobbyist
 
Posts: 16
Founded: Apr 06, 2013
Ex-Nation

Postby Akitawnui » Mon Apr 08, 2013 4:11 pm

Sociobiology wrote:
Akitawnui wrote:
Complex decision making can be strictly deductive, or programmed as such.

but novel reasoning cannot, and highly variable reasoning cannot be at a functional level, because said machine will run out of time in the universe before it runs out of possibilities.


I agree, but "that machine" can refer to the human brain as well.

User avatar
Salandriagado
Postmaster of the Fleet
 
Posts: 22831
Founded: Apr 03, 2008
Ex-Nation

Postby Salandriagado » Mon Apr 08, 2013 4:43 pm

Saubre wrote:Unless an AI suddenly feels emotions without outside influence or asks questions about its existence, then one could argue for its sentience.

Though, If the AI does manage to show emotion of any sort, then it's most likely due to the fact it was given artificially.

I can see no possibility as to how a machine of any sort would suddenly gain one day sentience by itself.

Only humans can have emotions, (Don't be a smart ass and say animals have emotions too. What I'm implying should be clear.) hence emotions are an alien concept to AI unless programmed or given by their creators.

Therefore, they can't achieve true sentience since most likely their very emotions are fake.


You really have no idea what "AI" means, do you?
Cosara wrote:
Anachronous Rex wrote:Good thing most a majority of people aren't so small-minded, and frightened of other's sexuality.

Over 40% (including me), are, so I fixed the post for accuracy.

Vilatania wrote:
Salandriagado wrote:
Notice that the link is to the notes from a university course on probability. You clearly have nothing beyond the most absurdly simplistic understanding of the subject.
By choosing 1, you no longer have 0 probability of choosing 1. End of subject.

(read up the quote stack)

Deal. £3000 do?

Of course.

User avatar
Salandriagado
Postmaster of the Fleet
 
Posts: 22831
Founded: Apr 03, 2008
Ex-Nation

Postby Salandriagado » Mon Apr 08, 2013 4:45 pm

Aghny wrote:
Camicon wrote:
What I want to say would undoubtedly get me banned, so I won't say it.

However...
This entire thread is based upon the hypothesis that we create a sapient AI. SAPIENCE DEFINES PERSONHOOD.
This thread is based up on the hypothesis that WE CREATED AN ARTIFICIAL PERSON.
The argument is about whether or not the PEOPLE ARE DESERVING OF BASIC HUMAN RIGHTS.

Get it?


Only a few things.

1) Sapience doesn't define personhood scientifically


"Person" is not a scientific concept. It is a legal term.

2) AI are not persons unless of course proved scientifically
3) AI are not people unless of course proved scientifically


AIs, by definition, are persons.


Aghny wrote:
Esternial wrote:They can be applied in science, though. Doesn't make them less useful.

In my opinion philosophy is quite useful in the field of science.


Only you can't prove ethical or philosophical claims, as far as I am aware. Then again, those two are not really my strong fields.


You are, in fact, exactly wrong. Scientific claims cannot be proven. Some philosophical ones (those on the more formal side of philosophy) can be proven.

Uelvan wrote:You'd have to prove the AI is alive, and not just mimicking something that is alive. If that were the case, you'd be nothing more than a little kid crying when his/her favorite toy was broken, and refusing to get a new one is done so out of selfishness. In such a case, the AI would need to be destroyed or reprogrammed.


Kindly prove that you are alive, and not just mimicking something that is alive.
Last edited by Salandriagado on Mon Apr 08, 2013 4:50 pm, edited 1 time in total.

User avatar
Grad Duchy of Luxembourg
Ambassador
 
Posts: 1925
Founded: Nov 22, 2012
Ex-Nation

Postby Grad Duchy of Luxembourg » Mon Apr 08, 2013 4:49 pm

Aghny wrote:
Anachronous Rex wrote:That is explicitly not what I am suggesting. Learn to read.

That is basically what you wrote. Learn to write better, then.


If you could say one intelligent thing about quantum mechanics, I would be impressed.


I did. Not my fault that you can't comprehend something that you have no knowledge of.

Aghny, are you going around pretending to be a professor again? :roll: tsk* tsk* tsk* Am I going to have to put you in your place again?
Economic Left/Right: -3.00
Member of Caninope Contingent

Social Libertarian/Authoritarian: -3.64

User avatar
Esternial
Technical Moderator
 
Posts: 54369
Founded: May 09, 2009
Inoffensive Centrist Democracy

Postby Esternial » Mon Apr 08, 2013 4:54 pm

Crumlark wrote:
Saubre wrote:Unless an AI suddenly feels emotions without outside influence or asks questions about its existence, then one could argue for its sentience.

Though, If the AI does manage to show emotion of any sort, then it's most likely due to the fact it was given artificially.

I can see no possibility as to how a machine of any sort would suddenly gain one day sentience by itself.

Only humans can have emotions, (Don't be a smart ass and say animals have emotions too. What I'm implying should be clear.) hence emotions are an alien concept to AI unless programmed or given by their creators.

Therefore, they can't achieve true sentience since most likely their very emotions are fake.

From when I was little, I did not know what sadness was. I noticed negative energy, and did not understand it. Eventually, my parents took me aside, and explained it to me. Did that make me less human, that I had to be 'programmed' to feel sadness? Does that make every time I feel sadness fake? And the argument that emotions equals sapience is erroneous. Emotion is a key part of humanity, but not self awareness.

It probably doesn't need emotions to be considered 'alive' in the first place. Why some want an AI to exhibit humanity per se is beyond me.
