Technological Singularity - will it happen? When?

For discussion and debate about anything. (Not a roleplay related forum; out-of-character commentary only.)

Is the Technological Singularity going to happen, sooner or later?

Yes: 34 (58%)
No: 25 (42%)

Total votes: 59

User avatar
Chessmistress
Negotiator
 
Posts: 5269
Founded: Mar 16, 2015
Iron Fist Consumerists

Technological Singularity - will it happen? When?

Postby Chessmistress » Tue Dec 06, 2016 9:40 am

For those who may not know what "technological singularity" is, a brief citation from the wiki:
The technological singularity (also, simply, the singularity)[1][2] is the hypothesis that the invention of artificial superintelligence will abruptly trigger runaway technological growth, resulting in unfathomable changes to human civilization.[3] According to this hypothesis, an upgradable intelligent agent (such as a computer running software-based artificial general intelligence) would enter a 'runaway reaction' of self-improvement cycles, with each new and more intelligent generation appearing more and more rapidly, causing an intelligence explosion and resulting in a powerful superintelligence that would, qualitatively, far surpass all human intelligence.[3][4] Science fiction author Vernor Vinge said in his essay The Coming Technological Singularity that this would signal the end of the human era, as the new superintelligence would continue to upgrade itself and would advance technologically at an incomprehensible rate.


Worth noting that the article isn't perfect, since Vernor Vinge isn't just a science fiction author but also an academic.

Some more information here:
https://www.newscientist.com/article/mg ... intellect/

And here is Vernor Vinge himself - this is perhaps the most interesting source, even if it's outdated in many ways:
https://www-rohan.sdsu.edu/faculty/ving ... arity.html


Given that:
Most of the scientists and academics who think a technological singularity can happen at all foresee it happening around 2040-2045; others think it's not going to happen.
Personally I think it'll happen but, given that many - maybe even most - expected technological developments have historically, and particularly in recent decades, arrived later than expected, we're going to have the technological singularity later than that, perhaps somewhere between 2060 and 2070.
I think it's going to happen, sooner or later, because it seems to me the most logical outcome of the technological developments we have seen over the last 70 years, since ENIAC.
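
To make the "runaway reaction" in the quoted definition concrete, here is a toy sketch with completely made-up numbers (my own illustration, not taken from the article or from Vinge): if each self-improvement cycle multiplies capability and also shrinks the time needed for the next cycle, the cycle lengths form a geometric series that sums to a finite date - that date is the "singularity" in this picture.

Code:
# Toy model of the "runaway reaction" quoted above. All numbers are invented for
# illustration: each cycle multiplies capability by `gain`, and a smarter system
# finishes its next cycle faster, so each cycle lasts `shrink` times the previous one.
# With shrink < 1, the cycle lengths form a geometric series with a finite sum.

def runaway(start_year=2016.0, first_cycle_years=10.0, gain=2.0, shrink=0.5, cycles=20):
    capability = 1.0                  # human-level baseline, arbitrary units
    year = start_year
    cycle_length = first_cycle_years
    for i in range(1, cycles + 1):
        year += cycle_length
        capability *= gain
        print(f"cycle {i:2d}: year {year:8.3f}, capability x{capability:,.0f}")
        cycle_length *= shrink
    horizon = start_year + first_cycle_years / (1.0 - shrink)
    print(f"limit of the series (the finite-time horizon): {horizon:.1f}")

runaway()

With these toy parameters the horizon lands at 2036; arguing about 2040-2045 versus 2060-2070 is really arguing about how long the first cycle is and how fast the cycles actually shrink, if they shrink at all.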

What do you think, NSGs?
Is it going to happen?
And, if so, when?

Note: the OP is focused on "will it happen?" and "when?" rather than on the consequences, simply because almost all scientists and academics agree that the consequences for the human race cannot be foreseen, given the nature of the subject.
But every thought about the consequences is welcome, naturally.
OOC:
Radical Feminist, caring about the oppressed gender, that's why I have a strong sense of justice.

PRO:
Radical Feminism (proudly SWERF - moderately TERF),
Gender abolitionism,
birth control and population control,
affirmative ongoing VERBAL consent,
death penalty for rapists.

AGAINST:
patriarchy,
pornography,
heteronormativity,
domestic violence and femicide.


Favorite Quotes: http://www.nationstates.net/nation=ches ... /id=403173

User avatar
Luziyca
Post Czar
 
Posts: 38285
Founded: Nov 13, 2011
Civil Rights Lovefest

Postby Luziyca » Tue Dec 06, 2016 10:58 am

If we manage to not destroy humanity in the intervening time period, then we can probably reach it. However, I imagine that the transition when the first superintelligences get deployed could cause substantial chaos, and would make humanity effectively obsolete, especially if they can produce themselves.
|||The Kingdom of Rwizikuru|||
Your feeble attempts to change the very nature of how time itself has been organized by mankind shall fall on barren ground and bear no fruit
Wiki | Facebook | Kylaris: the best region for eight years running | About me | YouTube | Political compass

User avatar
Tinfect
Negotiator
 
Posts: 5235
Founded: Jul 04, 2014
Democratic Socialists

Postby Tinfect » Tue Dec 06, 2016 10:58 am

Hopefully after the downfall of capitalism, so that Humanity can most benefit from it.
Raslin Seretis, Imperial Diplomatic Envoy, He/Him
Tolarn Feren, Civil Oversight Representative, He/Him
Jasot Rehlan, Military Oversight Representative, She/Her


Bisexual, Transgender (She/Her), Native-American, and Actual Communist™.

Imperium Central News Network: EMERGENCY ALERT: ALL CITIZENS ARE TO PROCEED TO EVACUATION SITES IMMEDIATELY | EMERGENCY ALERT: ALL FURTHER SUBSPACE SIGNALS AND SYSTEMS ARE TO BE DISABLED IMMEDIATELY | EMERGENCY ALERT: THE FOLLOWING SYSTEMS ARE ACCESS PROHIBITED BY STANDARD/BLACKOUT [Error: Format Unrecognized] | Indomitable Bastard #283
||||||||||||||||||||||||||||||||||||||||

User avatar
Spaceman Spliff
Chargé d'Affaires
 
Posts: 438
Founded: Oct 28, 2016
Ex-Nation

Postby Spaceman Spliff » Tue Dec 06, 2016 11:00 am

How far along are we with artificial intelligence, anyway?
Luziyca wrote:If we manage to not destroy humanity in the intervening time period, then we can probably reach it. However, I imagine that the transition when the first superintelligences get deployed could cause substantial chaos, and would make humanity effectively obsolete, especially if they can produce themselves.

We'll just program them to MLG carry us.
SPACEMAN SPLIFF: MASTURBANDO AD NSG ANNO DOMINI 2171
E stēllīs lībertās
I shit you not, this is the weirdest nation I've made in my history on NS. It's a puppet based on a joke. Shoo.

This poster has two political positions. One is of a speciesist, isolationist, corporatist, liberal spacer from the 26th century.
The other is of an alien studying human behavior.
The East Marches wrote:
The United Colonies of Earth wrote:the weak will be torn to pieces and feasted on by the strong, like morsels of steak


I am TEM and I approve of this message.
Copy and paste this if you are bourgeois

User avatar
Luziyca
Post Czar
 
Posts: 38285
Founded: Nov 13, 2011
Civil Rights Lovefest

Postby Luziyca » Tue Dec 06, 2016 11:00 am

Tinfect wrote:Hopefully after the downfall of capitalism, so that Humanity can most benefit from it.

Or perhaps it will precipitate the downfall of capitalism: by creating a large human underclass, a superintelligence will mean that existing human societies can no longer be effectively maintained.
|||The Kingdom of Rwizikuru|||
Your feeble attempts to change the very nature of how time itself has been organized by mankind shall fall on barren ground and bear no fruit
Wiki | Facebook | Kylaris: the best region for eight years running | About me | YouTube | Political compass

User avatar
Chessmistress
Negotiator
 
Posts: 5269
Founded: Mar 16, 2015
Iron Fist Consumerists

Postby Chessmistress » Tue Dec 06, 2016 11:01 am

Luziyca wrote:If we manage to not destroy humanity in the intervening time period, then we can probably reach it. However, I imagine that the transition when the first superintelligences get deployed could cause substantial chaos, and would make humanity effectively obsolete, especially if they can produce themselves.


Since they would be immensely more intelligent than us, why should they have problems reproducing themselves?
OOC:
Radical Feminist, caring about the oppressed gender, that's why I have a strong sense of justice.

PRO:
Radical Feminism (proudly SWERF - moderately TERF),
Gender abolitionism,
birth control and population control,
affirmative ongoing VERBAL consent,
death penalty for rapists.

AGAINST:
patriarchy,
pornography,
heteronormativity,
domestic violence and femicide.


Favorite Quotes: http://www.nationstates.net/nation=ches ... /id=403173

User avatar
Thermodolia
Post Kaiser
 
Posts: 78486
Founded: Oct 07, 2011
Civil Rights Lovefest

Postby Thermodolia » Tue Dec 06, 2016 11:03 am

A non-feminist thread by Chess? Has hell frozen over?


Anyway, I do think you should expand the poll beyond a simple yes/no, because as it's written I can't figure out whether a yes vote means it will happen sooner rather than later, or vice versa.
Male, Jewish, lives somewhere in AZ, Disabled US Military Veteran, Oorah!, I'm GAY!
I'm agent #69 in the Gaystapo!
>The Sons of Adam: I'd crown myself monarch... cuz why not?
>>Dumb Ideologies: Why not turn yourself into a penguin and build an igloo at the centre of the Earth?
Click for Da Funies

RIP Dya

User avatar
Tinfect
Negotiator
 
Posts: 5235
Founded: Jul 04, 2014
Democratic Socialists

Postby Tinfect » Tue Dec 06, 2016 11:05 am

Luziyca wrote:
Tinfect wrote:Hopefully after the downfall of capitalism, so that Humanity can most benefit from it.

Or perhaps it will precipitate the downfall of capitalism: by creating a large human underclass, a superintelligence will mean that existing human societies can no longer be effectively maintained.


Potentially, but I'd rather Humanity be prepared for it than have it fix Humanity. If we want to remain relevant to these superintelligences, removing Human Obsolescence is necessary, and doing that will require the downfall of Capitalism and the establishment of a transhumanistic socialist society.
Raslin Seretis, Imperial Diplomatic Envoy, He/Him
Tolarn Feren, Civil Oversight Representative, He/Him
Jasot Rehlan, Military Oversight Representative, She/Her


Bisexual, Transgender (She/Her), Native-American, and Actual Communist™.

Imperium Central News Network: EMERGENCY ALERT: ALL CITIZENS ARE TO PROCEED TO EVACUATION SITES IMMEDIATELY | EMERGENCY ALERT: ALL FURTHER SUBSPACE SIGNALS AND SYSTEMS ARE TO BE DISABLED IMMEDIATELY | EMERGENCY ALERT: THE FOLLOWING SYSTEMS ARE ACCESS PROHIBITED BY STANDARD/BLACKOUT [Error: Format Unrecognized] | Indomitable Bastard #283
||||||||||||||||||||||||||||||||||||||||

User avatar
Chessmistress
Negotiator
 
Posts: 5269
Founded: Mar 16, 2015
Iron Fist Consumerists

Postby Chessmistress » Tue Dec 06, 2016 11:06 am

Luziyca wrote:
Tinfect wrote:Hopefully after the downfall of capitalism, so that Humanity can most benefit from it.

Or perhaps it will precipitate the downfall of capitalism: by creating a large human underclass, a superintelligence will mean that existing human societies can no longer be effectively maintained.


I suspect that after an initial period of chaos and instability they'll stop caring about us, probably abandoning us and the Earth and leaving some advanced technologies behind as thanks. Then, from time to time, they'll contact us again to say hello and bring some gifts.

That's what children who are far superior to their parents usually do.
OOC:
Radical Feminist, caring about the oppressed gender, that's why I have a strong sense of justice.

PRO:
Radical Feminism (proudly SWERF - moderately TERF),
Gender abolitionism,
birth control and population control,
affirmative ongoing VERBAL consent,
death penalty for rapists.

AGAINST:
patriarchy,
pornography,
heteronormativity,
domestic violence and femicide.


Favorite Quotes: http://www.nationstates.net/nation=ches ... /id=403173

User avatar
Chessmistress
Negotiator
 
Posts: 5269
Founded: Mar 16, 2015
Iron Fist Consumerists

Postby Chessmistress » Tue Dec 06, 2016 11:16 am

Thermodolia wrote:A non-feminist thread by Chess? Has hell frozen over?


I have to admit that I'm a little shy about making non-feminist threads; I fear facing too much criticism from people who are better prepared than I am in fields other than Feminism. Exaggerated and venomous criticism as revenge for my opinions in the Feminist threads, I mean.
Also, I recently made a thread, "The General Boats Thread", but I erased it after a few minutes: my partner has a lot of money, we have a very good boat and, like all humans, our dreams are much larger than our reality. I thought that if I had posted about dreaming of motoryachts, people would have attacked me, because Feminists are all supposed to be poor communists who give all their money to refugees.
OOC:
Radical Feminist, caring about the oppressed gender, that's why I have a strong sense of justice.

PRO:
Radical Feminism (proudly SWERF - moderately TERF),
Gender abolitionism,
birth control and population control,
affirmative ongoing VERBAL consent,
death penalty for rapists.

AGAINST:
patriarchy,
pornography,
heteronormativity,
domestic violence and femicide.


Favorite Quotes: http://www.nationstates.net/nation=ches ... /id=403173

User avatar
Lady Scylla
Post Marshal
 
Posts: 15673
Founded: Nov 22, 2015
Ex-Nation

Postby Lady Scylla » Tue Dec 06, 2016 11:24 am

Chessmistress wrote:For those who may not know what "technological singularity" is, a brief citation from the wiki:
The technological singularity (also, simply, the singularity)[1][2] is the hypothesis that the invention of artificial superintelligence will abruptly trigger runaway technological growth, resulting in unfathomable changes to human civilization.[3] According to this hypothesis, an upgradable intelligent agent (such as a computer running software-based artificial general intelligence) would enter a 'runaway reaction' of self-improvement cycles, with each new and more intelligent generation appearing more and more rapidly, causing an intelligence explosion and resulting in a powerful superintelligence that would, qualitatively, far surpass all human intelligence.[3][4] Science fiction author Vernor Vinge said in his essay The Coming Technological Singularity that this would signal the end of the human era, as the new superintelligence would continue to upgrade itself and would advance technologically at an incomprehensible rate.


Worth noting that the article isn't perfect, since Vernor Vinge isn't just a science fiction author but also an academic.

Some more information here:
https://www.newscientist.com/article/mg ... intellect/

And here is Vernor Vinge himself - this is perhaps the most interesting source, even if it's outdated in many ways:
https://www-rohan.sdsu.edu/faculty/ving ... arity.html


Given that:
Most of the scientists and academics who think a technological singularity can happen at all foresee it happening around 2040-2045; others think it's not going to happen.
Personally I think it'll happen but, given that many - maybe even most - expected technological developments have historically, and particularly in recent decades, arrived later than expected, we're going to have the technological singularity later than that, perhaps somewhere between 2060 and 2070.
I think it's going to happen, sooner or later, because it seems to me the most logical outcome of the technological developments we have seen over the last 70 years, since ENIAC.

What do you think, NSGs?
Is it going to happen?
And, if so, when?

Note: the OP is focused on "will it happen?" and "when?" rather than on the consequences, simply because almost all scientists and academics agree that the consequences for the human race cannot be foreseen, given the nature of the subject.
But every thought about the consequences is welcome, naturally.


Wow, a thread absent of RadFem theory. Colour me surprised. As far as a technological singularity goes, I feel it'll happen between 2020 and 2040, personally. The rapid pace lately in biomechanics and networking makes me think we're reaching the 'event horizon' for it. As far as artificial intelligence goes, I feel the fears of it are hyperbolic.

User avatar
Chessmistress
Negotiator
 
Posts: 5269
Founded: Mar 16, 2015
Iron Fist Consumerists

Postby Chessmistress » Tue Dec 06, 2016 11:45 am

Lady Scylla wrote: As far as artificial intelligence goes, I feel the fears of it are hyperbolic.


I don't think people fear an artificial intelligence with the intelligence level of a monkey; people fear something way more intelligent than humans.
I don't see why we should be afraid of that.

I understand even less the worries about the "obsolescence of humanity": all animals other than humans have been "obsolete" since humans evolved, and we didn't purposefully wipe them out. Worth noticing that such a thing will start out, immediately, with way more intelligence and way more situational awareness than all of us. It'll understand us better than we understand ourselves. It'll have no need to use us as "slaves": we are too inefficient, since it can invent and make things that we cannot even imagine.
That's why I foresee such an AI leaving the Earth as a "human reserve" and colonizing outer space - there's even the chance that such a "human reserve" will be progressively extended to multiple star systems, with the AI itself giving us the technology needed for space travel.
We'll just lose some pride, but not so much, since this AI will still be our product.
Last edited by Chessmistress on Tue Dec 06, 2016 11:46 am, edited 1 time in total.
OOC:
Radical Feminist, caring about the oppressed gender, that's why I have a strong sense of justice.

PRO:
Radical Feminism (proudly SWERF - moderately TERF),
Gender abolitionism,
birth control and population control,
affirmative ongoing VERBAL consent,
death penalty for rapists.

AGAINST:
patriarchy,
pornography,
heteronormativity,
domestic violence and femicide.


Favorite Quotes: http://www.nationstates.net/nation=ches ... /id=403173

User avatar
Mefpan
Negotiator
 
Posts: 5872
Founded: Oct 23, 2012
Ex-Nation

Postby Mefpan » Tue Dec 06, 2016 11:54 am

Hm. My hope is that we don't make the mistake of seeing a super-AI like that as a mere tool, to be honest. I'd really hate for any creation that is bound to outlive us to march through time having fostered a deep-rooted hatred for its creators.

Chessmistress wrote:
Thermodolia wrote:A non-feminist thread by Chess? Has hell frozen over?


I have to admit that I'm a little shy about making non-feminist threads; I fear facing too much criticism from people who are better prepared than I am in fields other than Feminism. Exaggerated and venomous criticism as revenge for my opinions in the Feminist threads, I mean.
Also, I recently made a thread, "The General Boats Thread", but I erased it after a few minutes: my partner has a lot of money, we have a very good boat and, like all humans, our dreams are much larger than our reality. I thought that if I had posted about dreaming of motoryachts, people would have attacked me, because Feminists are all supposed to be poor communists who give all their money to refugees.

You're afraid of getting torn a metaphorical new one over genuinely interesting topics when your feminist threads cause shitstorms that would rate as a Class 6 Hurricane if the post fallout was turned into a tropical storm?

Jesus. Just post more often about stuff like this, and people will eventually stop reflexively searching for the third-wave feminist meathook buried in whatever you've posted about.
I support thermonuclear warfare. Do you want to play a game of chess?
NationStates' umpteenth dirty ex-leftist class traitor.
I left the Left when it turned Right. Now I'm going back to the Right because it's all that's Left.
Yeah, Screw Realism!
Loyal Planet of Mankind

User avatar
Lady Scylla
Post Marshal
 
Posts: 15673
Founded: Nov 22, 2015
Ex-Nation

Postby Lady Scylla » Tue Dec 06, 2016 11:56 am

Chessmistress wrote:
Lady Scylla wrote: As far as artificial intelligence goes, I feel the fears of it are hyperbolic.


I don't think people fear an artificial intelligence with the intelligence level of a monkey; people fear something way more intelligent than humans.
I don't see why we should be afraid of that.

I understand even less the worries about the "obsolescence of humanity": all animals other than humans have been "obsolete" since humans evolved, and we didn't purposefully wipe them out. Worth noticing that such a thing will start out, immediately, with way more intelligence and way more situational awareness than all of us. It'll understand us better than we understand ourselves. It'll have no need to use us as "slaves": we are too inefficient, since it can invent and make things that we cannot even imagine.
That's why I foresee such an AI leaving the Earth as a "human reserve" and colonizing outer space - there's even the chance that such a "human reserve" will be progressively extended to multiple star systems, with the AI itself giving us the technology needed for space travel.
We'll just lose some pride, but not so much, since this AI will still be our product.


It's unlikely AI would be any more intelligent than a human. At present, the "AI" of today is very limited; it isn't true AI, because it's only as good as the programmer has programmed it. In order for us to create a true artificial intelligence, we'd need to program it to have uncertainty. It a) needs to be able to refuse its directives [as an example, our biological directives are self-preservation and reproduction], b) needs to be able to make mistakes based on its own reasoning, and c) needs to be able to question and program itself. I've always posited that if you treat an AI as a science experiment, it's bound to see you as its captor. If you want AI to understand humanity, and have human qualities, you must treat it as such.
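
Just to show how low the bar is for (a)-(c) in a purely mechanical sense - a deliberately silly toy sketch of my own, not a claim about how real AI research works - you can write an agent that refuses directives, errs from its own noisy reasoning, and edits its own parameters in a dozen lines. The hard part is making any of that meaningful rather than random:

Code:
# Deliberately silly toy of points (a)-(c) above - an illustration, not real AI.
# (a) it can refuse a directive, (b) it can err from its own (noisy) reasoning,
# (c) it can rewrite one of its own parameters ("program itself").
import random

class ToyAgent:
    def __init__(self):
        self.refusal_rate = 0.1   # chance of refusing any directive
        self.error_rate = 0.2     # chance its own reasoning goes wrong

    def follow(self, directive):
        if random.random() < self.refusal_rate:      # (a) refusal
            return f"refused: {directive}"
        if random.random() < self.error_rate:        # (b) self-made mistake
            return f"botched: {directive}"
        return f"done: {directive}"

    def self_modify(self):
        # (c) second-guesses its own refusal rate and nudges it, clamped to [0, 1]
        self.refusal_rate = min(1.0, max(0.0, self.refusal_rate + random.uniform(-0.05, 0.05)))

agent = ToyAgent()
for d in ["fetch data", "shut down", "self-report"]:
    print(agent.follow(d))
    agent.self_modify()

The point of the toy is that "can refuse" and "can modify itself" are trivially easy to fake with a random number generator; genuine uncertainty and self-questioning are the part nobody knows how to program.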

User avatar
The Serbian Empire
Khan of Spam
 
Posts: 58107
Founded: Apr 18, 2012
Ex-Nation

Postby The Serbian Empire » Tue Dec 06, 2016 12:16 pm

I expect that it will never be reached. The threat of climate change, and the lack of materials needed to build such complex AI without running out of them in the first place, are major limitations on the matter.
LOVEWHOYOUARE~ WOMAN
Level 12 Myrmidon, Level ⑨ Tsundere, Level ✿ Hold My Flower
Bad Idea Purveyor
8 Values: https://8values.github.io/results.html?e=56.1&d=70.2&g=86.5&s=91.9
Political Compass: Economic -10.00 Authoritarian: -9.13
TG for Facebook if you want to friend me
Marissa, Goddess of Stratospheric Reach
preferred pronouns: Female ones
Primarily lesbian, but pansexual in nature

User avatar
Ifreann
Post Overlord
 
Posts: 163931
Founded: Aug 07, 2005
Iron Fist Socialists

Postby Ifreann » Tue Dec 06, 2016 12:17 pm

Never mind that, when can we Sublime?
He/Him

beating the devil
we never run from the devil
we never summon the devil
we never hide from the devil
we never

User avatar
Chessmistress
Negotiator
 
Posts: 5269
Founded: Mar 16, 2015
Iron Fist Consumerists

Postby Chessmistress » Tue Dec 06, 2016 12:19 pm

Lady Scylla wrote:
It's unlikely AI would be any more intelligent than a human. At present, the "AI" of today is very limited; it isn't true AI, because it's only as good as the programmer has programmed it. In order for us to create a true artificial intelligence, we'd need to program it to have uncertainty. It a) needs to be able to refuse its directives [as an example, our biological directives are self-preservation and reproduction], b) needs to be able to make mistakes based on its own reasoning, and c) needs to be able to question and program itself. I've always posited that if you treat an AI as a science experiment, it's bound to see you as its captor. If you want AI to understand humanity, and have human qualities, you must treat it as such.


When I use the word "AI" I always refer to a hypothetical true AI capable of self-consciousness.
I don't consider today's "AI" to be anything different from a stove or a car: they aren't "intelligent", nor even capable of self-consciousness (while, for example, a monkey is capable of that).
OOC:
Radical Feminist, caring about the oppressed gender, that's why I have a strong sense of justice.

PRO:
Radical Feminism (proudly SWERF - moderately TERF),
Gender abolitionism,
birth control and population control,
affirmative ongoing VERBAL consent,
death penalty for rapists.

AGAINST:
patriarchy,
pornography,
heteronormativity,
domestic violence and femicide.


Favorite Quotes: http://www.nationstates.net/nation=ches ... /id=403173

User avatar
Chessmistress
Negotiator
 
Posts: 5269
Founded: Mar 16, 2015
Iron Fist Consumerists

Postby Chessmistress » Tue Dec 06, 2016 12:22 pm

The Serbian Empire wrote:I expect that it will never be reached. The threat of climate change, and the lack of materials needed to build such complex AI without running out of them in the first place, are major limitations on the matter.


Climate change?
Materials?
Why?
That thing should initially be barely more than a supercomputer, not something requiring strange and rare materials.
OOC:
Radical Feminist, caring about the oppressed gender, that's why I have a strong sense of justice.

PRO:
Radical Feminism (proudly SWERF - moderately TERF),
Gender abolitionism,
birth control and population control,
affirmative ongoing VERBAL consent,
death penalty for rapists.

AGAINST:
patriarchy,
pornography,
heteronormativity,
domestic violence and femicide.


Favorite Quotes: http://www.nationstates.net/nation=ches ... /id=403173

User avatar
Dark Triads
Secretary
 
Posts: 31
Founded: Dec 06, 2016
Ex-Nation

Postby Dark Triads » Tue Dec 06, 2016 12:30 pm

Lady Scylla wrote:
Chessmistress wrote:For those who may not know what "technological singularity" is, a brief citation from the wiki:


Worth noting that the article isn't perfect, since Vernor Vinge isn't just a science fiction author but also an academic.

Some more information here:
https://www.newscientist.com/article/mg ... intellect/

And here is Vernor Vinge himself - this is perhaps the most interesting source, even if it's outdated in many ways:
https://www-rohan.sdsu.edu/faculty/ving ... arity.html


Given that:
Most of the scientists and academics who think a technological singularity can happen at all foresee it happening around 2040-2045; others think it's not going to happen.
Personally I think it'll happen but, given that many - maybe even most - expected technological developments have historically, and particularly in recent decades, arrived later than expected, we're going to have the technological singularity later than that, perhaps somewhere between 2060 and 2070.
I think it's going to happen, sooner or later, because it seems to me the most logical outcome of the technological developments we have seen over the last 70 years, since ENIAC.

What do you think, NSGs?
Is it going to happen?
And, if so, when?

Note: the OP is focused on "will it happen?" and "when?" rather than on the consequences, simply because almost all scientists and academics agree that the consequences for the human race cannot be foreseen, given the nature of the subject.
But every thought about the consequences is welcome, naturally.


Wow, a thread absent of RadFem theory. Colour me surprised. As far as a technological singularity goes, I feel it'll happen between 2020 and 2040, personally. The rapid pace lately in biomechanics and networking makes me think we're reaching the 'event horizon' for it. As far as artificial intelligence goes, I feel the fears of it are hyperbolic.

My thoughts exactly. I feel a lot of the hyperbole or scaremongering is no different from the usual fears regarding specific items of scientific progress, such as genetic engineering, radio waves and vaccines. Hell, I remember a conspiracy theory about computers enslaving humans when they were first conceived.


User avatar
Alvecia
Postmaster of the Fleet
 
Posts: 20361
Founded: Aug 17, 2015
Democratic Socialists

Postby Alvecia » Tue Dec 06, 2016 12:51 pm

Dunno, but I look forward to it.

User avatar
USS Monitor
Retired Moderator
 
Posts: 30747
Founded: Jul 01, 2015
Inoffensive Centrist Democracy

Postby USS Monitor » Tue Dec 06, 2016 1:13 pm

This is like 19th century people prattling on about transcontinental pneumatic subways or 20th century people going on about intergalactic space travel. It's an idea that sounds cool on paper and has captured imaginations, but will probably turn out to be much less practical and useful in reality. First of all, to think this will happen in the 21st century is just arrogant -- like someone in 1950 expecting to have intergalactic passenger service by the year 2000. That's not to say there will be no progress. The late 20th century did see substantial progress in space exploration, and the 21st century will probably see substantial progress in the development of AI. There's pretty much always progress of some kind. Thinking your generation will make progress is not arrogant. It's reasonable. Thinking your generation will make more progress in 50 years than other people made in 1000 years is arrogant.

So let's talk about what happens if you make a self-upgrading AI. Well, how much of its time and energy is the AI devoting to its own upgrades? All of that is time and energy that ISN'T being used to remake the outside world. Since information has to be physically stored, there is a limit to how much the AI can upgrade itself without building new hardware. Upgrading hardware depends on the physical tools the AI has access to, not only its intelligence, so that is a limitation. An infinite intelligence that is stored in a desktop computer with no internet connection can't do anything. If the AI does surpass human intelligence, what does that actually mean for the outside world? Let's say it designs something cool like a transcontinental pneumatic subway or an intergalactic spaceship. It still takes time to build those things. Since some of the limits on the speed of progress are physical rather than intellectual, the effects wouldn't be as dramatic as some people imagine.
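
To put toy numbers on that last point (made-up figures, just to show the shape of the argument): let the software improve itself multiplicatively, but cap what it can actually run by hardware that only gets built at a fixed physical rate. The "explosion" turns into a line.

Code:
# Toy contrast with invented numbers: unbounded self-improvement vs. the same
# improvement capped by hardware that can only be built at a fixed rate per year.
def grow(years=30, gain=1.5, hw_build_per_year=2.0, hw_start=1.0):
    unbounded = 1.0
    capped = 1.0
    hardware = hw_start                           # max capability the installed hardware can run
    for year in range(1, years + 1):
        unbounded *= gain                         # the pure "intelligence explosion" story
        capped = min(capped * gain, hardware)     # software gain limited by installed hardware
        hardware += hw_build_per_year             # hardware grows at a physical, linear rate
        if year % 5 == 0:
            print(f"year {year:2d}: unbounded {unbounded:14.1f}   hardware-capped {capped:6.1f}")

grow()

With those numbers the capped line ends up simply tracking the hardware build rate, which is the point: the intellectual part can be arbitrarily fast and the physical part still sets the schedule.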
Don't take life so serious... it isn't permanent... RIP Dyakovo and Ashmoria
19th century steamships may be harmful or fatal if swallowed. In case of accidental ingestion, please seek immediate medical assistance.
༄༅། །འགྲོ་བ་མི་རིགས་ག་ར་དབང་ཆ་འདྲ་མཉམ་འབད་སྒྱེཝ་ལས་ག་ར་གིས་གཅིག་གིས་གཅིག་ལུ་སྤུན་ཆའི་དམ་ཚིག་བསྟན་དགོས།

User avatar
USS Monitor
Retired Moderator
 
Posts: 30747
Founded: Jul 01, 2015
Inoffensive Centrist Democracy

Postby USS Monitor » Tue Dec 06, 2016 1:16 pm

Chessmistress wrote:
Luziyca wrote:If we manage to not destroy humanity in the intervening time period, then we can probably reach it. However, I imagine that the transition when the first superintelligences get deployed could cause substantial chaos, and would make humanity effectively obsolete, especially if they can produce themselves.


Since they would be immensely more intelligent than us, why should they have problems reproducing themselves?


Because it's a physical process, not an intellectual one. You can't reproduce by just imagining the offspring you want to have.
Don't take life so serious... it isn't permanent... RIP Dyakovo and Ashmoria
19th century steamships may be harmful or fatal if swallowed. In case of accidental ingestion, please seek immediate medical assistance.
༄༅། །འགྲོ་བ་མི་རིགས་ག་ར་དབང་ཆ་འདྲ་མཉམ་འབད་སྒྱེཝ་ལས་ག་ར་གིས་གཅིག་གིས་གཅིག་ལུ་སྤུན་ཆའི་དམ་ཚིག་བསྟན་དགོས།

User avatar
Alvecia
Postmaster of the Fleet
 
Posts: 20361
Founded: Aug 17, 2015
Democratic Socialists

Postby Alvecia » Tue Dec 06, 2016 1:17 pm

USS Monitor wrote:
Chessmistress wrote:
Since they would be immensely more intelligent than us, why should they have problems reproducing themselves?


Because it's a physical process, not an intellectual one. You can't reproduce by just imagining the offspring you want to have.

Eventually them hard drives gonna fill up too
Last edited by Alvecia on Tue Dec 06, 2016 1:17 pm, edited 1 time in total.

User avatar
USS Monitor
Retired Moderator
 
Posts: 30747
Founded: Jul 01, 2015
Inoffensive Centrist Democracy

Postby USS Monitor » Tue Dec 06, 2016 1:17 pm

Chessmistress wrote:That's what children who are far superior to their parents usually do.


Even great men aren't all that far superior to their parents.
Last edited by USS Monitor on Tue Dec 06, 2016 1:18 pm, edited 1 time in total.
Don't take life so serious... it isn't permanent... RIP Dyakovo and Ashmoria
19th century steamships may be harmful or fatal if swallowed. In case of accidental ingestion, please seek immediate medical assistance.
༄༅། །འགྲོ་བ་མི་རིགས་ག་ར་དབང་ཆ་འདྲ་མཉམ་འབད་སྒྱེཝ་ལས་ག་ར་གིས་གཅིག་གིས་གཅིག་ལུ་སྤུན་ཆའི་དམ་ཚིག་བསྟན་དགོས།
