Technological Singularity - will it happen? When?

For discussion and debate about anything. (Not a roleplay related forum; out-of-character commentary only.)

Is the Technological Singularity going to happen, sooner or later?

Yes
34
58%
No
25
42%
 
Total votes : 59

User avatar
Kazarogkai
Powerbroker
 
Posts: 8071
Founded: Jan 27, 2012
Moralistic Democracy

Postby Kazarogkai » Tue Dec 06, 2016 11:39 pm

From what I remember, Moore's law is actually slowing down; we are reaching the end of silicon chip technology, and we're going to have to move beyond it to get any further. The problem is that tech like that is driven more or less by need, and since silicon chips are good enough for roughly 99.99% of all foreseeable applications, the need and ultimately the desire won't really be there. Mix that in with the human tendency to stagnate and you've got problems. If anything like that does occur, I doubt it will happen any earlier than maybe the 22nd century at best. Thank goodness, since I will be dead; I hate robots. That's not taking into account humanity's current raping of the Earth and the inevitable bite-back (environmental collapse), which is already in progress and is effectively irreversible. Apocalypses have a tendency to set back tech and society in general.

Just my thoughts on the matter.
Centrist
Reactionary
Bigot
Conservationist
Communitarian
Georgist
Distributist
Corporatist
Nationalist
Teetotaler
Ancient weaponry
Politics
History in general
books
military
Fighting
Survivalism
Nature
Anthropology
hippys
drugs
criminals
liberals
philosophes(not counting Hobbes)
states rights
anarchist
people who annoy me
robots
1000 12 + 10
1100 18 + 15
1200 24 + 20
1300 24
1400 36 + 10
1500 54 + 20
1600 72 + 30
1700 108 + 40
1800 144 + 50
1900 288 + 60
2000 576 + 80

User avatar
Salandriagado
Postmaster of the Fleet
 
Posts: 22831
Founded: Apr 03, 2008
Ex-Nation

Postby Salandriagado » Wed Dec 07, 2016 5:32 am

Olivaero wrote:I think that it rather depends on how quickly either quantum computers emerge or how far the IoS develops. We are coming to the end of Moore's law, so we have two avenues to go down, really: a new type of computer which can burst through the barriers imposed by silicon and standard binary computing, or distributed computing over a huge range of devices, each feeding information back to the whole, which would engage in deep learning. I think the first one is a more likely path to the singularity than the second, as even with deep learning I don't think the paradigm is currently right for a strong AI to emerge.

If the rate at which we develop breakthroughs in QC stays at the same pace or even accelerates (fingers crossed the law of accelerating returns actually holds), we could see large-scale quantum computers within our lifetimes, I'd say, with the ability to develop strong AI on them. It's not a solid prediction; there are many variables, but I think it's reasonably likely.


That's not how QC works. QC is ridiculously slow for most jobs, just happens to be very fast for a handful of very specific jobs. There's no relationship between Strong AI and QC.
Cosara wrote:
Anachronous Rex wrote:Good thing a majority of people aren't so small-minded, and frightened of others' sexuality.

Over 40% (including me) are, so I fixed the post for accuracy.

Vilatania wrote:
Salandriagado wrote:
Notice that the link is to the notes from a university course on probability. You clearly have nothing beyond the most absurdly simplistic understanding of the subject.
By choosing 1, you no longer have 0 probability of choosing 1. End of subject.

(read up the quote stack)

Deal. £3000 do?

Of course.

User avatar
Uiiop
Powerbroker
 
Posts: 8185
Founded: Jun 20, 2012
Scandinavian Liberal Paradise

Postby Uiiop » Wed Dec 07, 2016 5:36 am

Oh boy, another transhumanism thread.

Something's gonna happen, but nothing to be overly hyped about until I'm almost a dead man, at the least.
Technology and life change each other all the time. Just because we could fly in the 1900s doesn't mean we can easily go to space now.
#NSTransparency

User avatar
The Serbian Empire
Khan of Spam
 
Posts: 58107
Founded: Apr 18, 2012
Ex-Nation

Postby The Serbian Empire » Wed Dec 07, 2016 5:40 am

Chessmistress wrote:
The Serbian Empire wrote:I expect that it will never be reached. The threat of climate change, and the lack of materials necessary to make such complex AI (without running out of them in the first place), are major limitations on the matter.


Climate change?
Materials?
Why?
That thing should initially be not much more than a supercomputer, not something requiring strange and rare materials.

Humanity has far more important things to protect and problems to solve than developing a supercomputer that reaches HAL 3000 levels.
LOVEWHOYOUARE~ WOMAN
Level 12 Myrmidon, Level ⑨ Tsundere, Level ✿ Hold My Flower
Bad Idea Purveyor
8 Values: https://8values.github.io/results.html?e=56.1&d=70.2&g=86.5&s=91.9
Political Compass: Economic -10.00 Authoritarian: -9.13
TG for Facebook if you want to friend me
Marissa, Goddess of Stratospheric Reach
preferred pronouns: Female ones
Primarily lesbian, but pansexual in nature

User avatar
Olivaero
Powerbroker
 
Posts: 8012
Founded: Jun 17, 2011
Ex-Nation

Postby Olivaero » Wed Dec 07, 2016 5:43 am

Salandriagado wrote:
Olivaero wrote:I think that it rather depends on how quickly either quantum computers emerge or how far the IoS develops. We are coming to the end of Moore's law, so we have two avenues to go down, really: a new type of computer which can burst through the barriers imposed by silicon and standard binary computing, or distributed computing over a huge range of devices, each feeding information back to the whole, which would engage in deep learning. I think the first one is a more likely path to the singularity than the second, as even with deep learning I don't think the paradigm is currently right for a strong AI to emerge.

If the rate at which we develop breakthroughs in QC stays at the same pace or even accelerates (fingers crossed the law of accelerating returns actually holds), we could see large-scale quantum computers within our lifetimes, I'd say, with the ability to develop strong AI on them. It's not a solid prediction; there are many variables, but I think it's reasonably likely.


That's not how QC works. QC is ridiculously slow for most jobs, just happens to be very fast for a handful of very specific jobs. There's no relationship between Strong AI and QC.

What "jobs" is QC slow at?
British, Anglo Celtic, English, Northerner.

Transhumanist, Left Hegelian, Marxist, Communist.

Agnostic Theist, Culturally Christian.

User avatar
Salandriagado
Postmaster of the Fleet
 
Posts: 22831
Founded: Apr 03, 2008
Ex-Nation

Postby Salandriagado » Wed Dec 07, 2016 5:47 am

Olivaero wrote:
Salandriagado wrote:
That's not how QC works. QC is ridiculously slow for most jobs, just happens to be very fast for a handful of very specific jobs. There's no relationship between Strong AI and QC.

What "jobs" is QC slow at?


Right now: absolutely all of them. In future: most of the ones that normal computers are slow at. QC allows for shortcuts in solving a handful of very significant problems, but is in no way a general improvement in all tasks.
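As a rough illustration of that gap (a minimal Python sketch with illustrative sizes, not a benchmark of any real machine): unstructured search is one of the specific jobs with a known quantum shortcut. Classically you need on the order of N lookups; Grover's algorithm needs on the order of sqrt(N) queries. Jobs without such an algorithm get no comparable speedup.

import math

# Query counts for unstructured search over N items (order of magnitude only).
def classical_queries(n):
    # A classical machine may have to check every item in the worst case.
    return n

def grover_queries(n):
    # Grover's algorithm needs roughly (pi/4) * sqrt(n) quantum queries.
    return math.ceil((math.pi / 4) * math.sqrt(n))

for n in (10**6, 10**9, 10**12):
    print(f"N={n:>15,}  classical ~{classical_queries(n):>15,}  Grover ~{grover_queries(n):>10,}")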
Cosara wrote:
Anachronous Rex wrote:Good thing a majority of people aren't so small-minded, and frightened of others' sexuality.

Over 40% (including me) are, so I fixed the post for accuracy.

Vilatania wrote:
Salandriagado wrote:
Notice that the link is to the notes from a university course on probability. You clearly have nothing beyond the most absurdly simplistic understanding of the subject.
By choosing 1, you no longer have 0 probability of choosing 1. End of subject.

(read up the quote stack)

Deal. £3000 do?

Of course.

User avatar
Uiiop
Powerbroker
 
Posts: 8185
Founded: Jun 20, 2012
Scandinavian Liberal Paradise

Postby Uiiop » Wed Dec 07, 2016 5:55 am

One of the problems with singularitarianism is that it assumes there's only one inevitable way that humanity will change, and it picks something that isn't even close to causing anything like what they expect.
Bioengineering seems at least slightly more plausible as a cause of such a shift, IMHO, than AI.
#NSTransparency

User avatar
Olivaero
Powerbroker
 
Posts: 8012
Founded: Jun 17, 2011
Ex-Nation

Postby Olivaero » Wed Dec 07, 2016 5:59 am

Salandriagado wrote:
Olivaero wrote:What "jobs" is QC slow at?


Right now: absolutely all of them. In future: most of the ones that normal computers are slow at. QC allows for shortcuts in solving a handful of very significant problems, but is in no way a general improvement in all tasks.

But it also has the potential to be smaller yes?
British, Anglo Celtic, English, Northerner.

Transhumanist, Left Hegelian, Marxist, Communist.

Agnostic Theist, Culturally Christian.

User avatar
Salandriagado
Postmaster of the Fleet
 
Posts: 22831
Founded: Apr 03, 2008
Ex-Nation

Postby Salandriagado » Wed Dec 07, 2016 6:06 am

Olivaero wrote:
Salandriagado wrote:
Right now: absolutely all of them. In future: most of the ones that normal computers are slow at. QC allows for shortcuts in solving a handful of very significant problems, but is in no way a general improvement in all tasks.

But it also has the potential to be smaller yes?


No. Probably much larger. A fundamental problem with quantum computing is that the more qubits you put near each other, the more likely they are to fail.
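A minimal sketch of the scaling problem (illustrative error rate, and it ignores the crosstalk issue above, so it understates things): even if each physical qubit independently has some chance p of erroring during a step, the chance that at least one qubit in an n-qubit register errors is 1 - (1 - p)^n, which climbs quickly as n grows. That is the pressure towards heavy error-correction overhead and physically bigger machines.

# Probability that at least one of n qubits errors in a single step,
# assuming an independent per-qubit error rate p (illustrative value only).
def prob_any_error(n_qubits, p_per_qubit=1e-3):
    return 1 - (1 - p_per_qubit) ** n_qubits

for n in (10, 100, 1_000, 10_000):
    print(f"{n:>6} qubits -> P(any error per step) ~ {prob_any_error(n):.3f}")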
Cosara wrote:
Anachronous Rex wrote:Good thing a majority of people aren't so small-minded, and frightened of others' sexuality.

Over 40% (including me) are, so I fixed the post for accuracy.

Vilatania wrote:
Salandriagado wrote:
Notice that the link is to the notes from a university course on probability. You clearly have nothing beyond the most absurdly simplistic understanding of the subject.
By choosing 1, you no longer have 0 probability of choosing 1. End of subject.

(read up the quote stack)

Deal. £3000 do?

Of course.

User avatar
Olivaero
Powerbroker
 
Posts: 8012
Founded: Jun 17, 2011
Ex-Nation

Postby Olivaero » Wed Dec 07, 2016 6:09 am

Salandriagado wrote:
Olivaero wrote:But it also has the potential to be smaller yes?


No. Probably much larger. A fundamental problem with quantum computing is that the more qubits you put near each other, the more likely they are to fail.

Right, there was my misunderstanding then.
British, Anglo Celtic, English, Northerner.

Transhumanist, Left Hegelian, Marxist, Communist.

Agnostic Theist, Culturally Christian.

User avatar
Great Nepal
Postmaster of the Fleet
 
Posts: 28677
Founded: Jan 11, 2010
Ex-Nation

Postby Great Nepal » Wed Dec 07, 2016 6:51 am

I think it's rather inevitable, assuming we don't end up in a cataclysmic disaster where humanity's technological progress is significantly and permanently damaged; Moore's law seems to be slowing down, but we also have ongoing research in the field, and problems can be bypassed using different materials, etc.



Nova Harmonia wrote:If the singularity occurs and we control or cooperate with the AIs, the super-intelligent computer/AI should be able to design and create cool cyborg bodies for all us humans so that we can transplant our brains into them and live effectively forever.

And as well as super-improved lifespans and health, a body improved by cybernetics may have a massively increased and improved intelligence and emotional capacities, and we would be able to customise physical appearances to a much greater extent.

It seems like a bit of fantasy now, but surely if the singularity occurs there would be no reason why immortality and a huge boost in human intelligence wouldn't occur just afterwards?
Because surely asking a post-singularity AI to create a system where we can all be immortal through cybernetics would be tediously simple for such an advanced machine?

Idk, maybe i'm thinking too much into this.

Hmmm, I'm not sure about cyborg bodies; they still limit growth and are ultimately rather meh, given that they still have physical limitations. Much better to establish some sort of mind-upload system.

USS Monitor wrote:How is it going to get up to that IQ without upgrading its hardware? Upgrading hardware is also a physical process.

Does it need to upgrade its hardware? We have a pretty nice distributed system which will eventually contain hundreds of trillions of computers, including some rather powerful machines. I imagine a sufficiently intelligent AI would be able to optimize its own operations and the operations of other programs, as well as bypass safety protocols and thus effectively hide its operation - or alternatively market itself as a program which massively improves the performance of your device: say it registers a company, contacts media for advertising campaigns and provides an awesome service for, say, ten pounds. Eventually, as the success of the new software becomes clearer, even those parts of industry that were originally guarded will give in - either by themselves, or because management doesn't want to spend millions trying to replicate something when a tenner does the same thing. Once sufficiently embedded into our computing infrastructure, it can inconspicuously work on improving the robustness of the internet.
Now it has access to the best hardware on the planet, and has embedded itself into our lives so deeply that you cannot possibly consider eliminating it without entirely decimating the global economy.
Last edited by Great Nepal on Sun Nov 29, 1995 7:02 am, edited 1 time in total.


User avatar
Ifreann
Post Overlord
 
Posts: 163933
Founded: Aug 07, 2005
Iron Fist Socialists

Postby Ifreann » Wed Dec 07, 2016 7:12 am

USS Monitor wrote:
Nova Harmonia wrote:If the singularity occurs and we control or cooperate with the AIs, the super-intelligent computer/AI should be able to design and create cool cyborg bodies for all us humans so that we can transplant our brains into them and live effectively forever.

And as well as super-improved lifespans and health, a body improved by cybernetics may have a massively increased and improved intelligence and emotional capacities, and we would be able to customise physical appearances to a much greater extent.

It seems like a bit of fantasy now, but surely if the singularity occurs there would be no reason why immortality and a huge boost in human intelligence wouldn't occur just afterwards?
Because surely asking a post-singularity AI to create a system where we can all be immortal through cybernetics would be tediously simple for such an advanced machine?

Idk, maybe i'm thinking too much into this.


Not really. I think a future populated by cyborgs is more realistic than one where humans become obsolete and are replaced by machines.

Soon all people shall be upgraded to include rotating cannon turrets.
He/Him

beating the devil
we never run from the devil
we never summon the devil
we never hide from from the devil
we never

User avatar
Chessmistress
Negotiator
 
Posts: 5269
Founded: Mar 16, 2015
Iron Fist Consumerists

Postby Chessmistress » Wed Dec 07, 2016 8:34 am

Uiiop wrote:Oh boy, another transhumanism thread.

Something's gonna happen, but nothing to be overly hyped about until I'm almost a dead man, at the least.
Technology and life change each other all the time. Just because we could fly in the 1900s doesn't mean we can easily go to space now.


Transhumanism is a different thing: transhumanism is about the improvement of humans; the technological singularity is about artificial intelligence, not humans.
OOC:
Radical Feminist, caring about the oppressed gender, that's why I have a strong sense of justice.

PRO:
Radical Feminism (proudly SWERF - moderately TERF),
Gender abolitionism,
birth control and population control,
affirmative ongoing VERBAL consent,
death penalty for rapists.

AGAINST:
patriarchy,
pornography,
heteronormativity,
domestic violence and femicide.


Favorite Quotes: http://www.nationstates.net/nation=ches ... /id=403173

User avatar
Frank Zipper
Senator
 
Posts: 4207
Founded: Nov 16, 2015
Ex-Nation

Postby Frank Zipper » Wed Dec 07, 2016 11:53 am

So how do people think we will first know the singularity has happened? I assume we would first notice stock market movements.
Put this in your signature if you are easily led.

User avatar
Lady Scylla
Post Marshal
 
Posts: 15673
Founded: Nov 22, 2015
Ex-Nation

Postby Lady Scylla » Wed Dec 07, 2016 11:55 am

Frank Zipper wrote:So how do people think we will first know the singularity has happened? I assume we would first notice stock market movements.


I feel it won't be confirmed until it's already happened.

User avatar
Trotskylvania
Post Marshal
 
Posts: 17217
Founded: Jul 07, 2006
Ex-Nation

Postby Trotskylvania » Wed Dec 07, 2016 12:01 pm

Frank Zipper wrote:So how do people think we will first know the singularity has happened? I assume we would first notice stock market movements.

When the Machine Messiah proclaims itself
Your Friendly Neighborhood Ultra - The Left Wing of the Impossible
Putting the '-sadism' in Posadism


"The hell of capitalism is the firm, not the fact that the firm has a boss."- Bordiga

User avatar
Ifreann
Post Overlord
 
Posts: 163933
Founded: Aug 07, 2005
Iron Fist Socialists

Postby Ifreann » Wed Dec 07, 2016 12:09 pm

Trotskylvania wrote:
Frank Zipper wrote:So how do people think we will first know the singularity has happened? I assume we would first notice stock market movements.

When the Machine Messiah proclaims itself

Probably by picking a fight with the Avengers.
He/Him

beating the devil
we never run from the devil
we never summon the devil
we never hide from from the devil
we never

User avatar
Uiiop
Powerbroker
 
Posts: 8185
Founded: Jun 20, 2012
Scandinavian Liberal Paradise

Postby Uiiop » Wed Dec 07, 2016 12:16 pm

Chessmistress wrote:
Uiiop wrote:Oh boy, another transhumanism thread.

Something's gonna happen, but nothing to be overly hyped about until I'm almost a dead man, at the least.
Technology and life change each other all the time. Just because we could fly in the 1900s doesn't mean we can easily go to space now.


Transhumanism is a different thing: transhumanism is about the improvement of humans; the technological singularity is about artificial intelligence, not humans.

But the whole reason it's called a Singularity is the magnitude and uncertainty of the event's impact on humanity.
The two are related, at least by the people hyping them.
#NSTransparency

User avatar
Ostroeuropa
Khan of Spam
 
Posts: 58536
Founded: Jun 14, 2006
Inoffensive Centrist Democracy

Postby Ostroeuropa » Wed Dec 07, 2016 12:24 pm

It'll happen unless something goes wrong. I'm a transhumanist, so I'm in favor of it.
Ostro.MOV

There is an out of control trolley speeding towards Jeremy Bentham, who is tied to the track. You can pull the lever to cause the trolley to switch tracks, but on the other track is Immanuel Kant. Bentham is clutching the only copy in the universe of The Critique of Pure Reason. Kant is clutching the only copy in the universe of The Principles of Moral Legislation. Both men are shouting at you that they have recently started to reconsider their ethical stances.

User avatar
Great Nepal
Postmaster of the Fleet
 
Posts: 28677
Founded: Jan 11, 2010
Ex-Nation

Postby Great Nepal » Wed Dec 07, 2016 12:26 pm

Frank Zipper wrote:So how do people think we will first know the singularity has happened? I assume we would first notice stock market movements.

I don't think we will really know at the time; we'll likely notice a greater pace of technological advancement, but I doubt we'll call it a "singularity". We'll likely look back and say, "ohh, that was when the technological singularity happened".
Last edited by Great Nepal on Sun Nov 29, 1995 7:02 am, edited 1 time in total.


User avatar
Frank Zipper
Senator
 
Posts: 4207
Founded: Nov 16, 2015
Ex-Nation

Postby Frank Zipper » Wed Dec 07, 2016 12:31 pm

High-level hacking attacks increasing would also seem to me a likely indicator.
Put this in your signature if you are easily led.

User avatar
USS Monitor
Retired Moderator
 
Posts: 30747
Founded: Jul 01, 2015
Inoffensive Centrist Democracy

Postby USS Monitor » Wed Dec 07, 2016 1:15 pm

Great Nepal wrote:
USS Monitor wrote:How is it going to get up to that IQ without upgrading its hardware? Upgrading hardware is also a physical process.

Does it need to upgrade its hardware?


Yes, it does. This should be really fucking obvious.

We have a pretty nice distributed system which will eventually contain hundreds of trillions of computers, including some rather powerful machines. I imagine a sufficiently intelligent AI would be able to optimize its own operations and the operations of other programs, as well as bypass safety protocols and thus effectively hide its operation - or alternatively market itself as a program which massively improves the performance of your device: say it registers a company, contacts media for advertising campaigns and provides an awesome service for, say, ten pounds. Eventually, as the success of the new software becomes clearer, even those parts of industry that were originally guarded will give in - either by themselves, or because management doesn't want to spend millions trying to replicate something when a tenner does the same thing. Once sufficiently embedded into our computing infrastructure, it can inconspicuously work on improving the robustness of the internet.
Now it has access to the best hardware on the planet, and has embedded itself into our lives so deeply that you cannot possibly consider eliminating it without entirely decimating the global economy.


1. What you described is a method of the AI upgrading its hardware.

2. This would take a great deal of time, and humans would be able to see the process as it was occurring. That's not technological singularity because it's still progress of a sort that is comprehensible to humans.

3. For an AI to reach a level of intelligence that it would be capable of doing what you describe, it'd probably have to upgrade its hardware by other means to even get to that level.

4. There is still a finite amount of data that can be stored on existing hardware, and the time it takes to transmit data between scattered devices would slow the AI down compared to a more compact and technologically advanced system that could be built (i.e. a centralized one not dependent on silicon chips). "The best hardware on the planet" is still pretty shit compared to the best that is physically possible, and I don't think current hardware is sufficient for a technological singularity. Short of installing wormholes, improving internet connectivity wouldn't eliminate the effects of distance if the AI is using hardware scattered around the planet. It still would slow things down, at least a little. Computers are not magic. They are physical objects, and that means they are limited by the laws of physics, so they are affected by things like distances and the properties of the materials they are built from.
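To put rough numbers on the distance point (assumed constants, not measurements): signals in optical fibre travel at roughly two thirds of the speed of light, so hardware scattered across the planet pays a minimum round-trip latency that a compact, centralized machine simply doesn't. A minimal Python sketch:

# Minimum round-trip times imposed by distance alone. Speed of light in fibre
# is taken as ~200,000 km/s; these are rough assumptions, not measurements.
C_FIBRE_KM_PER_S = 200_000

def round_trip_ms(distance_km, speed_km_s=C_FIBRE_KM_PER_S):
    return 2 * distance_km / speed_km_s * 1000

print(f"Halfway around the planet (~20,000 km): ~{round_trip_ms(20_000):.0f} ms per round trip")
print(f"Across a 10 cm board (0.0001 km): ~{round_trip_ms(0.0001) * 1e6:.0f} ns per round trip")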
Don't take life so serious... it isn't permanent... RIP Dyakovo and Ashmoria
19th century steamships may be harmful or fatal if swallowed. In case of accidental ingestion, please seek immediate medical assistance.
༄༅། །འགྲོ་བ་མི་རིགས་ག་ར་དབང་ཆ་འདྲ་མཉམ་འབད་སྒྱེཝ་ལས་ག་ར་གིས་གཅིག་གིས་གཅིག་ལུ་སྤུན་ཆའི་དམ་ཚིག་བསྟན་དགོས།

User avatar
USS Monitor
Retired Moderator
 
Posts: 30747
Founded: Jul 01, 2015
Inoffensive Centrist Democracy

Postby USS Monitor » Wed Dec 07, 2016 1:16 pm

Frank Zipper wrote:So how do people think we will first know the singularity has happened? I assume we would first notice stock market movements.


If you have to ask whether it's happening, it isn't.
Don't take life so serious... it isn't permanent... RIP Dyakovo and Ashmoria
19th century steamships may be harmful or fatal if swallowed. In case of accidental ingestion, please seek immediate medical assistance.
༄༅། །འགྲོ་བ་མི་རིགས་ག་ར་དབང་ཆ་འདྲ་མཉམ་འབད་སྒྱེཝ་ལས་ག་ར་གིས་གཅིག་གིས་གཅིག་ལུ་སྤུན་ཆའི་དམ་ཚིག་བསྟན་དགོས།

User avatar
Immoren
Khan of Spam
 
Posts: 65560
Founded: Mar 20, 2010
Democratic Socialists

Postby Immoren » Wed Dec 07, 2016 2:09 pm

Trotskylvania wrote:
Frank Zipper wrote:So how do people think we will first know the singularity has happened? I assume we would first notice stock market movements.

When the Machine Messiah proclaims itself


Time to get that Mars colony going.
IC Flag Is a Pope Principia
discoursedrome wrote:everyone knows that quote, "I know not what weapons World War Three will be fought, but World War Four will be fought with sticks and stones," but in a way it's optimistic and inspiring because it suggests that even after destroying civilization and returning to the stone age we'll still be sufficiently globalized and bellicose to have another world war right then and there

User avatar
Chessmistress
Negotiator
 
Posts: 5269
Founded: Mar 16, 2015
Iron Fist Consumerists

Postby Chessmistress » Wed Dec 07, 2016 3:42 pm

Ostroeuropa wrote:It'll happen unless something goes wrong. I'm a transhumanist, so I'm in favor of it.


I still don't understand why people insist on lumping the technological singularity and transhumanism together.
There isn't such a huge correlation between transhumanism and the technological singularity.
OOC:
Radical Feminist, caring about the oppressed gender, that's why I have a strong sense of justice.

PRO:
Radical Feminism (proudly SWERF - moderately TERF),
Gender abolitionism,
birth control and population control,
affirmative ongoing VERBAL consent,
death penalty for rapists.

AGAINST:
patriarchy,
pornography,
heteronormativity,
domestic violence and femicide.


Favorite Quotes: http://www.nationstates.net/nation=ches ... /id=403173
