Disgruntled militia wrote: I love simple solutions.
As do I.
by Extra Sauce » Sat Apr 16, 2011 11:02 pm
Milograd wrote:Extra Sauce wrote:It's not a good reason. Fear of the future's potential doesn't justify trying to get in its way. People should be vigilant about how things are used instead of preventing their development.
So you are saying technology and science are nothing to be afraid of? Have you ever seen the effects of human technology throughout the ages? Dear lord, there used to be a time when guns and tanks and intercontinental ballistic missiles didn't exist. You can't possibly think that technology is nothing to be afraid of.
by Avenio » Sat Apr 16, 2011 11:03 pm
Milograd wrote:So you are saying technology and science are nothing to be afraid of? Have you ever seen the effects of human technology throughout the ages? Dear lord, there used to be a time when guns and tanks and intercontinental ballistic missiles didn't exist. You can't possibly think that technology is nothing to be afraid of.
by Trollgaard » Sat Apr 16, 2011 11:04 pm
Diseased Imaginings wrote:I don't fear it. In fact, I think once medical tech and neuroscience advance enough in the next few decades, we'll see people first start to have embedded microchips with algorithms and memory integrated into human nervous systems. We're very quickly approaching the verge of a post-organic evolution in humankind, and I find it exhilarating.
by Lacadaemon » Sat Apr 16, 2011 11:08 pm
Avenio wrote:Fear not technology, fear the people that would misuse it. A tank cannot hurt you without a pilot, an ICBM cannot launch on its own and a gun needs to be aimed.
by Disgruntled militia » Sat Apr 16, 2011 11:10 pm
Trollgaard wrote:there is absolutely no reason to create true AI that could eventually be smarter than humans.
by Extra Sauce » Sat Apr 16, 2011 11:10 pm
by Kubra » Sat Apr 16, 2011 11:11 pm
Your point?
by Xarithis » Sat Apr 16, 2011 11:24 pm
Zathganastan wrote:What do most people feel about AI robots that think for themselves?
Is it a threat to humanity or just another part of the future?
by Trollgaard » Sat Apr 16, 2011 11:31 pm
Floreria wrote:Trollgaard wrote:
Well, I don't think that will happen, and don't want it to happen. It's appalling.
Yes, AI has the potential to be a threat, and there is absolutely no reason to create true AI that could eventually be smarter than humans.
Really? Since I was certain that there were, in fact, reasons to do that.
Evolution has its limitations. If we want to improve humanity, which certainly seems to be needed in this small part of the universe, we'll need to find new solutions. Nature simply doesn't work that way.
I can understand why these concepts would be appalling. We'll be trusting our imperfect partners in crime here on earth with helping to achieve a more ideal form. It's easily abusable and potentially quite dangerous. Of course any intelligent human being would know that using it for purposes beneficial to us would be the best plan, but how can we be so sure that would happen?
by Disgruntled militia » Sat Apr 16, 2011 11:39 pm
Trollgaard wrote:Humans are fine. We don't need AI. We'll survive or perish by our own hands.
by New Ziedrich » Sat Apr 16, 2011 11:39 pm
by Trollgaard » Sat Apr 16, 2011 11:45 pm
by Natapoc » Sat Apr 16, 2011 11:46 pm
Trollgaard wrote:Diseased Imaginings wrote:I don't fear it. In fact, I think once medical tech and neuroscience advance enough in the next few decades, we'll see people first start to have embedded microchips with algorithms and memory integrated into human nervous systems. We're very quickly approaching the verge of a post-organic evolution in humankind, and I find it exhilarating.
Well, I don't think that will happen, and don't want it to happen. It's appalling.
Yes, AI has the potential to be a threat, and there is absolutely no reason to create true AI that could eventually be smarter than humans.
by Hresejnen » Sat Apr 16, 2011 11:54 pm
by Galla- » Sat Apr 16, 2011 11:56 pm
Fashiontopia wrote:Look don't come here talking bad about Americans, that will get you cussed out faster than relativity.
Besides: Most posters in this thread are Americans, and others who are non-Americans have no problems co-existing so shut that trap...
by Qatarab » Sun Apr 17, 2011 8:10 am
by Norstal » Sun Apr 17, 2011 8:14 am
Zathganastan wrote:What do most people feel about AI robots that think for themselves?
Is it a threat to humanity or just another part of the future?
Toronto Sun wrote:Best poster ever. ★★★★★
New York Times wrote:No one can beat him in debates. 5/5.
IGN wrote:Literally the best game I've ever played. 10/10
NSG Public wrote:What a fucking douchebag.
by Norstal » Sun Apr 17, 2011 8:19 am
Trollgaard wrote:Diseased Imaginings wrote:I don't fear it. In fact, I think once medical tech and neuroscience advance enough in the next few decades, we'll see people first start to have embedded microchips with algorithms and memory integrated into human nervous systems. We're very quickly approaching the verge of a post-organic evolution in humankind, and I find it exhilarating.
Well, I don't think that will happen, and don't want it to happen. It's appalling.
Yes, AI has the potential to be a threat, and there is absolutely no reason to create true AI that could eventually be smarter than humans.
by Siorafrica » Sun Apr 17, 2011 8:21 am
by Bakakishtan » Sun Apr 17, 2011 8:24 am