Question:
Is the Technological Singularity inevitable or not?
anonymous
2008-11-05 04:09:19 UTC
Is the possibility of the Technological Singularity high or low?

http://en.wikipedia.org/wiki/Technological_Singularity
Three answers:
kozzm0
2008-11-05 04:41:10 UTC
Better yet, how about an answer from a philosopher-mathematician...



The key phrase in that article is that the notion of the technological singularity hinges on machines "surpassing human intellect."



That is about as vague as a notion can get. Just what is considered human intellect, and how is exceeding it supposed to allow machines to suddenly adopt unforeseen methods of improving themselves?



This is no longer futuristic sci-fi, since computers have already exceeded "human intellect" in many respects, for instance at the venerated benchmark of chess. In the early 1990s Garry Kasparov, probably the greatest grandmaster ever, predicted that no computer could ever beat him or Anatoly Karpov. Several years later, in 1997, he was soundly thrashed by IBM's Deep Blue, a supercomputer that is pedestrian by today's standards.



And of course computers have long "exceeded human intellect" at numerical calculation. Imagine where you'd have been yesterday, watching the election without your trusty PC to give you six different sites full of numbers, and the TV networks all getting the updated numbers constantly and flashing them on the screen. No "human intellect" could have brought you all that data or arranged it so nicely for you to view.



Besides this, the article doesn't even begin to consider what, other than "intellect," a machine would need in order to become unexpectedly self-perpetuating. Being programmed to do so doesn't count. That rules out AI entirely, since AI is purely a matter of programming and is predictable at least in terms of probabilities, if not certainties. If you play video games like I do, you know that no AI has been made that can think creatively, and likely none ever will be. And the reason why is... drumroll...



>>>Machines cannot think creatively because they are NOT ALIVE.<<<



Allowing machines to think creatively, as opposed to merely running AI, will require a much greater understanding of what makes living things alive and how they differ from machines, and then a way to impart that quality to machines. Two big ways they differ:



1) everything that is alive evolved from primitive beginnings over time; machines are manufactured.



2) everything that is alive seeks to remain alive, at least until it has reproduced, with few exceptions.



Living intelligence is not controlled by itself per se but by the organism: the will, volition, etc. In most "thinking" beings with central nervous systems, the vast majority of the system is used for functions that don't resemble "intellectual" things, such as will, or consciousness, or "I want a bologna sandwich," or playing tennis.



Show me the robot that can even BEGIN to play tennis, and I'll show you a robot worth many billions of dollars to the robotics and computer experts who would want to know how it ticks. Computers can't even make robots walk in a human manner, let alone move with agility. Nobody knows how we do it, but I can assure you we don't think about it first.



If some computer, a la the Terminator or The Matrix, were to gain some kind of will and try to take things over, it would have a devil of a time making robots that could beat humans in battle. A technological takeover is no more likely than a takeover by a room full of disembodied brains. Tech can crunch data, but that's about it.



anonymous
2016-03-19 05:13:21 UTC
Supporters of the Technological Singularity are mostly new-age hippies who do not understand our advances in physics. We have found a minimum meaningful length (the Planck length), we've found the basic components of matter, and there's a limit to how many computations we can do with a limited quantity of it. There's also an entropic limit, which is reached at black-hole density. Moore's Law cannot continue forever; it will stop sooner or later.

Yes, eventually we will engineer some alternative form of intelligence (assuming we survive until that point), and we will eventually develop better brain-computer interfaces. But there will be no "singularity." If you define the singularity as the point at which humans are no longer capable of keeping up with technology, you are both subscribing to a definition that is not concrete and ignoring the possibility that humans may improve themselves along with technology.

TL;DR: Silly singularity.
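The "entropic limit" point above can be made concrete with Landauer's principle, which says that irreversibly erasing one bit of information at temperature T dissipates at least kT ln 2 joules. The post doesn't give numbers, so this is just an illustrative back-of-the-envelope sketch at room temperature:

```python
import math

# Landauer's principle: erasing one bit at temperature T dissipates
# at least k * T * ln(2) joules, where k is the Boltzmann constant.
k_B = 1.380649e-23   # Boltzmann constant, J/K (exact in the 2019 SI)
T = 300.0            # room temperature, kelvin

energy_per_bit = k_B * T * math.log(2)   # minimum energy to erase one bit

# Rough ceiling: how many irreversible bit operations can one joule buy?
bits_per_joule = 1.0 / energy_per_bit

print(f"Landauer limit at 300 K: {energy_per_bit:.3e} J per bit")
print(f"Max irreversible bit operations per joule: {bits_per_joule:.3e}")
```

The result (roughly 2.9e-21 J per bit, so on the order of 3e20 erasures per joule at 300 K) is the kind of hard thermodynamic ceiling the answer is alluding to: no amount of engineering lets irreversible computation at a given temperature beat it.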
Fuzzy
2008-11-05 04:30:20 UTC
Aren't we the most sophisticated technology in existence? Does the fact that we call what we are "biology" change the fact that the processes involved are chemical, mechanical, and electrical? By that measure, we are the most sophisticated technology on earth.



Our own brain is then the epitome of technological intelligence known on earth. We have tried to make machines that think for themselves, but so far we have some distance to go.



If this viewpoint is taken, then the process leading to the Singularity doesn't seem that simple. Schwarzenegger-style intelligent machines then seem more like a child's imagination than anything related to reality.



Perhaps you will get lucky and get an answer from a professional in artificial intelligence! Who knows?

---------

If someone disagrees, please tell me why you don't believe that our bodies are technological in nature.


This content was originally posted on Y! Answers, a Q&A website that shut down in 2021.