Singularity

jordansparks
Site Admin
Posts: 46
Joined: Thu Aug 27, 2015 3:59 pm

Singularity

Postby jordansparks » Wed Feb 22, 2017 3:12 pm

The term "singularity" has been used quite a bit recently to describe a supposed runaway technological growth due to an AI. It's complete nonsense with no good supporting evidence. It seems to be based loosely on Kurzweil's exponential growth graphs or on Moore's Law. In the real world, growth curves are sigmoidal, not exponential. Growth is always subject to limited resources and asymptotes are diagonal, not vertical. I cannot for the life of me figure out why some people think the asymptotes might be vertical when we do not see that pattern in any other natural growth phenomenon. Sigmoidal growth curves also eventually reverse direction and level off; they don't ever shoot off into vertical infinity. Moore's law will obviously end some day. Curves that superficially look exponential are actually doomed to hit a constant (diagonal) growth rate and then eventually level off.
For example, even if an AI reached human-level intelligence, it would still not be able to quickly build a better factory to fabricate better chips. Such an AI would be limited by capital, time, expertise, energy, physical dexterity, raw materials, space, heat buildup, etc. If we have the technological capability to build one superhuman AI, you can bet there will be multiple such AIs being built around the world. A superhuman AI would not emerge all alone; it would be accompanied by many other superhuman AIs and by human/computer symbiotes. A group of humans and computers working together could also easily be "smarter" than any given isolated AI. Corporations and governments would always be able to outpace any lone AI. Humans would remain a critical part of this dynamic competition because they would hold the expertise and physical resources, and would be assisted by symbiotic computers.

jordansparks
Site Admin
Posts: 46
Joined: Thu Aug 27, 2015 3:59 pm

Re: Singularity

Postby jordansparks » Wed Mar 15, 2017 10:59 am

Kurzweil just gave an interview:
https://futurism.com/kurzweil-claims-that-the-singularity-will-happen-by-2029/
His new definition of the singularity (there are many shifting variations) is the point at which some computers could achieve roughly human-level intelligence. Well, that would be a milestone, not a singularity, because it wouldn't result in any sort of runaway growth or any fundamental changes to civilization. But I do happen to agree with his rather mundane milestone prediction of 2029 for somewhat-human-level AI. I also strongly agree with him that it's absurd to think that a single AI could be a problem, and that there would instead be huge benefits in the form of human-machine symbiosis. On the flip side, his prediction of connecting our neocortices to the cloud by the 2030s is just not realistic. When he says things like that, I feel like he doesn't understand how complex biology is. Instead of direct brain connections, we will have to make do with increasingly complex user interfaces.

