Singularity

Welcome to the Oregon Cryonics forum
jordansparks
Site Admin
Posts: 70
Joined: Thu Aug 27, 2015 3:59 pm

Singularity

Postby jordansparks » Wed Feb 22, 2017 3:12 pm

The term "singularity" has been used quite a bit recently to describe a supposed runaway technological growth driven by AI. It's complete nonsense with no good supporting evidence. It seems to be based loosely on Kurzweil's exponential growth graphs or on Moore's Law. In the real world, growth curves are sigmoidal, not exponential. Growth is always subject to limited resources, so a trajectory can at best settle into a constant (diagonal) rate of increase; it never approaches a vertical asymptote. I cannot for the life of me figure out why some people think the asymptotes might be vertical when we do not see that pattern in any other natural growth phenomenon. Sigmoidal growth curves also eventually inflect and level off; they never shoot off toward vertical infinity. Moore's Law will obviously end some day. Curves that superficially look exponential are actually doomed to slow to a constant (diagonal) growth rate and then eventually level off.
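The difference is easy to see numerically. Here is a minimal sketch (Python; the logistic function and its parameter names are just my illustrative choices, not anything from the article) showing that a sigmoidal curve is nearly indistinguishable from an exponential early on, yet its growth eventually stalls:

```python
import math

def logistic(t, K=1.0, r=1.0, t0=10.0):
    """Sigmoidal (logistic) growth: carrying capacity K, rate r, midpoint t0."""
    return K / (1.0 + math.exp(-r * (t - t0)))

# Early on, successive values grow by a near-constant factor (~e^r),
# exactly like an exponential curve would:
early_ratios = [logistic(t + 1) / logistic(t) for t in range(0, 4)]

# Late in the curve, growth stalls as the value approaches capacity K:
late_ratios = [logistic(t + 1) / logistic(t) for t in range(20, 24)]

print(early_ratios)  # each ratio close to e ~ 2.718
print(late_ratios)   # each ratio close to 1.0 (leveling off)
```

An observer sampling only the early points would see what looks like clean exponential growth; the leveling-off is invisible until the resource limit starts to bite.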
For example, if an AI did reach human intelligence level, it would still not be able to quickly build a better factory to fabricate better chips. This AI would be limited in capital, time, expertise, energy, physical dexterity, raw materials, space, heat buildup, etc. If we have the technological capability to build one superhuman AI, you can bet there will be multiple such AIs being built around the world. A superhuman AI would not emerge all alone. It would instead be accompanied by many other superhuman AIs and human/computer symbiotes. A group of humans and computers working together could also easily be "smarter" than any given isolated AI. Corporations and governments would always be able to outpace any lone AI. Humans would continue to be a critical part of this dynamic competition because the humans would have expertise and physical resources, and would be assisted by symbiotic computers.

jordansparks
Site Admin
Posts: 70
Joined: Thu Aug 27, 2015 3:59 pm

Re: Singularity

Postby jordansparks » Wed Mar 15, 2017 10:59 am

Kurzweil just gave an interview:
https://futurism.com/kkurzweil-claims-that-the-singularity-will-happen-by-2029/
His new definition of the singularity (there are many shifting variations) is the point at which some computers could achieve roughly human-level intelligence. Well, that would be a milestone, not a singularity, because it wouldn't result in any sort of runaway growth or any fundamental changes to civilization. But I do happen to agree with his rather mundane milestone prediction of 2029 for somewhat-human-level AI. I also strongly agree with him that it's absurd to think that a single AI could be a problem, and that there would instead be huge benefits in the form of human-machine symbiosis. On the flip side, his prediction of connecting our neocortices to the cloud by the 2030s is just not realistic. When he says things like that, I feel like he doesn't understand how complex biology is. Instead of direct brain connections, we will have to make do with increasingly complex user interfaces.

jordansparks
Site Admin
Posts: 70
Joined: Thu Aug 27, 2015 3:59 pm

Re: Singularity

Postby jordansparks » Mon Jul 16, 2018 7:43 am

A great article was just posted:
https://singularityhub.com/2018/07/15/why-most-of-us-fail-to-grasp-coming-exponential-gains-in-ai/#sm.000pvny4r11rndymr3h2ccjdmr2py
Summary: An exponential curve has exactly the same shape everywhere along its length. It always looks like it's going vertical in the future and always looks flat in the past, but both of those are illusions created by how the curve is displayed. There is no elbow in the hockey stick, and you never reach a transition point where it suddenly feels like a sharp departure from the past. I would paraphrase this as, "there is no such thing as a singularity". Ironically, it's posted on SingularityHub.
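That self-similarity is easy to verify directly. A minimal sketch (Python; the helper function and the window positions are just my illustrative choices): take any window of a pure exponential and normalize it by its starting value, and the shape is identical no matter where on the curve the window sits:

```python
def window_shape(f, start, width, samples=6):
    """Sample f over [start, start + width], normalized to f(start)."""
    base = f(start)
    return [f(start + width * i / (samples - 1)) / base for i in range(samples)]

exp2 = lambda t: 2.0 ** t  # pure exponential: doubles every unit of t

# The "flat past" near t = 0 and the "vertical future" near t = 50
# have exactly the same normalized shape:
shape_past = window_shape(exp2, 0, 10)
shape_future = window_shape(exp2, 50, 10)

print(shape_past)    # [1.0, 4.0, 16.0, 64.0, 256.0, 1024.0]
print(shape_future)  # identical
```

In other words, wherever you stand on the curve, the view looks the same: flat behind you, steep ahead. There is no special moment.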

I would also like to take this opportunity to revise my estimate for the milestone of somewhat-human-level AI. I was referring to the level of AI capable of driving a car just as a human would, in all weather and road conditions. As the complexities of that task have become clearer, I no longer see 2029 as the date of that milestone. Let's push it out to about 25 years from now, or 2043. The self-driving cars released before then will have many limitations related to weather and roads. Another milestone we might intuitively understand is the first generation of domestic robots. These would certainly require far more intelligence than self-driving cars, taking at least another 25 years. That means 2068 just to reach the very first crude generation of barely-functional domestic robots. But nobody is going to mistake these things for human-level as they bumble slowly around the house, poorly performing only their basic programmed tasks. So even at the 2068 milestone, the past will look flat. We will never ever have a moment where anything suddenly takes off.

