Singularity
Posted: Wed Feb 22, 2017 3:12 pm
The term "singularity" has been used quite a bit recently to describe a supposed runaway technological growth driven by an AI. It's complete nonsense with no good supporting evidence. It seems to be based loosely on Kurzweil's exponential growth graphs or on Moore's Law. In the real world, growth curves are sigmoidal, not exponential. Growth is always subject to limited resources, so trajectories are at most diagonal (steady linear growth), never vertical. I cannot for the life of me figure out why some people think the curve could go vertical when we do not see that pattern in any other natural growth phenomenon. Sigmoidal growth curves also eventually slow down and level off toward a horizontal ceiling; they don't ever shoot off into vertical infinity. Moore's Law will obviously end some day. Curves that superficially look exponential are actually doomed to settle into a constant (diagonal) growth rate and then eventually level off.
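To make the point concrete, here is a minimal numeric sketch (with made-up parameters: carrying capacity K, growth rate r, starting size x0) comparing pure exponential growth against logistic (sigmoidal) growth. Early on the two curves are nearly indistinguishable, which is exactly why an early exponential-looking trend tells you nothing; later the logistic curve levels off at its resource ceiling while the exponential blows up.

```python
import math

def exponential(t, r=0.5, x0=1.0):
    """Unconstrained exponential growth: x0 * e^(r*t)."""
    return x0 * math.exp(r * t)

def logistic(t, K=1000.0, r=0.5, x0=1.0):
    """Logistic (sigmoidal) growth with carrying capacity K.
    Looks exponential while x << K, then saturates at K."""
    return K / (1 + ((K - x0) / x0) * math.exp(-r * t))

# Compare the two curves over time.
for t in range(0, 31, 5):
    print(f"t={t:2d}  exp={exponential(t):12.1f}  logistic={logistic(t):8.1f}")
```

At t=5 the two values are close; by t=30 the exponential has exploded into the millions while the logistic curve has flattened out just under K=1000 — the same data that "looks exponential" at the start was capacity-limited all along.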
For example, if an AI did reach human-level intelligence, it would still not be able to quickly build a better factory to fabricate better chips. This AI would be limited by capital, time, expertise, energy, physical dexterity, raw materials, space, heat buildup, etc. And if we have the technological capability to build one superhuman AI, you can bet there will be multiple such AIs being built around the world. A superhuman AI would not emerge all alone; it would be accompanied by many other superhuman AIs and human/computer symbiotes. A group of humans and computers working together could also easily be "smarter" than any given isolated AI. Corporations and governments would always be able to outpace any lone AI. Humans would continue to be a critical part of this dynamic competition because humans would have the expertise and physical resources, and would be assisted by symbiotic computers.