The Singularity Will Not Be a Problem

News Story Source: https://www.nextbigfuture.com, by Brian Wang
There are concerns that humans will be dominated by, and left at the mercy of, a true artificial superintelligence.

I will argue that the key developments needed for AI to become insanely powerful, or to become a huge problem, are not on track.

Super Powerful Computers Are Not Enough for the "Humanity Goes the Way of the Neanderthal" Scenario
Ray Kurzweil described the Technological Singularity as the point when strong synthetic general intelligence becomes one billion times the intelligence of all humans. He said this would arrive in the 2040s and would mark the start of synthetic intelligence accelerating faster than humans could keep up with.
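As a rough, illustrative calculation (the doubling times below are assumed round numbers, not figures from the article), a billion-fold increase in compute corresponds to about 30 hardware doublings:

```python
import math

# Illustrative sketch only: how many doublings a billion-fold increase
# in compute represents, and how long that takes at assumed doubling
# times. The 1e9 target echoes the Kurzweil claim quoted above; the
# doubling times are hypothetical round numbers.
target_ratio = 1e9
doublings = math.log2(target_ratio)  # about 29.9 doublings

for doubling_time_years in (1.0, 1.5, 2.0):
    years = doublings * doubling_time_years
    print(f"doubling every {doubling_time_years:.1f} yr -> "
          f"~{years:.0f} years for a billion-fold increase")
```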

However, it is not just a matter of applying a billion times the compute power to general (i.e., broadly capable) intelligence.

Ray calls "fast thinking" AI intelligence – "weak superhumanity". Such a "weakly superhuman" entity would probably burn out in a few weeks of outside time.
"Strong superhumanity" wo