The Threats of the Singularity - Will There Be Many Singularities?
Why Stephen Hawking, Elon Musk and Bill Gates expressed their fears to the public

The concept of the singularity has been in the news these past few months. The contemporary usage of the term "singularity" refers to the time when machine intelligence surpasses human intelligence. To some, such an overtaking of human intelligence by machines is far-fetched, either impossible or relegated to some very distant future. Others speak and plan as if the singularity were just around the corner, looming over the near horizon, a decade or two away.
As a result of the latter view, some of the most respected members of the scientific, technical and business communities have seen fit to speak out about the dangers posed by a possible super machine intelligence, one brilliant enough to copy and improve itself until machines lord over humankind.
Stephen Hawking, a theoretical physicist, warned: "…the development of full artificial intelligence could spell the end of the human race;" and: "Whereas the short-term impact of AI depends on who controls it, the long-term impact depends on whether it can be controlled at all."
Bill Gates, co-founder of Microsoft and now a philanthropist, was asked about the existential threat of AI and responded: "I am in the camp that is concerned about super intelligence. …First, the machines will do a lot of jobs for us and not be super intelligent. That should be positive if we manage it well. A few decades after that, though, the intelligence is strong enough to be a concern."