Imagining a world without technology is hard enough; now Stephen Hawking warns that technology itself could be a recipe for doomsday. The physicist believes new technologies bring "new ways things can go wrong" for human existence.
When asked how the world might end, Hawking said humanity faces greater threats from its own advances in science and technology. These include global warming, nuclear war, and genetically engineered viruses, to name a few.
His observations came during the recording of the BBC's annual Reith Lectures, on the nature of black holes, on January 7. He said that an earthly disaster, a "near certainty" within the next 1,000 to 10,000 years, will not end humanity, because by the time it happens we will most likely have ventured out to find a new home among the stars.
But he also joked that "we will not establish self-sustaining colonies in space for at least the next hundred years, so we have to be very careful in this period."
Only last month, Elon Musk commented that humans may run out of technologies sooner than they'd like, bringing the Mars mission to a complete halt. More recently, Hawking signed a letter alongside Musk and Steve Wozniak calling for a ban on autonomous weapons.
In the letter, Hawking highlighted the upside of beneficial intelligence. He wrote, "We should shift the goal of AI from creating pure undirected artificial intelligence to creating beneficial intelligence. It might take decades to figure out how to do this, so let's start researching this today rather than the night before the first strong AI is switched on."