bendedreality.com
'Everyone on Earth Will Die,' Top AI Researcher Warns
"The most likely result of building a superhumanly smart AI, under anything remotely like the current circumstances, is that literally everyone on Earth will die,..."

via: RT | Image: AFP / Yoshikazu Tsuno

Humanity is unprepared to survive an encounter with a much smarter artificial intelligence, Eliezer Yudkowsky says.

Shutting down the development of advanced artificial intelligence systems around the globe, and harshly punishing those who violate the moratorium, is the only way to save humanity from extinction, a high-profile AI researcher has warned.

Eliezer Yudkowsky, co-founder of the Machine Intelligence Research Institute (MIRI), wrote an opinion piece for TIME magazine on Wednesday explaining why he didn't sign a petition calling on "all AI labs to immediately pause for at least six months the training of AI systems more powerful than GPT-4," the multimodal large language model released by OpenAI earlier this month.

Yudkowsky argued that the letter, signed by the likes of