Experts Warn UN Panel About the Dangers of Artificial Superintelligence
Copyright by gizmodo.com
During a recent United Nations meeting about emerging global risks, political representatives from around the world were warned about the threats posed by artificial intelligence and other future technologies.
The meeting featured two prominent experts on the matter: Max Tegmark, a physicist at MIT, and Nick Bostrom, the founder of Oxford’s Future of Humanity Institute and author of the book Superintelligence: Paths, Dangers, Strategies.
Both agreed that artificial intelligence has the potential to transform human society in profoundly positive ways, but they also raised questions about how the technology could quickly get out of control and turn against us.
Last year, Tegmark, along with physicist Stephen Hawking, computer science professor Stuart Russell, and physicist Frank Wilczek, warned about the current culture of complacency regarding superintelligent machines.
The event, organized by Georgia’s UN representatives and the UN Interregional Crime and Justice Research Institute (UNICRI), was set up to foster discussion about the national and international security risks posed by new technologies, including chemical, biological, radiological, and nuclear (CBRN) materials.
The panel was also treated to a special discussion on the potential threats raised by artificial superintelligence—that is, AI whose capabilities greatly exceed those of humans. The purpose of the meeting, held on October 14, was to discuss the implications of emerging technologies and how to proactively mitigate the risks.
“One can imagine such technology outsmarting financial markets, out-inventing human researchers, out-manipulating human leaders, and developing weapons we cannot even understand,” the authors wrote. “Whereas the short-term impact of AI depends on who controls it, the long-term impact depends on whether it can be controlled at all.”

Indeed, as Bostrom explained to those in attendance, superintelligence raises unique technical and foundational challenges, and the […]