Darren McKee on Uncontrollable Superintelligence

Future of Life Institute Podcast - A podcast by Future of Life Institute

Darren McKee joins the podcast to discuss how AI might be difficult to control, which goals and traits AI systems will develop, and whether there's a unified solution to AI alignment.

Timestamps:

00:00 Uncontrollable superintelligence
16:41 AI goals and the "virus analogy"
28:36 Speed of AI cognition
39:25 Narrow AI and autonomy
52:23 Reliability of current and future AI
1:02:33 Planning for multiple AI scenarios
1:18:57 Will AIs seek self-preservation?
1:27:57 Is there a unified solution to AI alignment?
1:30:26 Concrete AI safety proposals