What Biological & Social Systems Can Teach us About AI with Nora Ammann, Cofounder of PIBBSS Research
"The Cognitive Revolution" | AI Builders, Researchers, and Live Player Analysis - A podcast by Erik Torenberg, Nathan Labenz
In this episode, Nathan sits down with Nora Ammann, cofounder and CEO of PIBBSS (Principles of Intelligent Behavior in Biological and Social Systems), a research initiative exploring what biological and social systems can teach us about intelligent behavior in AI.

If you need an ecommerce platform, check out our sponsor Shopify: https://shopify.com/cognitive for a $1/month trial period.

We're hiring across the board at Turpentine and for Erik's personal team on other projects he's incubating. He's hiring a Chief of Staff, EA, Head of Special Projects, Investment Associate, and more. For a list of JDs, check out: eriktorenberg.com.

SPONSORS:
Shopify is the global commerce platform that helps you sell at every stage of your business. Shopify powers 10% of ALL eCommerce in the US and is the global force behind Allbirds, Rothy's, Brooklinen, and millions of other entrepreneurs across 175 countries. From their all-in-one e-commerce platform to their in-person POS system, wherever and whatever you're selling, Shopify's got you covered. With free Shopify Magic, sell more with less effort by whipping up captivating content that converts, from blog posts to product descriptions, using AI. Sign up for a $1/month trial period: https://shopify.com/cognitive

Omneky is an omnichannel creative generation platform that lets you launch hundreds of thousands of ad iterations that actually work, customized across all platforms, with a click of a button. Omneky combines generative AI and real-time advertising data. Mention "Cog Rev" for 10% off.
X/SOCIAL:
@labenz (Nathan)
@AmmannNora (Nora)
@eriktorenberg (Erik)
@CogRev_Podcast

TIMESTAMPS:
(00:00:00) - Introduction to Nora Ammann and PIBBSS
(00:03:25) - Nathan's view that society is not ready for rapid AI development
(00:05:55) - Nora's framing of AI safety, alignment, and risks
(00:09:31) - The problem of epistemic access to future AI systems
(00:12:36) - Categorizing different research approaches to AI safety
(00:15:46) - Seeking epistemic robustness through plural perspectives
(00:19:00) - Expecting AI systems to continue getting more powerful and different
(00:21:13) - Nora's "third way" research approach using biological and social systems
(00:25:00) - Defining intelligence broadly across systems that transform inputs to outputs
(00:27:29) - Avoiding analogies and anthropomorphism when studying AI systems
(00:30:23) - Using multiple perspectives and assumptions to triangulate understanding
(00:35:13) - Translating concepts between domains as more multifaceted than just "borrowing"
(00:38:15) - Learning from interdisciplinary practices like bioengineering
(00:42:05) - Key questions about the nature of intelligence and agency
(00:46:38) - Understanding hierarchical levels of agentic behavior
(00:49:32) - Aligning goals between individuals and organizations
(00:53:27) - Assessing the validity of assumptions and analogies about AI
(00:58:57) - Informing AI interpretability and evaluation with theory
(01:02:12) - Modeling interactions between AI systems
(01:04:56) - Using ecology as a framework for AI interactions
(01:07:46) - Developing a computational ecology theory
(01:11:45) - Learning from historical examples of societal transformations
(01:16:59) - Unprecedented speed and autonomy of AI development
(01:19:00) - PIBBSS programs for connecting researchers across domains
(01:23:39) - Seeking researchers to tie theory to concrete AI applications
(01:26:45) - Need for productive disagreement on AI risks and solutions

This show is produced by Turpentine: a network
of podcasts, newsletters, and more, covering technology, business, and culture — all from the perspective of industry insiders and experts. We’re launching new shows every week, and we’re looking for industry-leading sponsors — if you think that might be you and your company, email us at [email protected].