#3: Extropic - Why Thermodynamic Computing is the Future of AI (PUBLIC DEBUT)
Episode 3: Extropic is building a new kind of computer – neither classical bits nor quantum qubits, but a secret, more complex third thing. They call it a Thermodynamic Computer, and it might be many orders of magnitude more powerful than even the most powerful supercomputers today.
Check out their “litepaper” to learn more: https://www.extropic.ai/future.
======
(00:00) - Intro
(00:41) - Guillaume's Background
(02:40) - Trevor's Background
(04:02) - What is Extropic Building? High-Level Explanation
(07:07) - Frustrations with Quantum Computing and Noise
(10:08) - Scaling Digital Computers and Thermal Noise Challenges
(13:20) - How Digital Computers Run Sampling Algorithms Inefficiently
(17:27) - Limitations of Gaussian Distributions in ML
(20:12) - Why GPUs are Good at Deep Learning but Not Sampling
(23:05) - Extropic's Approach: Harnessing Noise with Thermodynamic Computers
(28:37) - Bounding the Noise: Not Too Noisy, Not Too Pristine
(31:10) - How Thermodynamic Computers Work: Inputs, Parameters, Outputs
(37:14) - No Quantum Coherence in Thermodynamic Computers
(41:37) - Gaining Confidence in the Idea Over Time
(44:49) - Using Superconductors and Scaling to Silicon
(47:53) - Thermodynamic Computing vs Neuromorphic Computing
(50:51) - Disrupting Computing and AI from First Principles
(52:52) - Early Applications in Low Data, Probabilistic Domains
(54:49) - Vast Potential for New Devices and Algorithms in AI's Early Days
(57:22) - Building the Next S-Curve to Extend Moore's Law for AI
(59:34) - The Meaning and Purpose Behind Extropic's Mission
(01:04:54) - Call for Talented Builders to Join Extropic
(01:09:34) - Putting Ideas Out There and Creating Value for the Universe
(01:11:35) - Conclusion and Wrap-Up
======
Links:
- Christian Keil – https://twitter.com/pronounced_kyle
- Guillaume Verdon – https://twitter.com/GillVerd
- Beff Jezos – https://twitter.com/BasedBeffJezos
- Trevor McCourt – https://twitter.com/trevormccrt1
First Principles:
- Gaussian Distribution: https://en.wikipedia.org/wiki/Normal_distribution
- Energy-Based Models: https://en.wikipedia.org/wiki/Energy-based_model
- Shannon’s Theorem: https://en.wikipedia.org/wiki/Noisy-channel_coding_theorem
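To make the sampling discussion around (13:20) concrete, here is a minimal sketch of drawing samples from an energy-based model on a digital computer using Metropolis-Hastings. This is a standard textbook illustration, not Extropic's algorithm or hardware: the double-well energy function and all names here are chosen for the example. Note that each sample requires many sequential pseudo-random steps – the kind of cost a thermodynamic computer aims to avoid by letting physical noise do the sampling.

```python
import math
import random

def energy(x):
    # A double-well energy: a simple non-Gaussian, bimodal example.
    return (x**2 - 1.0) ** 2

def metropolis_sample(n_steps, step_size=0.5, seed=0):
    """Draw samples from p(x) proportional to exp(-energy(x))."""
    rng = random.Random(seed)
    x = 0.0
    samples = []
    for _ in range(n_steps):
        proposal = x + rng.uniform(-step_size, step_size)
        # Accept with probability min(1, exp(-(E(x') - E(x)))).
        if rng.random() < math.exp(-(energy(proposal) - energy(x))):
            x = proposal
        samples.append(x)
    return samples

samples = metropolis_sample(100_000)
# The double well has modes near x = +1 and x = -1,
# so the mean magnitude of the samples lands near 1.
print(sum(abs(s) for s in samples) / len(samples))
```

On a GPU or CPU this chain is inherently serial (each step depends on the last), which is one reason the episode argues digital hardware runs sampling algorithms inefficiently.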
======
Production and marketing by The Deep View (https://thedeepview.co). For inquiries about sponsoring the podcast, email team@firstprinciples.fm
======
Check out the video version here → http://tinyurl.com/4fh497n9
🔔 Follow to stay updated with new uploads
Creators and Guests:
- Guillaume Verdon (Guest) – Founder & CEO @Extropic_AI • prev: Physics & AI R&D @ Alphabet X / Google • Founder @ TensorFlow Quantum • PhD (ABD) + MMath @ IQC / UWaterloo / PI • e/acc
- Trevor McCourt (Guest) – CTO @Extropic_AI • PhD (ABD) + MSc @MITEECS + @MitQuanta + @FieteGroup • former @GoogleAI Quantum, former @UWaterloo Mechanical Engineering