Eliezer Yudkowsky is a researcher, writer, and philosopher on the topic of superintelligent AI. Please support this podcast by checking out our sponsors:
Linode: https://linode.com/lex to get $100 free credit
House of Macadamias: https://houseofmacadamias.com/lex and use code LEX to get 20% off your first order
InsideTracker: https://insidetracker.com/lex to get 20% off
EPISODE LINKS:
Eliezer's Twitter: https://twitter.com/esyudkowsky
LessWrong Blog: https://lesswrong.com
Eliezer's Blog page: https://www.lesswrong.com/users/eliez...
Books and resources mentioned:
1. AGI Ruin (blog post): https://lesswrong.com/posts/uMQ3cqWDP...
2. Adaptation and Natural Selection: https://amzn.to/40F5gfa
PODCAST INFO:
Podcast website: https://lexfridman.com/podcast
Apple Podcasts: https://apple.co/2lwqZIr
Spotify: https://spoti.fi/2nEwCF8
RSS: https://lexfridman.com/feed/podcast/
Full episodes playlist: Lex Fridman Podcast
Clips playlist: Lex Fridman Podcast Clips
OUTLINE:
0:00 - Introduction
0:43 - GPT-4
23:23 - Open sourcing GPT-4
39:41 - Defining AGI
47:38 - AGI alignment
1:30:30 - How AGI may kill us
2:22:51 - Superintelligence
2:30:03 - Evolution
2:36:33 - Consciousness
2:47:04 - Aliens
2:52:35 - AGI Timeline
3:00:35 - Ego
3:06:27 - Advice for young people
3:11:45 - Mortality
3:13:26 - Love
SOCIAL:
Twitter: https://twitter.com/lexfridman
LinkedIn: https://www.linkedin.com/in/lexfridman
Facebook: https://www.facebook.com/lexfridman
Instagram: https://www.instagram.com/lexfridman
Medium: https://medium.com/@lexfridman
Reddit: https://www.reddit.com/r/lexfridman
Support on Patreon: https://www.patreon.com/lexfridman