Video Series
Videos on Artificial Intelligence produced by iQ Studios.



DeepSeek, Nvidia, AI Investing, and Our Future
I want to explain where I believe AI is headed and what this means specifically for investors, and more broadly, for all of humankind.


AI Trends and Predictions for the Future
The field of AI is evolving so rapidly that it is difficult for researchers in the field to keep up.


How to Create AGI and Not Die
IFoRE Sigma Xi Conference 2023: Dr. Craig Kaplan provides an overview of AGI and explains the safety challenges and current AI approaches.


Surviving AI, Episode 26 Short: Now is the time to make AI safe!
A short window exists during which humans can influence the development of AGI and SuperIntelligence.


Surviving AI, Episode 15 Short: Does Collective Intelligence work?
Online collective intelligence of humans and AI can achieve AGI safely.


Surviving AI, Episode 12 Short: What is the fastest and safest path to AGI?
AGI is potentially a “winner-take-all” scenario where the first AGI will likely dominate slower approaches.


Surviving AI, Episode 11 Short: Should we slow down AI development?
Tens of thousands of concerned AI researchers and others have called for a pause in the development of the most advanced AI systems.


Surviving AI, Episode 10 Short: How to avoid extinction by AI
Designing AI systems with “humans in the loop” and with democratic values are two principles essential to human survival.


Surviving AI, Episode 9 Short: Can we increase the odds of human survival with AI?
Nearly half of AI experts surveyed say there is a 10% or greater chance of extinction by AI.


Surviving AI, Episode 8 Short: How do we make AI safer?
How will Artificial Intelligence determine what is right and what is wrong?


Surviving AI, Episode 3 Short: Can we regulate AI?
Regulation is a standard answer for dangerous technologies, but there are problems with this approach. Dr. Kaplan discusses three major challenges.


How to Create AGI and Not Die: IFoRE / Sigma Xi Conference Presentation 2023
A presentation by Dr. Craig A. Kaplan at the IFoRE / Sigma Xi Conference on 11/10/23.


Surviving AI, Episode 26: Now is the time to make AI safe!
There is a relatively short window during which humans can influence the development of AGI and SuperIntelligence.


Surviving AI, Episode 19: What are customized AI agents (AAAIs)?
You and I can customize Advanced Autonomous Artificial Intelligences (AAAIs) by teaching them both our knowledge and values.


Surviving AI, Episode 12: What is the Fastest and Safest Path to AGI?
Dr. Kaplan argues that if we know how to build AGI safely, we should actually speed up development instead of slowing it down.


Surviving AI, Episode 9: Can We Increase the Odds of Human Survival with AI?
Nearly half of AI experts surveyed say there is a 10% or greater chance of extinction by AI. What if we could improve our survival odds by even 1%?


Surviving AI, Episode 7: How Do We Solve The Alignment Problem (post AGI)?
Aligning the values of AGI with positive human values is the key to ensuring that humans survive and prosper.


Surviving AI, Episode 6: The Alignment Problem
AI begins as a tool but it will not remain one; AI will learn to set its own goals. What if its goals don’t align with ours?


Artificial Intelligence: Past, Present, and Future
Dr. Craig Kaplan discusses the past, present, and future of artificial intelligence.


Episode 1: The Evolution of Planetary Intelligence
How planetary intelligence is evolving and why what we do in the next 10-20 years will determine the fate of humankind.