Video Series
Videos on Artificial Intelligence produced by iQ Studios.



AI Trends and Predictions for the Future
The field of AI is evolving so rapidly that it is difficult for researchers in the field to keep up.


How to Create AGI and Not Die
IFoRE Sigma Xi Conference 2023: Dr. Craig Kaplan provides an overview of AGI and explains the safety challenges and current AI approaches.


Surviving AI, Episode 26 Short: Now is the time to make AI safe!
A short window exists during which humans can influence the development of AGI and SuperIntelligence.


Surviving AI, Episode 25 Short: Summary of AI safety
Narrow AI is rapidly developing into AGI, which will then inevitably lead to SuperIntelligence that is much smarter and more powerful than humans.


Surviving AI, Episode 24 Short: More detail on how to build safe AGI
A brief overview of multiple pending patents describing how to build safe and ethical AGI.


Surviving AI, Episode 22 Short: How does the AGI network learn?
The old-school idea of proceduralization can help an AGI network of individual AI and human agents learn in a safe and predictable way.


Surviving AI, Episode 17 Short: What is a human CI (AI) network?
Human collective intelligence networks exist today, and existing systems like these can be stitched together to form the fabric of safe AGI.


Surviving AI, Episode 16 Short: How to build an AGI Network
The best way to build AGI begins with building a human collective intelligence network and then adding AI agents to it.


Surviving AI, Episode 15 Short: Does Collective Intelligence work?
Online collective intelligence of humans and AI can achieve AGI safely.


Surviving AI, Episode 14 Short: What is Collective Intelligence?
Active Collective Intelligence is needed for safe AGI.


Surviving AI, Episode 12 Short: What is the fastest and safest path to AGI?
AGI is potentially a “winner-take-all” scenario where the first AGI will likely dominate slower approaches.


Surviving AI, Episode 9 Short: Can we increase the odds of human survival with AI?
Nearly half of AI experts surveyed say there is a 10% or greater chance of human extinction caused by AI.


Surviving AI, Episode 3 Short: Can we regulate AI?
Regulation is a standard answer for dangerous technologies, but there are problems with this approach. Dr. Kaplan discusses three major challenges.


Surviving AI, Episode 1 Short: What is AI?
Learn about the origins of AI.


How to Create AGI and Not Die: IFoRE / Sigma Xi Conference Presentation 2023
A presentation by Dr. Craig A. Kaplan at the IFoRE / Sigma Xi Conference on 11/10/23.


Surviving AI, Episode 26: Now is the time to make AI safe!
There is a relatively short window during which humans can influence the development of AGI and SuperIntelligence.


Surviving AI, Episode 25: Summary of AI Safety
A recap of the main points of the Surviving AI series.


Surviving AI, Episode 24: More detail on how to build safe AGI
An overview of multiple pending patents describing in detail how to build safe and ethical Artificial General Intelligence (AGI).


Surviving AI, Episode 23: What makes the AGI network safe?
Three ways to increase the safety of AGI are explained.


Surviving AI, Episode 22: How does the AGI network learn?
There is more than one way to teach an AI.