The Hard-Luck Case For AGI And AI Superintelligence As An Extinction-Level Event
AGI and ASI might produce an extinction-level event that wipes out humanity. Not good. This is an existential risk of the worst kind. Here's the AI insider scoop.
