Eliezer Yudkowsky warns that AI superintelligence would cause human extinction. In 2022, his organization MIRI shifted its focus toward accepting humanity's likely demise from AI risk. He calls for international treaties and extreme measures to halt AI development.