During a 2023 meeting about the potential risks of AGI, Ilya Sutskever proposed building a doomsday bunker for OpenAI researchers, an idea he raised repeatedly in internal discussions.