OpenAI’s doomsday bunker, 15 fakest books, Hollywood’s over with Veo 3: AI Eye

OpenAI’s doomsday bunker plan, the “potential benefits” of propaganda bots, plus the best fake books you can’t read this summer. AI Eye.
In the last AI Eye, we reported that scientists from the four leading AI companies believe there’s at least a 10% chance of AI killing around 50% of humanity in the next decade. One scientist was even buying farmland in the US Midwest so he could ride out the AI apocalypse.
This week, it emerged that another doomsday prepper is OpenAI co-founder Ilya Sutskever.
According to Empire of AI author Karen Hao, he told key scientists in mid-2023 that “we’re definitely going to build a bunker before we release AGI.” Artificial General Intelligence is the vaguely defined concept of a machine intelligence smarter than humans.
The underground compound would protect the scientists from geopolitical chaos or violent competition between world powers once AGI was released.