Back in January, I wrote:
What happens when AI image generation becomes powerful enough not to replace artists (or true imagination!), but to credibly remix photographs and movies in a way that we can no longer tell if they’re true or not?
Well, that didn’t take long: The latest update to Midjourney—which can now generate photorealistic faces—has spawned a flurry of images that show, among other things, Donald Trump getting arrested or the pope in an arresting outfit. Celebrities in fantastical situations are only the most obvious use, though: Some people are also generating events that never happened.
Something wild is happening on the Midjourney subreddit.
People are telling stories and sharing photos of historic events - like the “Great Cascadia” earthquake that devastated Oregon in 2001.
The kicker? It never happened. The images are AI-generated. pic.twitter.com/2ziHJYsTDK
— Justine Moore (@venturetwins), March 26, 2023
What’s surprising about this wholly unsurprising development is how the AI nails the visual style of the early 2000s: Although this earthquake never happened, the purported footage of it looks entirely credible to me, someone who spent a lot of time consuming media from that era. The colors veer toward grey and brown, the outfits are correct, and the lo-fi American television resolution feels completely appropriate for the time.
At the pace the technology is moving, we’re very quickly approaching a future where past events can be entirely fabricated and still pass the smell test, amplifying all the worst mechanisms of the post-truth age we live in.