Scam accounts are taking advantage of the Titan submersible implosion to spread fake, AI-generated images that claim to show debris on the seafloor.
On Thursday, the US Coast Guard announced it had found some wreckage of the Titan, with all five passengers onboard presumed dead. No official photos of the debris have been released, but that didn’t stop some people from circulating fake, AI-generated pics.
Starting on Thursday, several accounts on Twitter and Facebook shared photos that claimed to show the Titan’s debris at the bottom of the ocean.
One of the AI-generated images of the wreckage.
But if you look closely, the images are seriously off. For one, the wreckage looks more like a destroyed rocket engine than the submersible. The images also appear too clear and perfectly lit, even though the wreckage sits deep underwater where no light reaches. On Thursday, the US Coast Guard said it had found the “tail section” of the submersible off the bow of the sunken Titanic, which lies roughly 12,500 feet below sea level.
The actual OceanGate Titan submersible.
Other discrepancies include how a couple of pictures appear to show the sea's surface at the top, along with coral growing over the wreckage, even though the Titan likely imploded only days ago.
Another AI-generated photo shows the sea surface at the top. The wreckage is also strangely covered in coral.
Meanwhile, a separate image that shows shoes within the debris isn’t AI-generated. Instead, it’s an actual photo taken at the site of the sunken Titanic almost 20 years ago.
Despite the apparent flaws, the suspected AI-generated images racked up as many as 480,000 views on Twitter. In addition, some of the accounts pushing the images carry the blue checkmark, which once denoted legitimacy but can now be purchased for just a few bucks.
Of note: One of the accounts spreading the fake images features John F. Kennedy Jr. and his wife Carolyn Bessette as its profile photo, an obvious sign the account is associated with conspiracy groups like QAnon rather than with any legitimate news source.
Some users who noticed the fakery are calling on the Twitter accounts to pull the images. At the same time, Twitter's Community Notes feature has been slapped on many of the posts in an effort to debunk the misinformation. Still, the issue underscores how AI-generated pictures can easily fill a void in the absence of real photos of a news event.
The origins of the suspected AI-generated images aren't entirely clear. But it looks like three of them came from a parody account called the "Prince of Deepfakes," which has used the Midjourney AI image generator before.