MrBeast and Tom Hanks used in AI scam adverts

Even the warnings from artificial intelligence’s fiercest advocates that the technology could be used to harm humans may come too late — it’s already happening, with some of the planet’s most famous celebrities among the targets.

Just this week, two of the best-known people on the planet, actor Tom Hanks and YouTube star MrBeast, were the subjects of deepfake videos used to sell products.

This week, an artificially generated video of MrBeast (real name Jimmy Donaldson) began circulating on TikTok.

The clip shows Donaldson, who earns $82 million a year according to Forbes, seemingly telling viewers that they are one of 10,000 "lucky" people chosen to win an iPhone 15 Pro, the newly released handset Apple launched last month.

In the video, Donaldson claims that winners can pay just $2 for the device, which retails for $999.

The video appears under the username "MrBeast" and features a blue verification check mark, with the AI-generated version of the YouTuber saying: "I'm MrBeast and I'm running the world's largest iPhone 15 giveaway."

The avatar in the video also speaks in Donaldson’s voice, asking viewers to click on a link attached to the video to “claim” their prize.

Donaldson, who has 189 million subscribers on YouTube, is known for his generous giveaways — such as helping 1,000 deaf people hear for the first time and 1,000 blind people see for the first time — which may lead some viewers to believe the videos are real.

But Donaldson turned to his more than 24 million followers on X, the site formerly known as Twitter, to warn them about the scam.

“A lot of people have received this deepfake scam ad of mine,” the 25-year-old wrote. “Are social media platforms ready for the rise of artificial intelligence deepfakes? This is a serious question.”

TikTok did not immediately respond to Fortune's request for comment, but other outlets reported that the ad and the account behind it had been removed from the platform within hours.

The platform has promised to crack down on deepfake images and videos. In March, TikTok updated its policy to stipulate that all deepfakes, or manipulated content that depicts realistic scenes, must be labeled as such.

The short-form video platform also bans deepfakes of private figures and young people, while allowing manipulated images of public figures only in artistic or educational content.

Tom Hanks’ dental plan

A similar incident happened this week to Forrest Gump and Toy Story star Tom Hanks.

Over the weekend, the Oscar-winning actor posted on his Instagram page that his likeness had been used in an AI-generated advertisement for a dental plan.

“Beware!” Hanks wrote. “There’s a video promoting some dental plan with my artificial intelligence version. I have nothing to do with it.”

Beyond the money consumers stand to lose on dubious products or outright scams, celebrities whose images have been manipulated by deepfake creators say it is "terrifying" to witness it firsthand.

CBS News host Gayle King spoke this week about her confusion when people started contacting her about diet gummies she appeared to be promoting.

A clip that King had reportedly used to promote her podcast, Gayle King in the House, was doctored with a voiceover to make it appear that she was endorsing the product.

“I’d never had gummies, I didn’t even know what it was about,” King told CBS Mornings. “It’s scary and it can make you say anything.”

Even people who have been bullish on artificial intelligence, such as Microsoft co-founder Bill Gates and JPMorgan Chase CEO Jamie Dimon, say they worry about the technology falling into the wrong hands.

Gates expressed his concerns in a July blog post, while Dimon’s comments came more recently, telling Bloomberg TV: “Technology has done incredible things for humanity, but, you know, plane crashes, drugs being abused — those are all negatives.

“In my opinion, the biggest negative impact is artificial intelligence being used by bad people to do bad things. Think cyber warfare.”
