Facebook video of Biden prompts probe into Meta content policy


Meta is facing scrutiny over its policies on manipulated content and AI-generated “deepfakes” after the company’s moderators declined to remove a Facebook video that falsely portrayed U.S. President Joe Biden as a pedophile.

The Silicon Valley company’s oversight board, an independent Supreme Court-style body established in 2020 and made up of 20 journalists, academics and politicians, said on Tuesday it was opening a case to examine whether the social media giant’s guidelines on altered videos and images could “address current and future challenges.”

The investigation, the first of its kind into Meta’s “manipulated media” policies, was prompted by an edited version of a video filmed during the 2022 U.S. midterm elections. In the original clip, Biden placed an “I Voted” sticker on his adult granddaughter’s chest and kissed her on the cheek.

In a Facebook post in May, a seven-second modified version of the clip looped the footage so that it repeated the moment when Biden’s hand touched her chest. An accompanying caption called Biden “a pathological pedophile” and said those who voted for him were “mentally insane.” The footage remains on Facebook.

Although the Biden video was edited without the use of artificial intelligence, the board believes its review and ruling will set a precedent for both AI-generated and human-edited content.

“It touches on the broader issue of how manipulated media might influence elections in every corner of the world,” said Thomas Hughes, director of the oversight board’s administration.

“Free speech is vital and is the cornerstone of democratic governance,” Hughes said. “However, there are complex questions about what Meta’s human rights responsibilities should be for video content that has been altered to create a misleading impression of a public figure.”

“It’s important that we consider what challenges and best practices Meta should adopt when moderating video content at scale,” he added.

The board’s investigation comes as content altered by artificial intelligence, often called deepfakes, becomes increasingly sophisticated and widely used. There are concerns that false but convincing content featuring politicians, in particular, could influence voting in upcoming elections. The U.S. presidential election is just over a year away.

The Biden case surfaced when a user reported the video to Meta, which declined to remove the post and upheld that decision following Facebook’s appeals process. As of early September, the video had fewer than 30 views and had not been shared.

The unidentified user subsequently appealed the decision to the oversight board. Meta has maintained that its decision to keep the content on the platform was correct.

The Biden case is part of a growing body of cases examined by the board concerning the moderation of content surrounding elections and other civic events.

This year, the board overturned Meta’s decision to keep up a Facebook video featuring a Brazilian general, who was not named, because of the risk of inciting street violence after the election. Previous cases have focused on the decision to suspend former U.S. President Donald Trump from Facebook and on videos of Cambodian Prime Minister Hun Sen threatening violence against his political opponents.

Once the board completes its review, it can make non-binding policy recommendations to Meta, which must respond within two months. The board has invited the public to submit comments, which can be made anonymously.

Meta said in a post on Tuesday that the video was “merely edited to remove certain portions” and therefore was not a deepfake as defined by its manipulated media policies.

“Once the board’s deliberations are complete, we will implement its decision and update this post accordingly,” the company said, adding that the video also did not violate its hate speech or bullying policies.

Additional reporting by Hannah Murphy in San Francisco
