A heavily edited video of President Biden on Facebook will remain on the platform after the independent body that oversees Meta’s content moderation determined that the post does not violate the company’s policies. The panel, however, criticized Meta’s manipulated media policy as “incoherent and confusing.”
The video, posted in May 2023, was edited to make it appear as though Mr. Biden was repeatedly and inappropriately touching his adult granddaughter’s chest. In the original video, taken in 2022, the president places an “I voted” sticker on his granddaughter after voting in the midterm elections. But the video under review by Meta’s Oversight Board had been looped and edited into a seven-second clip that critics said left a misleading impression.
Meta’s Oversight Board, an independent group that oversees Meta’s content policies and can make binding decisions on whether content is removed or left up, said the video did not violate Meta’s policies because it was not altered with artificial intelligence and does not show Mr. Biden “saying words he did not say” or “doing something he did not do.”
A human content reviewer at Meta left the video up after it was reported to the company as hate speech. After an appeal, the Oversight Board took the case up for review.
While the Oversight Board ruled that the video could remain on the site, it argued in a set of non-binding recommendations that Meta’s current policy on manipulated content should be “reconsidered.” The board called the policy “incoherent, lacking in persuasive justification and inappropriately focused on how content is created, rather than on which specific harms it aims to prevent, such as disrupting electoral processes.”
The board also recommended that Meta begin labeling manipulated media that does not violate its policies, and that it treat manipulated audio and edited videos showing people “doing things they did not do” as violations of the manipulated media policy.
“Meta needs to calibrate the Manipulated Media policy to the real world harms it seeks to prevent. The company should be clear about what those harms are, for example incitement to violence or misleading people about information needed to vote, and enforce the policy against them,” Oversight Board Co-Chair Michael McConnell said in a statement to CBS News.
“In most cases Meta could prevent harms caused by people being misled by altered content through less restrictive means than removals, which is why we are urging the company to attach labels that would provide context about the authenticity of posts. This would allow for greater protection of free expression,” McConnell added.
“We are reviewing the Oversight Board’s guidance and will respond publicly to their recommendations within 60 days in accordance with the bylaws,” a Meta spokesperson wrote in a statement to CBS News.
The board’s decision was released just days after Meta CEO Mark Zuckerberg and other tech company leaders testified at a Senate Judiciary Committee hearing about the impact of social media on children.
And it comes as AI and other editing tools make it easier than ever for users to alter or fabricate realistic-seeming video and audio clips. Ahead of last month’s New Hampshire primary, a fake robocall impersonating President Biden encouraged Democrats not to vote, raising concerns about misinformation and voter suppression going into November’s general election.
McConnell also warned that the Oversight Board is watching how Meta handles content related to election integrity heading into this year’s elections, after the board recommended the company develop a framework for evaluating false and misleading claims about how elections are run in the U.S. and globally.
“Platforms should keep their foot on the gas beyond election day and into the post-election periods where ballots are still being counted, votes are being certified, and power is being transitioned,” McConnell told CBS News. “Challenging an election’s integrity is generally considered protected speech, but in some circumstances, widespread claims attempting to undermine elections, such as what we saw in Brazil [in 2023], can lead to violence.”