The board did, however, recommend that Meta revise its “incoherent” manipulated media policy.
The Oversight Board is pressing Meta to revise its manipulated media policy, calling the current rules “incoherent.” The rebuke is part of a closely watched decision about a misleadingly edited video of President Joe Biden.
The board ultimately sided with Meta’s decision not to remove the clip at the center of the case. The video incorporated news footage of the president accompanying his granddaughter as she cast her first in-person vote in October 2022; after she voted, he placed an “I voted” sticker on her shirt. A Facebook user later posted an edited version that looped the moment so it appeared he repeatedly touched her chest. The caption called him a “sick pedophile” and described those who voted for him as “mentally unwell.”
In its decision, the Oversight Board found that the video did not violate Meta’s narrowly drawn manipulated media policy, because the footage was not altered with artificial intelligence tools and the edits were “obvious and therefore unlikely to mislead” most users. “However, the Board is concerned about the Manipulated media policy in its current form, finding it to be incoherent, lacking in persuasive justification, and inappropriately focused on how content has been created rather than on which specific harms it aims to prevent (for example, to electoral processes),” the board wrote. Given the number of elections taking place in 2024, it said, Meta should “reconsider this policy as soon as possible.”
As they stand, the company’s rules apply only to videos that have been altered with AI; they don’t cover other kinds of editing that can be misleading. In its policy recommendations to Meta, the Oversight Board urged the company to adopt new rules covering both audio and video content. The prohibition should apply not only to speech that conveys false information but also to “content showing people doing things they did not do,” and, the board said, it should hold “regardless of the method of creation.” The board further recommended that Meta stop removing posts containing manipulated media when the content doesn’t violate any other rules, and instead “apply a label indicating the content is significantly altered and may mislead.”
The recommendations underscore growing concern among researchers and civil society groups that a boom in AI tools could enable a new wave of viral election misinformation. In a statement, a Meta spokesperson said the company is “reviewing the Oversight Board’s guidance and will respond publicly” within 60 days. While that response would come well ahead of the 2024 presidential election, it’s unclear when, or whether, any policy changes will actually take effect. According to the Oversight Board’s decision, Meta representatives have said the company “plans to update the Manipulated Media policy to respond to the evolution of new and increasingly realistic AI.”