Facebook removes deepfake video of Ukrainian president

President Volodymyr Zelenskyy
Photo by Chris McGrath via Getty Images

Facebook has removed a viral deepfake video of Ukraine's president, Volodymyr Zelenskyy, asking the people of his country to "lay down arms."

Nathaniel Gleicher, head of security policy at Meta (Facebook's parent company), released a statement in a Twitter thread, saying that the fake video violates Facebook's policy against "manipulated media."

“We’ve quickly reviewed and removed this video for violating our policy against misleading manipulated media, and notified our peers at other platforms.”

President Zelenskyy also released a video on Instagram, confirming that he is still in Ukraine fighting for the country, and saying that Russian soldiers should be the ones to lay down their weapons and return home.

The Ukrainian Center for Strategic Communications and Information Security had released a statement back on March 3, alerting citizens that deepfake videos of President Zelenskyy could appear on social media. It warned that such videos aim to sow panic and convince people to retreat.

“Imagine seeing Volodymyr Zelenskyy on TV making a surrender statement. You see it, you hear it – so it’s true. But this is not the truth. This is deepfake technology. It is not a real video, but created through machine learning algorithms. Videos made through such technologies are almost impossible to distinguish from the real ones.”

The deepfake video was first broadcast on TV24, a Ukrainian TV station that was reportedly hacked. The altered video shows the Ukrainian president at a podium, surrendering to the Russian army while telling the country's troops to lay down their weapons and flee with their families.

After the video circulated on social media, many viewers instantly recognized it as fake due to multiple inconsistencies, such as President Zelenskyy's pixelated face compared to the rest of his body and his uncharacteristically deep voice.
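The face/body mismatch viewers noticed can be illustrated with a crude sharpness comparison: if a synthesized face has been pasted onto real footage, the face region is often blurrier (lower high-frequency detail) than the surrounding frame. The sketch below, a toy illustration and not a real deepfake detector, compares the variance of a Laplacian filter between a face crop and a body crop; all function names, thresholds, and the synthetic patches are hypothetical.

```python
# Toy illustration of one deepfake cue: a face region that is noticeably
# blurrier than the rest of the frame. A simple proxy for local sharpness
# is the variance of a 4-neighbour Laplacian over a grayscale patch.
# All names, thresholds, and data here are hypothetical.

def laplacian_variance(image):
    """Variance of a 4-neighbour Laplacian over a 2D grayscale patch (list of lists)."""
    h, w = len(image), len(image[0])
    responses = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (image[y - 1][x] + image[y + 1][x] +
                   image[y][x - 1] + image[y][x + 1] - 4 * image[y][x])
            responses.append(lap)
    mean = sum(responses) / len(responses)
    return sum((r - mean) ** 2 for r in responses) / len(responses)

def sharpness_mismatch(face_region, body_region, ratio_threshold=0.5):
    """Flag a frame if the face crop is much blurrier than the body crop."""
    return laplacian_variance(face_region) < ratio_threshold * laplacian_variance(body_region)

# Synthetic example: a flat (detail-free) face patch vs. a textured body patch.
blurry_face = [[100] * 8 for _ in range(8)]                         # no detail at all
textured_body = [[100 + 50 * ((x + y) % 2) for x in range(8)] for y in range(8)]

print(sharpness_mismatch(blurry_face, textured_body))  # → True
```

A real detector would of course need face localization and far more robust statistics; this only shows why a resolution mismatch between regions of the same frame is a red flag.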

In 2020, months before the U.S. presidential election, Meta announced that deepfake videos would be banned on the platform to stop the spread of misinformation. According to Facebook, however, the policy does not extend to deepfakes used in parody or satire. In June 2020, Meta also launched the Deepfake Detection Challenge Dataset to help accelerate the detection of deepfake videos.