Twitter, Facebook, and Instagram have all removed a Trump campaign video from their platforms after receiving copyright complaints, Reuters reported. The nearly four-minute video featured images of George Floyd of Minneapolis, who died May 25th after a police officer kneeled on his neck for more than eight minutes. Video of the incident has prompted nationwide protests against police violence.
Twitter disabled the video, while Facebook and Instagram removed posts containing the video. When President Trump objected to the removal in a tweet, calling it “illegal,” Twitter CEO Jack Dorsey responded: “Not true and not illegal. This was pulled because we got a DMCA complaint from copyright holder.”
A spokesperson for Facebook, which owns Instagram, told Reuters it also had received a copyright complaint under the Digital Millennium Copyright Act. “Organizations that use original art shared on Instagram are expected to have the right to do so,” the spokesperson said. YouTube did not remove a version of the video from its platform, saying the version it hosted did not contain the allegedly infringing content. As of Saturday morning, the YouTube version of the video had nearly half a million views.
It wasn’t clear who filed the copyright complaint about the video, titled “Healing Not Hatred,” which includes images of demonstrations protesting Floyd’s death and a voiceover of a President Trump speech where he says the “death of George Floyd was a grave tragedy.”
Last month, Twitter applied labels to two of President Trump’s tweets: one for “glorifying violence” after it used the phrase “when the looting starts, the shooting starts,” and another for being “potentially misleading” about mail-in voting. Trump later issued an executive order governing how websites can moderate content.