Twitter Inc and Facebook Inc on Wednesday temporarily locked the accounts of U.S. President Donald Trump, as tech giants scrambled to crack down on his baseless claims about the U.S. presidential election amid riots in the capital.
Twitter hid and required the removal of three of Trump’s tweets “as a result of the unprecedented and ongoing violent situation in Washington, D.C.,” after pro-Trump protesters stormed the U.S. Capitol in an attempt to force Congress to block the certification of President-elect Joe Biden’s victory.
One woman was shot and killed inside the Capitol building in the chaos.
Facebook later said it would block Trump’s page from posting for 24 hours, citing two policy violations.
Twitter locked Trump’s account for 12 hours and said that if the tweets were not removed, the account would remain locked, meaning the president would be unable to tweet from @realDonaldTrump.
Facebook and YouTube, which is owned by Alphabet’s Google, also removed a video in which Trump continued to allege the presidential election was fraudulent even as he urged protesters to go home.
The video was also removed from Instagram, where the president’s account would likewise be locked for 24 hours, Adam Mosseri, head of the Facebook-owned platform, said in a tweet.
YouTube did not take any further immediate action against his account.
Tech companies have been under pressure to police misinformation on their platforms around the U.S. election, including through calls by users on Wednesday for major platforms to suspend Trump’s accounts.
The president and his allies have repeatedly spread unsubstantiated claims of election fraud that have proliferated online. In a tweet that Twitter later took down, Trump on Wednesday blamed Vice President Mike Pence for lacking the “courage” to pursue those claims.
A White House spokesman did not immediately respond to a request for comment.
RISK OF VIOLENCE
Facebook’s vice president of integrity, Guy Rosen, tweeted that the social media company believed the president’s video “contributes to rather than diminishes the risk of ongoing violence,” saying the action was part of “appropriate emergency measures.”
YouTube said Trump’s video violated its policy against content that alleges “widespread fraud or errors changed the outcome of the 2020 U.S. Election.”
Both Facebook and Twitter had originally added labels and measures to slow the video’s spread.
Dozens of Facebook staffers called for executives to clarify how they were handling Trump’s posts, with some calling for his account to be taken down for inciting the violence at the Capitol, according to internal posts seen by Reuters.
“Can we get some courage and actual action from leadership in response to this behavior? Your silence is disappointing at the least and criminal at worst,” one employee wrote.
Internal communications managers quickly closed comments on the threads, saying in identical posts that updates would be provided but “the priority right now is actively dealing with the ongoing situation.”
Facebook did not immediately respond to a request for comment on the internal posts.
Former Facebook security chief Alex Stamos tweeted: “Twitter and Facebook have to cut him off.”
Civil rights groups including the Anti-Defamation League and Color of Change called for social media companies to suspend Trump’s accounts permanently.
According to researchers and public postings, violent rhetoric and advice on weaponry ramped up significantly in the past three weeks on many social media platforms as multiple groups planned rallies for Wednesday, including Trump supporters, white nationalists and enthusiasts of the wide-ranging conspiracy theory QAnon.