Social media giants Facebook, Twitter, and YouTube have taken action against President Donald Trump, removing a video from their platforms as riots rage at the US Capitol.
Protesters attended a 'March to Save America' rally in Washington before Congress was set to meet to certify the election results. Trump told supporters, who included members of far-right groups, "we will not take it anymore".
Three hours later, the President tweeted a video in which he repeated false claims about the election, and the social media companies said the post contributed to ongoing violence and violated their misinformation policies.
Twitter went one step further, locking Trump's account for at least 12 hours and removing three of his tweets. Facebook removed the video message and later locked his account for "two policy violations", the company said.
Trump told his supporters in the now-removed footage: "I know your pain. I know you're hurt. We had an election that was stolen from us."
He continued, telling protesters they "have to go home now" and "we have to have peace".
"We have to have law and order. We have to respect our great people in law and order. We don't want anybody hurt," he said, before returning to a message of defiance.
"There's never been a time like this where such a thing happened where they could take it away from all of us, from me, from you, from our country."
Guy Rosen, who oversees safety and integrity at Facebook, said the social media site removed Trump's video as part of an "emergency situation".
"We are taking appropriate emergency measures, including removing President Trump's video," he tweeted.
"We removed it because on balance we believe it contributes to rather than diminishes the risk of ongoing violence."
Twitter initially flagged the video and blocked users from replying to, retweeting, or liking it, before removing the post altogether. The social media giant then said it was locking the President's account for 12 hours and warned that any further violations of its rules "will result in permanent suspension" of his account.
"Our public interest policy - which has guided our enforcement action in this area for years - ends where we believe the risk of harm is higher and/or more severe," the outlet tweeted.
"We'll continue to evaluate the situation in real-time, including examining activity on the ground and statements made off Twitter. We will keep the public informed, including if further escalation in our enforcement approach is necessary."
YouTube said in a statement that Trump's video violated its "policies regarding content that alleges widespread fraud or errors changed the outcome", according to NBC.
Even as the platforms moved to remove the video, several prominent voices in the tech industry were quick to attack the companies, arguing they had not gone far enough.
Former Facebook chief security officer Alex Stamos said there had been good arguments for private companies not to silence elected officials, but that those arguments "are predicated on the protection of constitutional governance".
"Twitter and Facebook have to cut him off. There are no legitimate equities left and labeling won't do it," he tweeted.
Twitter attached a label to Trump's tweet containing the video that read: "This claim of fraud is disputed, and this tweet can't be replied to, retweeted, or liked, due to a risk of violence."
Despite this action, early Twitter investor Chris Sacca launched a scathing attack on Twitter CEO Jack Dorsey and Facebook CEO Mark Zuckerberg.
"You've got blood on your hands, Jack and Zuck. For four years you've rationalized this terror. Inciting violent treason is not a free speech exercise. If you work at those companies, it's on you too. Shut it down," Sacca tweeted.
According to CBS, Trump's video has remained live on the Trump campaign's Parler page, a far-right social media site that is focused on "protecting users' rights" and free speech.