WASHINGTON (Reuters) – After the mass shooting in New Zealand was live-streamed on social media, the chairman of the U.S. House Committee on Homeland Security wrote a letter to the top executives of four large technology companies urging them to do a better job of removing violent political content.
FILE PHOTO: Homeland Security Committee Chairman Bennie Thompson (D-MS) listens to testimony from Department of Homeland Security Secretary Kirstjen Nielsen during a House Homeland Security Committee hearing on “The Way Forward on Border Security” on Capitol Hill in Washington, U.S., March 6, 2019. REUTERS/Joshua Roberts
In a letter dated Monday and released on Tuesday, Representative Bennie Thompson urged the chief executives of Facebook, Alphabet’s Google, which owns YouTube, Twitter, and Microsoft to quickly remove content that incites political extremism.
The letter follows the fatal shooting of 50 worshippers at two mosques in Christchurch last week. The shooter, a suspected white supremacist, live-streamed the murders on social media, where the footage was widely shared.
“Your companies must prioritize responding to these toxic and violent ideologies with resources and attention,” Thompson wrote. “If you are unwilling to do so, Congress must consider policies to ensure that terrorist content is not distributed on your platforms, including by studying the examples being set by other countries.”
“This video was widely available on your platforms well after the attack, despite calls from New Zealand authorities to take these videos down,” he wrote.
Facebook said it removed 1.5 million videos of the attack in the first 24 hours after it occurred.
Thompson also asked the companies for a briefing on the issue.
A Facebook spokesperson said the company “will brief the committee soon.” Google, Twitter and Microsoft did not immediately respond to requests for comment.
Senator Ron Wyden, an Oregon Democrat who has been critical of Facebook for privacy lapses, said Tuesday that the government should be cautious in reining in tech companies for fear of aiding dictators and other bad actors.
Wyden warned against rolling back the protections granted by Section 230 of the Communications Decency Act, which shields tech companies from liability for what users say on their platforms.
“If politicians want to restrict the First Amendment or eliminate the tools much of the world uses to communicate in real time, they need to understand that they would also be taking away tools that bear witness to government brutality, war crimes, corporate lawlessness and incidents of racial prejudice,” Wyden said in a statement.
The Electronic Frontier Foundation (EFF), a non-profit organization that advocates for civil liberties in the digital world, warned policymakers last week not to rush to regulate speech on online platforms or elsewhere, lest they “disproportionately silence” the most vulnerable users, such as Egyptian journalist Wael Abbas, who was kicked off YouTube for posting videos of police brutality.
EFF also called for guidelines urging social platforms to be transparent about how many posts and accounts they remove, and to give users notice and a chance to appeal when one of their posts is taken down.
Reporting by Diane Bartz; Additional reporting by David Shepardson and Sarah Lynch; editing by Bill Berkrot