Facebook claims no one reported New Zealand shooter's livestream during the broadcast
The social media giant says it did not block the shooter's livestream of terror because no one reported the content until 12 minutes after the broadcast ended; Jonathan Hunt reports from Los Angeles.
A French Muslim group is filing a lawsuit against Facebook and YouTube over the viral spread of video of the New Zealand mosque attack on March 14.
The French Council of the Muslim Faith, a body that represents millions of Muslims in France, said it is suing the tech companies for "broadcasting a message with violent content" amounting to complicity in terrorism, or of a nature likely to seriously violate human dignity and liable to be seen by a minor.
In France, such violations are punishable by three years' imprisonment and a fine of 75,000 euros, The Guardian reported.
Both companies are facing a backlash over their response to the terrorist attack that left 50 dead and 50 injured at two mosques in Christchurch.
A link to a copy of the video was posted on the message board 8chan, and users were able to make small adjustments and edits to the footage that helped it evade the tech platforms' AI detection systems.
Google-owned YouTube has said its artificial intelligence software did not work as well as it had hoped. Facebook, which said it removed 1.5 million copies of the video in the first 24 hours after the attack (1.2 million of them before they were seen by users), also struggled to stop the spread of the New Zealand footage.
According to Facebook, it did not receive a user report about the livestream until 29 minutes after the broadcast began.
Last week, New Zealand's three main telecom companies released a scathing letter to the CEOs of Twitter, Google and Facebook, calling on them to take more proactive steps to remove this type of content.