WASHINGTON (Reuters) – YouTube removed more than 58 million videos and 224 million comments during the third quarter for violating its policies, Alphabet Inc’s (GOOGL.O) Google said on Thursday, in an effort to demonstrate progress in suppressing problem content.
FILE PHOTO: Silhouettes of users are seen next to a screen projection of the YouTube logo in this illustration picture taken March 28, 2018. REUTERS/Dado Ruvic/Illustration/File Photo
Officials and interest groups in the United States, Europe and Asia are pressuring YouTube, Facebook Inc (FB.O) and other social media services to quickly identify and remove extremist and hateful content, which critics say can incite violence.
The European Union has proposed that online services face steep fines unless they remove extremist material within one hour of a government order to do so.
An official at India’s Ministry of Home Affairs, speaking on condition of anonymity on Thursday, said social media companies had agreed to address authorities’ requests to remove objectionable content within 36 hours.
This year, YouTube began issuing quarterly reports about its enforcement efforts.
As with past quarters, most of the removed content was spam, YouTube said.
Automated detection tools help YouTube quickly identify spam, extremist content and nudity. During September, 90 percent of the nearly 10,400 videos removed for violent extremism, and of the 279,600 videos removed for child safety issues, received fewer than 10 views, according to YouTube.
But YouTube faces a greater challenge with material promoting hateful rhetoric and dangerous behavior.
Automated detection technologies for those policies are relatively new and less effective, so YouTube relies on users to report potentially problematic videos or comments. That means the content may be viewed widely before it is removed.
Google added thousands of moderators this year, expanding to more than 10,000, in hopes of reviewing user reports faster. YouTube declined to comment on growth plans for 2019.
It has described pre-screening every video as unfeasible.
The third-quarter removals data for the first time included the number of accounts Google shut down, either for racking up three policy violations in 90 days or for committing what the company deemed an egregious violation, such as uploading child pornography.
YouTube removed approximately 1.67 million channels and all of the 50.2 million videos that were available on them.
Almost 80 percent of the channels were shut down over spam uploads, YouTube said. About 13 percent involved nudity, and 4.5 percent child safety.
YouTube said users post billions of comments each quarter. It declined to disclose the total number of accounts that upload videos, but said removals were a small fraction of the total.
In addition, about 7.8 million individual videos were removed for policy violations, in line with the previous quarter.
Reporting by Paresh Dave; Additional reporting by Sankalp Phartiyal in Mumbai; Editing by David Gregorio