Facebook took down 3 billion fake accounts in just six months

Video: The five biggest scandals to rock social media giant Facebook.

Facebook removed more than 3 billion fake accounts from October to March, according to a report released on Thursday.

That eye-popping number is a record, and the tech giant estimates that about 5 percent of its monthly active users are not real. Approximately 2.3 billion people log into Facebook worldwide each month. However, almost all of the fake accounts were pulled down by the firm’s automated systems before users could even see them.

In a separate blog post, the company said it remains “confident” that most of the people and activity on Facebook are genuine.

The report gives details about how Facebook took action against a wide range of prohibited content.

AMAZON IS PREPARING A PORTABLE DEVICE THAT ‘READS HUMAN EMOTIONS,’ REPORT SAYS

“For hate speech, we now detect 65 percent of the content that we remove, up from 24 percent just over a year ago when we first shared our efforts. In the first quarter of 2019, we took down 4 million hate speech posts, and we continue to invest in technology to expand our ability to detect this content across different languages and regions,” Guy Rosen, Facebook’s VP of Integrity, said in a blog post.

The company, led by CEO and chairman of the board Mark Zuckerberg, said its third Community Standards Enforcement Report contains information on nine policy areas: adult nudity and sexual activity, bullying and harassment, child nudity and sexual exploitation of children, fake accounts, hate speech, regulated goods, spam, global terrorist propaganda, and violence and graphic content.

Facebook founder and CEO Mark Zuckerberg speaks to the participants during the Viva Technology show in the Parc des Expositions Porte de Versailles, on May 24, 2018, in Paris, France.
(Getty Images)

In six of the policy areas included in the report, Facebook says it proactively detected more than 95 percent of the content it took action on before anyone reported it.

On Thursday, Facebook’s Data Transparency Advisory Group, a panel of independent experts convened last year, released its review of how Facebook enforces and reports on its community standards. The group found that the company’s system for enforcing its community standards and its review process are generally well designed.

FACEBOOK BACKS AWAY FROM THE HARD SELL ON POLITICAL ADS

But the group made 15 separate recommendations to Facebook, including requests for more data demonstrating how the network enforces its rules and for clearer explanations of its current policies. The group also said Facebook should make it easier for users to stay current on policy changes and give them a “greater voice” in what content is allowed on the site.

The company said it has already planned to implement a number of the recommendations in future reports and that, for others, it is looking at how best to put the group’s suggestions into practice. “For a few, we simply do not think the recommendations are feasible, given how we review content against our policies, but we are looking at how we can address the group’s underlying call for more transparency while making the right calls,” Radha Iyengar Plumb, head of product policy research for Facebook, said in a blog post.

“We would like to thank the Data Transparency Advisory Group for their time, their rigorous review and their pertinent recommendations, which will help inform our efforts as we enforce our standards and bring more transparency to this work,” the company said in a blog post.
