Facebook executives are grappling with the company’s ‘responsibility’

File photo: In this June 7, 2013, file photo, the Facebook “like” symbol is illuminated on a sign outside the company headquarters in Menlo Park, California. (AP Photo/Marcio Jose Sanchez)

(Copyright 2018 The Associated Press. All rights reserved.)

Facebook COO Sheryl Sandberg and CTO Mike Schroepfer took the stage at the Code Conference in Rancho Palos Verdes last night to discuss the changes the company has made in the wake of the Cambridge Analytica scandal.

Interviewed by conference moderators Kara Swisher and Peter Kafka, Sandberg said the company now understands it was too slow in responding to the privacy concerns raised by Cambridge Analytica. “We certainly know that we were too late. We said we are sorry, but sorry is not the point,” she said.

Instead, she said, the important thing is for the company to think about its responsibility in a different way. For the past 10 to 12 years, Facebook focused on building and enabling social experiences, but not enough on the bad that could be done on the platform. “Now we have an understanding of the responsibility that we have, and we’re trying to act,” she said.

There is a “fundamental tension” between tools that allow for easy, free expression and safety, Schroepfer added. Facebook wants to facilitate discussion, but also to make sure the platform does not host hate speech or messages designed to manipulate elections.


The Cambridge Analytica Problem

The Cambridge Analytica problem dates back at least 10 years, to when people were talking about wanting to “take their data with them,” so Facebook developed APIs to help them do so. In those days, Schroepfer said, Facebook was optimistic and focused on how entrepreneurs could use the data to build new applications. It also assumed that the people who used those apps understood what was happening.

In 2014, Facebook decided to restrict access to that data and began a more proactive review of applications. In December 2015, the company learned through media reports that Cambridge Analytica had obtained Facebook data and resold it. Why did Facebook learn about this from the press? Once the data left Facebook, the company could no longer observe it, Schroepfer said.

Facebook immediately disabled the app that scraped the data and tried to figure out where the data had gone. When it zeroed in on Cambridge Analytica, that company insisted it had deleted the data, but that turned out not to be the case, Schroepfer acknowledged.

Now the company is more focused on the theoretical ways people could get at the data, he said, and has invested in security, content moderation and development.

Looking back, “we want more controls,” Sandberg said. Despite Cambridge Analytica’s legal assurances that it had deleted the data, “we should have audited them,” she noted. In recent months the company has moved to do that, but the audit has been postponed in deference to a UK government investigation, which takes priority.

In the run-up to the 2016 elections, people were especially concerned about spam and phishing emails, Sandberg noted. The company took measures to prevent such problems, but it did not see these different, “more insidious threats” coming. “Now we understand, and Facebook has taken very aggressive steps,” she said.

Sandberg pointed to the removal of fake accounts and cooperation with governments to help prevent similar problems around other elections, citing work in Alabama, Germany and France. “We are showing that we are taking steps to make it better,” she said.

Sandberg also said that while Facebook has “always” offered ways to control how users share data with applications, those controls are now at the top of the News Feed instead of being hard to find. The company is also building new tools on top of these controls.

Didn’t See It Coming

Swisher asked what was wrong with a culture that did not understand the potential for misuse, pointing to Facebook Live missteps. Sandberg pushed back, saying “Live is a great example” of how the company fixes things. When Live launched, there was “a lot of good, but there were things that went wrong.” So now the company has humans review all live video within a few minutes. As a result, some posts have been taken down immediately, and in some cases the company has intervened and helped people.

Facebook runs an open platform, she said, and knows it will never prevent all bad things. But the company could be more transparent and put more resources into creating a safe community. It has removed 1.3 billion fake accounts; published the internal guidelines it uses to evaluate whether content should be removed; and successfully removes 99 percent of terrorist content, 96 percent of adult nudity and sexual content, but only 38 percent of hate speech before users report it to the company.

“We won’t get it all,” Sandberg admitted, but Schroepfer said Facebook has made more progress on this than he thought it would be able to.

Fake News

On the problem of fake news, Sandberg said much of it comes from fake accounts; taking those down reduces the problem. Another big source is economically motivated, so the company is moving to kick bad actors out of its ad networks. She also said the company is working on more transparency, so people can see who is behind any political or issue posts, which helps them spot more things that are wrong and report them.


Asked about regulation, Sandberg said that the company is already regulated with things like the GDPR. “The question is not whether there is more regulation, but what kind of regulation,” she claimed.

Facebook spends a lot of money on a lot of complex systems to handle GDPR, she said, and acknowledged that regulation can entrench large companies. She also worried about unintended consequences, noting that caller ID was originally regarded as a violation of privacy, and regulation was introduced to prevent it.

Asked whether Facebook is a monopoly and should be broken up, Schroepfer said there is competition in the market, citing YouTube for video sharing, Twitter for public posts, and Snapchat, WeChat and iMessage for messaging. “Consumers use the products that they want,” he said, noting that Facebook is “a very small part” of the total advertising market.

Apple vs. Facebook

Asked about Apple CEO Tim Cook’s criticism of the company, Sandberg said: “We strongly disagree with their characterization of our products and business model,” noting that as a free service, Facebook is available to people all over the world.

“We have looked at subscriptions and will continue to do so,” Sandberg said, but the core product will remain a free service.

Hearing about the terrible things that happen on the platform has refocused the company’s priorities, Schroepfer said. “It’s not fun, but it is really important work.” He also said that the focus on safety and security is the “biggest cultural shift” he has seen at the company.

Facebook is focused on ensuring safety, security and integrity on the platform, but “we understand that it will be an arms race,” and there will be risks it does not see, Sandberg said. Facebook is making enormous investments that weigh on its bottom line, “but it is worth the effort.”


This article originally appeared on
