Zuckerberg faced US lawmakers in a marathon Q&A session.
On Tuesday (10 April), Facebook CEO Mark Zuckerberg spoke at a joint hearing of the US Senate Judiciary and Commerce committees in Washington DC.
For more than four hours, he was questioned about Facebook’s treatment of, and attitude towards, user data in the wake of the Cambridge Analytica privacy scandal.
He fielded a barrage of questions from US politicians, some of which revealed that certain senators were not fully aware of how Facebook’s business model operates at a basic level.
On user data
Zuckerberg tackled the misconception many hold that Facebook sells data to advertisers, saying: “What we allow is for advertisers to tell us who they want to reach, and then we do the placement … That’s a very fundamental part of how our model works and something that is often misunderstood.”
While Facebook does not sell user data to advertisers, people who get their hands on reams of data could theoretically do so. The CEO admitted that Aleksandr Kogan sold the Facebook data to other firms besides Cambridge Analytica, and said the social network is investigating “every single app that had access to a large amount of information in the past”. He added: “If we find that someone improperly used data, we’re going to ban them from Facebook and tell everyone affected.”
He added that user control is the most important principle for Facebook: “Every piece of content that you share on Facebook, you own, and you have complete control over who sees it and how you share it, and you can remove it at any time.”
While tools for data control and removal may be available, the argument remains that many of the platform’s users find them difficult to locate or understand. There are simple ways to control which of your friends can see your content, but finding the settings to fine-tune what apps can access is a different story entirely.
“Going forward, we’re going to take a more proactive position on this, and do much more spot checks and other reviews of apps as well as increasing the amount of audits that we do,” said Zuckerberg, but many privacy advocates feel this is coming too late.
While numerous politicians have been vocal in their desire to see Facebook brought under government regulation, Zuckerberg stopped short of making any firm commitment to support such regulation in future.
When asked about what regulations he deemed necessary, he responded: “I’ll have my team follow up with you so that way we can have this discussion across the different categories where I think this discussion needs to happen.”
He added: “I think the real question, as the internet becomes more important in people’s lives, is what is the right regulation, not whether there should be or not.”
Senate commerce committee chair John Thune said: “I’m not convinced that Facebook’s users have the information they need to make meaningful choices.”
The thorny subject of Russia
Zuckerberg described the company’s slowness in identifying Russian information operations as one of his “greatest regrets”. He added that Facebook was in something of an arms race: “As long as there are people sitting in Russia whose job it is to try to interfere in elections around the world, this is going to be an ongoing conflict.”
Tackling hate speech
During the hours-long hearing, Zuckerberg also addressed the role Facebook is playing in the Myanmar conflict and the resulting displacement of Rohingya Muslims, dubbing it a “terrible tragedy” and vowing to do more.
Dispelling the eavesdropping myth
Senator Gary Peters asked Zuckerberg whether Facebook mines audio from mobile devices, noting that the persistence of this belief among so many people shows how distrustful some users are of the company. Zuckerberg responded with a simple ‘no’.
Leaning heavily on AI
When asked about improvements to Facebook’s moderation tools, Zuckerberg pointed to the promise of AI in helping the social network quickly tackle hate speech and other unsavoury posts. While this may be true in some cases, critics would say this response is simply a way of deferring responsibility for future issues that may arise on the platform.