Breaking his silence for the first time since reports last weekend that the political data firm Cambridge Analytica improperly obtained information on 50 million Facebook users, Mark Zuckerberg weighed in with a post on Facebook Wednesday afternoon. In a 937-word explanation he blamed himself, but he also blamed Cambridge University researcher Aleksandr Kogan, who mined the data and then lied about destroying it.

I started Facebook, and at the end of the day I’m responsible for what happens on our platform. I’m serious about doing what it takes to protect our community. While this specific issue involving Cambridge Analytica should no longer happen with new apps today, that doesn’t change what happened in the past. We will learn from this experience to secure our platform further and make our community safer for everyone going forward.

With Facebook facing investigations by attorneys general in both Massachusetts and New York, Zuckerberg assured one and all that the problem has already been fixed. In fact, he says the fix came in 2014, the year after Kogan deployed his app, which gathered information on 300,000 people and then improperly pulled in data on their friends.

In 2014, to prevent abusive apps, we announced that we were changing the entire platform to dramatically limit the data apps could access. Most importantly, apps like Kogan’s could no longer ask for data about a person’s friends unless their friends had also authorized the app. We also required developers to get approval from us before they could request any sensitive data from people. These actions would prevent any app like Kogan’s from being able to access so much data today.

The statement goes on to lay out a timeline of events from the company’s founding in 2004 up to the present day and outlines three solutions: closer monitoring of data-mining apps, educating users, and restricting access for app developers.

Kogan’s personality-predictor app harvested enough information to interest Cambridge Analytica, a firm part-owned by GOP donor Robert Mercer and his daughter Rebekah, who also fund Trump super PACs. Among other right-wing entities, they support Breitbart News, which might explain why Steve Bannon once served as vice president of Cambridge Analytica.

Earlier this week, Cambridge Analytica CEO Alexander Nix was caught on hidden camera bragging about the firm’s illicit acts and has been suspended as a result. Writing for the Mercury News, Rex Crum and Levi Sumagaysay sum up the worst month in Facebook’s 14-year history.

The Trump campaign used Cambridge Analytica early on, and Chris Wylie, a former employee of the firm, has said its data and analysis helped shape the politically divisive tone of the campaign.

Among the consequences so far: The Federal Trade Commission is reportedly investigating whether Facebook violated a consent decree that was part of a privacy settlement the company reached with the agency in 2011. Violations could result in fines of $40,000 a day per violation.

At the time, the privacy issue du jour was the FTC’s accusation that Facebook had deceived users by telling them certain information could be kept private, “then repeatedly allowing it to be shared and made public,” as the agency put it when it announced the settlement, which bars the company from misrepresenting the privacy or security of user information.

The Cambridge Analytica mess comes as the company is still dealing with the fallout from its role in helping spread fake news and propaganda by Russian trolls, an idea Zuckerberg scoffed at two years ago—in fact, he called it “crazy.”

Kogan told The Associated Press he had no idea he had broken any laws and was assured by Cambridge Analytica that his actions were perfectly legitimate.

“My view is that I’m being basically used as a scapegoat by both Facebook and Cambridge Analytica,” he said. “Honestly, we thought we were acting perfectly appropriately, we thought we were doing something that was really normal.”

Authorities in Britain and the United States are investigating the alleged improper use of Facebook data by Cambridge Analytica, a U.K.-based political research firm. Since the revelations were first published, Facebook shares have dropped some 9 percent, lopping more than $50 billion off the company’s market value, and the episode has raised broader questions about whether social media sites are violating users’ privacy.

Market value is likely to take a further hit from the raft of lawsuits sure to follow. Bryan Menegus reports in Gizmodo on what could be the start of a trend:

Salting that wound is a class-action lawsuit, filed today in California’s Northern District, on behalf of the social media giant’s shareholders.

Fan Yuan, the shareholder who filed the suit, accuses Facebook of making “materially false and/or misleading” claims about the company’s handling of user data—meaning the instances where Facebook or Zuckerberg himself addressed privacy and security issues and failed to disclose the ongoing Cambridge Analytica fiasco. …

Yuan’s suit, which represents a class of unknown size, alleges that the failure to disclose the ongoing situation with Cambridge Analytica reduced the value of shares he and others hold in the company. He is asking the court to award unspecified damages and any other relief it deems proper.

Zuckerberg followed his statement with further efforts to stop the hemorrhaging by sitting down opposite CNN tech correspondent Laurie Segall on Wednesday night.

“This was a major breach of trust and I’m really sorry this happened,” he said in the interview on CNN. “Our responsibility now is to make sure this doesn’t happen again.”

Zuckerberg added that he would be open to testifying before Congress.

“What we try to do is send the person at Facebook who will have the most knowledge,” Zuckerberg said. “If that’s me, then I am happy to go.”

In an earlier interview with Recode, he admitted that opening up the Facebook network to third-party developers a decade ago was a mistake.

“Frankly, I just think I got that wrong,” he said.

“There was this values tension playing out between the value of data portability—being able to take your data and some social data, the ability to create new experiences—on one hand, and privacy on the other hand,” he explained. “I was maybe too idealistic on the side of data portability, that it would create more good experiences—and it created some—but I think what the clear feedback from our community was that people value privacy a lot more.”

A #DeleteFacebook hashtag is building momentum as people consider deleting their Facebook accounts (though extricating themselves from the ubiquitous social media platform could pose a challenge).

“I don’t think we’ve seen a meaningful number of people act on that, but, you know, it’s not good,” Zuckerberg told The New York Times. “I think it’s a clear signal that this is a major trust issue for people, and I understand that. And whether people delete their app over it or just don’t feel good about using Facebook, that’s a big issue that I think we have a responsibility to rectify.”

Now, we’ll see what steps Facebook takes to protect user data.
