Facebook is arguably the most powerful media channel in the world. Millions of businesses depend on it (and its precious data), but the company has betrayed 2.2 billion users and it will take a lot to earn back their trust.
By now you’ve heard how the personal information of 50 million users was harvested and exploited by Cambridge Analytica, a firm that was later hired by the Trump campaign.
It’s important to note that data breaches are inevitable in tech. Bad actors will occasionally find a way to hack the system and expose private data. If security measures fail, you inform the affected users and do everything in your power to ensure it never happens again.
Facebook didn’t inform anyone. And the worst part is that it wasn’t even technically a “breach.” Take it from Facebook’s former VP of Ads himself.
Cambridge Analytica, a firm that says it uses data to “change audience behavior,” essentially used the API as intended — before Facebook decided to limit how third-party developers could capture user data in this manner.
It wasn’t until earlier this month (when the story resurfaced) that Facebook suspended them from its platform. (Interesting side note: there is no mention of “50 million” in the announcement.)
Unfortunately, reports say that turning a blind eye to data sharing used to be a common occurrence inside Facebook — despite the fact that academic researchers had started warning people about it in the early 2010s.
Mounting concerns over the debacle have already led to some major changes at the social media giant. Alex Stamos, the chief information security officer who fought for greater transparency into the election meddling, will be leaving the company later this year. And I’m sure the timing of that story did not please Mark Zuckerberg, Sheryl Sandberg and the other executives.
At this point, I think it’s safe to assume that Zuck is perspiring more than when Kara Swisher interviewed him at the D8 Conference in 2010.
A week ago, we finally heard from him and Sandberg. And while each of their comments had fragments of authenticity, the response strategy felt overly calculated when Sandberg shared Zuck’s post four minutes later and said the exact same thing in her opening paragraph: “We have a responsibility to protect your data — and if we can’t, then we don’t deserve to serve you.”
We get it. Nicely crafted. But actions speak louder than words. Not to mention that they shouldn’t have waited five days to respond in the first place.
Oh wait. Did I say five days? I meant two years and three months.
If Facebook’s original mantra was to “move fast and break things,” it has effectively accomplished just that.
“The episode fits an established pattern of sloppiness towards privacy, tolerance of inaccuracy and reluctance to admit mistakes.” — The Economist
Congress wants Zuckerberg to testify, and there are many valid reasons why he should accept the invitation. If and when that happens, it will certainly be a step in the right direction. Hopefully the first of many.
I genuinely believe Zuck when he says that he “feels really bad” about the situation. However, it’s time to start exploring far more robust, proactive solutions than anything the company has tried in the past.
At Facebook’s size, it’s going to be extraordinarily difficult to prevent this from happening again. The business model practically encourages it, since the company already knows everything about you. And as noted by The Atlantic, The Outline and others, the problem is by no means limited to Facebook.
In all fairness, we gave up our data in the first place. We just didn’t know how it could be leveraged. But frankly, most people don’t know or care, which means billions will continue using Facebook, Instagram, WhatsApp and Messenger because everyone is addicted, remember?
The problem with trying to scale ethical data usage is twofold: (1) consumer behavior is unlikely to change, and (2) Facebook can’t afford to take an aggressive enough approach without killing its entire business.
“Nobody sets out to destroy the environment; they just want to make synthetic fibers or produce industrial chemicals. The same goes for our giant tech platforms. Facebook never expected to be an engine that destroys America.” — Bloomberg Businessweek
In simple terms, the big data movement escalated more quickly than anyone could’ve imagined. Facebook just owns some of the most valuable data in the world, which makes it a prime candidate for malicious activity.
One of the more intriguing proposals for a solution came from a recent piece by Paul Ford in Bloomberg Businessweek: the creation of a “digital protection agency.” Just as we have the EPA, this federal entity would have full transparency and access to the business practices, data resources and algorithms used by the nation’s most powerful tech companies.
Given the current circumstances, federal regulation seems more than reasonable.
Of course, no matter how many people join the #deletefacebook movement (don’t forget Instagram is still Facebook, btw), the company isn’t going to vanish into thin air. A sharp decline in its stock price over the past week is a blip on the radar compared to its explosive growth since 2012.
However, for the first time in years, the bottom line isn’t the top priority.
Today, Facebook’s only option is to address what’s broken, find a way to restore trust and move faster than it has ever moved before.