
Facebook’s terrible 2018

A year to forget for the social-media giant.

2017 wasn't great for Facebook. The company spent most of the year explaining how it had exposed 126 million Americans to thousands of Russian-linked pages, which were part of the Kremlin's efforts to meddle in the 2016 US presidential election. As a result, Facebook's reputation took a major hit, and the scandal put the social-media giant under the US government's microscope. Talk of tougher federal regulation suddenly grew in Congress, which didn't bode well for Facebook. If the company thought 2017 was bad, well, the following year turned out to be a nightmare.

During the first few months of 2018, a huge data-privacy breach affected millions of Facebook users. In March, it was revealed that Cambridge Analytica (CA) had harvested users' personal data without their consent, using a Facebook quiz app called thisisyourdigitallife.

Developers collecting private information, such as email addresses or birthdates, is not uncommon, but CA was using that data without users' knowledge for political research. In April, Facebook announced that CA may actually have had data on up to 87 million users, far more than the number originally reported.

To make matters worse, Facebook allegedly knew what Cambridge Analytica had done in the run-up to the US elections but chose not to disclose it. It wasn't until a joint story by The New York Times, The Guardian and The Observer broke that the company came forward and shared more information about the incident. Facebook said calling CA's actions a breach was "completely false," since the affected users had chosen to sign up for thisisyourdigitallife, but that doesn't make it any better. It also doesn't explain why Facebook waited two years to disclose that any of this had taken place.

Not surprisingly, the Cambridge Analytica don't-call-it-a-breach comment raised even more alarms in Washington, DC, as well as with other governments around the world. A month after we found out about CA, Facebook co-founder and CEO Mark Zuckerberg testified before Congress, appearing in front of the Senate Judiciary Committee, the Senate Committee on Commerce, Science and Transportation, and the House Energy and Commerce Committee.

For the most part, Zuckerberg's congressional hearings were a wasted opportunity: We didn't learn any new, important information about Cambridge Analytica or Russian interference on Facebook. Instead, some US senators and representatives spent most of their time accusing Facebook of political bias against conservatives rather than focusing on the matters at hand. But, if anything, Zuckerberg being grilled on Capitol Hill at least signaled that the government now has Facebook under close watch. "Unless there are specific rules from an outside agency," Senator Richard Blumenthal (D-CT) told Zuckerberg, "I don't see how you can change your business model to maximize profit over privacy."

Over the course of the year, the scrutiny only got worse for Facebook. The company was accused of helping incite genocide in Myanmar, and the examples of how bad actors can abuse its platforms just kept piling up. There's no better example of this than the spread of misinformation on Facebook led by right-wing conspiracy theorists and propagandists like Alex Jones.

In July, Facebook came under fire for refusing to ban Jones from the site, despite his history of promoting fake news through his own page and that of his InfoWars publication. Eventually, not long after first suspending his account for 30 days, Facebook did shut down four of Jones' pages, citing repeated violations of its hate-speech policies. The company said that more of his content had been reported and, upon further review, it took the pages down for "using dehumanizing language to describe people who are transgender, Muslims and immigrants."

Perhaps it took Facebook longer than it should have to ban Jones, but the company made the right move in doing so. Even so, its approach to fighting fake news seems half-hearted. It wants to stop misinformation from spreading, yet it doesn't want to be the "arbiter of truth." Rather than remove fake news stories from the site altogether, Facebook's plan is to reduce their reach and flag them so that users are aware when an article isn't based on facts. But truth shouldn't be subjective, and Facebook would likely be better off taking a tougher stance on posts that spread misinformation.

Just when you thought Facebook might catch a break, it disclosed a security issue that exposed the private data of 29 million users, 14 million of whom had very detailed information about their lives stolen. The data breach, which was revealed in October, was caused by a bug in Facebook's site that allowed hackers to access people's names and contact details, as well as their username, gender, birthdate, location, language, relationship status, religion, hometown, current city, education and work. Additionally, the breach gave access to the places users had checked in at or been tagged in, the websites, people and pages they followed, their most recent searches on Facebook and the type of device they used to log in. Yikes.

"We need more than consumer outrage to put checks and balances on Facebook and every other company that profits from our data."

Then, just last week, Facebook revealed yet another bug, one that let third-party apps access the unposted photos of nearly seven million users. Yowza. What's most concerning about the latest incident is that Facebook knew about it in September and is only now letting users know. If the company hopes to repair its image, taking three months to disclose a security issue isn't the best way to accomplish that. "Facebook's reputation is in a death spiral, but I'm not convinced it even matters," said Will Potter, a lecturer and fake news expert at the University of Michigan. "Facebook's power doesn't come from trust, but from its ubiquity. My real fear is that we're all growing numb to these violations of our privacy, and we're also bound to the big tech companies who are violating us."

Potter said that society has become too invested socially, professionally and financially in tech companies like Facebook -- and that might be the biggest problem. "We need more than consumer outrage to put checks and balances on Facebook and every other company that profits from our data," he said. "We have to keep in mind that most people either don't know about what Facebook has done, or they don't understand the significance. The people who are outraged are those paying attention to privacy issues, civil liberties, surveillance... and that's not a demographic Facebook would even attempt to win over."

Would government oversight fix Facebook's troubles? That's hard to say. It's too early to tell whether laws like the EU's General Data Protection Regulation (GDPR), which gives users more control of their data and demands that companies disclose data breaches within 72 hours, will keep Facebook under control. Here in the US, though, there are experts who believe regulation won't solve anything. "Regulation of Facebook by the government is a terrible idea," said Paul Levinson, a professor of communications and media studies at Fordham University. "[It's] ipso facto unconstitutional, a clear violation of the First Amendment and its insistence that 'Congress shall make no law ... abridging the freedom of speech or of the press.'"

That doesn't mean Congress won't try, however. "I suppose a bill attempting some sort of regulation of Facebook might be proposed and get passed in the House, but it won't get through the Senate," said Levinson. "And, if it does, the Supreme Court will strike it down, just as it did the Communications Decency Act in the late 1990s." For those who don't know, the Communications Decency Act of 1996 (CDA) was an attempt by Congress to regulate indecent and pornographic content on the internet; in 1997, the Supreme Court struck down its censorship provisions as unconstitutional.

Whatever happens, being on the minds of governments all over the world isn't a position Facebook wants to be in. Sure, the company is still raking in billions of dollars in revenue every quarter, but its growth is starting to slow. Maybe that's a sign that Facebook's endless mistakes are, slowly but surely, catching up with it.

Images: Facebook ("L" watermark by Koren Shadmi)