March 6, 2018

When Facebook says that it’s a tech company and not a media company, a silent shudder echoes across the internet.

It’s like the reaction that, say, Morton Salt would get if the C-suite were to declare that it’s running a logistics company, not a salt company. The audience might nod and applaud (as they often do whenever Facebook makes a public statement), but under their breath, they’d be laughing. How can we not be a salt company? That’s literally part of our name.

The same irony applies to Facebook, a company synonymous with social media. Key word: media. But this didn’t stop Facebook COO Sheryl Sandberg from putting her foot in her mouth last fall when she went on record with Axios, confidently declaring that Facebook is not a media company. “At our heart we’re a tech company,” Sandberg said. “We don't hire journalists.”

Just because most of your labor force is engineers and computer scientists doesn’t mean you’re not a media company. Apparently, being a platform for news and information that reaches billions of users every day, selling ads, and paying companies to produce original news and entertainment content for your network is not enough to justify relabeling your company for another industry vertical — especially “media,” which carries far more regulation and public responsibility than a rote Silicon Valley tech company. Or as Erin Griffith writes at Wired, “admitting Facebook is a media company would require Facebook to take responsibility for its role in the spread of fake news, propaganda, and illegal Russian meddling in the U.S. election.”

Outside of the extensive public apologizing, massive legal fees and corporate restructuring that would follow if Facebook were officially categorized as a media company, the notion of public responsibility is especially sticky. It’s also a big part of the reason we’re still talking about Facebook’s ethical compass more than a year after the Center for Digital Ethics and Policy’s own Don Heider called on Facebook and other social media companies to hire chief ethicists and be the voice of corporate conscience in a morally turbulent media era.

Facebook still hasn’t hired a chief ethics officer, and it won’t because it’s still just a tech company. Plus, an ethics shift for a company with the reach, influence and profitability of Facebook will need more than a Ph.D. and a few administrators to change its culture. Writing for Slate in response to Heider’s call-to-action, Anna Lauren Hoffman, a professor with The Information School at the University of Washington in Seattle, pointed out two key problems in the chief ethicist approach. First, the idea that an internal ethics team could serve as “a panacea for all possible ethical problems” is based on a monolithic view of Silicon Valley principalities. The concept ignores “internal dissenters and external advocates already in the trenches, already grappling with key ethical issues.”

Second, the chief ethics officer approach is based on the naive assumption that internal processes could be strong enough to combat the outside commercial and political forces that demand maintenance of the status quo. Like an incompetent internal affairs team on a B-grade police drama, they might be noble and high-minded, but they drink from the same coffee machine as everyone else. And as anyone who watches these shows will tell you, real change only happens when the outside agents come in.

Unfortunately for us, life isn’t a B-grade police drama. There’s no such thing as fairy tale justice and freeze-frame endings. Real life is messy and vile, and our modern problems are infinitely complicated, especially when we’re talking about Facebook’s current existential dilemma. For one thing, Facebook is not a legacy news or media company like the New York Times, which had the advantages of old money, urbane authority and analog culture to establish its ethical differentiation. For another, Facebook stands in an entirely unprecedented industry space; more like a psychical kingdom than a digital front.

So as Facebook sees it, labels like “media company” don’t really fit. Facebook is a platform. People use it like a modern commonplace book. Though instead of handwritten recipes, quotes, measures and math equations, Facebook has targeted ads, Twitter plugins and Logan Paul videos. And as with any emerging business, Facebook has struggled with ethical complications for years; everything from live-streamed assaults and privacy violations to international free speech restrictions (problems that may sound familiar to media companies). But lately it’s the nagging and pervasive problem of “fake news” that has driven the ethical thorn deeper into Facebook CEO Mark Zuckerberg’s side.

If Facebook were categorized as a media company, its fake news problem would have spelled the beginning of the end, especially given the integral role the network played in disrupting the 2016 U.S. presidential election. But because it’s a tech company, it’s skirted responsibility, much like a gun shop owner who says that what his customers do with his weapons is not his problem; he just sells guns. Facebook is just a social platform. What people do with it is their problem.

This rhetorical twist likely explains Zuckerberg’s latest tactic in the company’s ongoing moral battle against fake news: Letting the users themselves decide what’s trustworthy and what’s not. In a recent Facebook post, Zuckerberg writes:

“The hard question we've struggled with is how to decide what news sources are broadly trusted in a world with so much division. We could try to make that decision ourselves, but that's not something we're comfortable with. We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. Or we could ask you -- the community -- and have your feedback determine the ranking.”

You won’t hear this type of moral ambivalence from most legacy media companies. And in many ways, this “well, what do you think” type of thinking is typical of Facebook’s philosophical provenance. But it also reflects Facebook’s own ethical uncertainty. A chief ethics officer might be able to pose the question in more nuanced language, but it’s increasingly clear that the solution is cultural, not institutional. This is to say, for Facebook to change and take responsibility, culture must change first.

Maybe the digital marketing cliché that “every company is a media company” isn’t far from the truth — or what the truth should be. Because when a company says yes, we’re a media company, it willingly takes on the ethical responsibility commensurate with such categorization. Few companies in the modern marketplace can operate or be successful without a strong sense of mission and an articulated message of corporate social responsibility. Even Koch Industries has one. And according to recent research from Information Development, there are emerging industry tools and a growing body of literature that media companies can use to assess the impact of their corporate responsibility programs; namely, through analyzing credibility, usefulness and fairness.

But corporate responsibility still isn’t a fix for broken institutions. In many cases, it’s there for show. And while it’s a useful metric for assessing where a company stands on the issues, it’s not an ethical salve. The more Facebook waffles and defers on taking a stance as a media company, the more damage it will do and the more culturally outmoded it will become. If Facebook insists on being a tech company at heart, maybe that tells us all we need to know about what the heart of a tech company looks like.

Benjamin van Loon is a writer, researcher, and communications professional living in Chicago, IL. He holds a master’s degree in communications and media from Northeastern Illinois University and bachelor’s degrees in English and philosophy from North Park University. Follow him on Twitter @benvanloon and view more of his work at benvanloon.com.