Facebook's content policies are renowned for being blunt instruments. Its rules strictly prohibit nudity, hate speech, violence, graphic images and 'offensive' material - rules which have brought the banhammer down on groups like cancer charities (for a stylised, cartoon image of breasts) and news organisations (for posting the infamous Vietnam War 'napalm girl' photo).
To date, Facebook has dealt with specific criticisms of its policies by adding carve-outs; for example, both breastfeeding and post-mastectomy photos are explicitly permitted. In a pivot from that approach, Facebook now proposes to apply a 'public interest' exception to its usual content rules, the idea being that content 'in the public interest' can always be published.
The public interest is, of course, a familiar concept in the intellectual property and public law spaces: a vague, nebulous idea of 'stuff that is good for people to know or see' which defies definition. The scope and nature of the public interest is constantly litigated, particularly where interests conflict. Do principles of free speech outweigh the need to protect victims of hateful rhetoric? Is the value of criticism and critique of copyright works greater than the need to incentivise the creation of new works?
Of course, content regulation is a catch-22. Bluntly applying a content policy might shield an organisation from claims of politicisation, but it means that valuable speech will be limited. Giving the reins to a millennial Silicon Valley-dweller might limit unfair take-downs, but creates an inevitable risk of bias or, worse, censorship.
But in light of Facebook's international audience, we have to ask serious questions about a subjective, public interest-based approach:
1. How can content hosts sensibly determine what the global public interest is, when the topic divides even fairly homogeneous societies? Does free speech get the priority offered by the First Amendment to the US Constitution, even when the poster is French? How does Facebook address political material - such as that arising out of the Australian gay marriage debate - which may be offensive in one jurisdiction but a live social issue in another?
2. How does a content host apply a public interest assessment evenly across issues and groups, particularly when the cause (as in the case of breast cancer) informs the acceptability of content? Are photographs from naked parades acceptable, or does that depend on the cause being championed? Is a depiction of historical violence always acceptable, or should it be censored if it doesn't have an accompanying positive narrative?
3. How should regulators respond when a private organisation has unprecedented control over what content is spread in the public arena? Eras past saw the creation of organisations such as the Press Council and the Broadcasting Standards Authority to regulate media monopolies; how can online media be subject to regulation, particularly when they cross jurisdictions?
For the most part, those questions are unanswerable - even with reference to Facebook's previous policy. But a policy dominated by the decisions of a few individuals about what's good for 7 billion people is potentially dangerous. The first step for Facebook is to embrace transparency: providing users and governments with information about how it assesses the public interest, and with reasoning as to why content falls inside or outside the bounds of acceptability. That at least allows debate over the merits of employees' decision making and introduces some accountability. In a world where Facebook decides whether people have a soapbox or are silenced, transparency should be a bare minimum.
Facebook's announcement, written by Joel Kaplan and Justin Osofsky, frames the change this way: "In the weeks ahead, we're going to begin allowing more items that people find newsworthy, significant, or important to the public interest - even if they might otherwise violate our standards. We will work with our community and partners to explore exactly how to do this, both through new tools and approaches to enforcement. Our intent is to allow more images and stories without posing safety risks or showing graphic images to minors and others who do not want to see them."