Tuesday, February 23, 2021

Ethics and governance in the newsfeed: the New Yorker's "Inside the Making of Facebook's Supreme Court"

It would be difficult to deny the influence and effect that Facebook has on our digital world. With roughly 2.8 billion active users as of the fourth quarter of 2020 (Statista), Facebook gives users the opportunity to connect with and befriend each other, buy and sell goods and services, meet other people with like-minded interests or hobbies, advertise their businesses, discuss news and world events, share cat pictures, use video, audio and text chats, and more with people from all over the world.

Long before Facebook was posting that level of quarterly numbers, however, came the question of how the site would handle posted content that violated its terms of service or even national or international law. (The latter eventually evolved into a more nuanced question: if posted content violated the law in one country but was acceptable in another, how should that content be moderated on the site?) Content could be flagged by algorithms for containing certain objectionable material; users could flag posts they spotted for review; site moderators could comb through the platform looking for questionable content. Complicated cases or viral content could be escalated up the chain of command, even up to Mark Zuckerberg, who was noted in the New Yorker piece as saying that he dedicates a large portion of his time to evaluating high-profile posts that have come under review and deciding whether or not they should be removed from the site. After Professor Noah Feldman suggested that a "Supreme Court"-style "quasi-legal" body be added to Facebook to grapple with the thornier or higher-profile free speech concerns, Zuckerberg presented the idea to Facebook's corporate board. Though the board had concerns, Zuckerberg defended the proposal and ultimately ruled unilaterally to bring the Oversight Board into existence.

With members drawn from many high-profile places -- a former Prime Minister, a Nobel Peace Prize winner, professors, White House officials, activists, journalists, and more -- the board began attending trainings and building bonds within the team. Only a few weeks after the board's establishment, however, the Black Lives Matter protests began sweeping across the country, and with them came numerous controversial posts, including some from now-former president Donald Trump. The board was not even evaluating cases yet, but it debated a particularly controversial post from Mr. Trump that had been removed by Twitter but not by Facebook. In October of 2020, the board began evaluating cases from a small percentage of the user base as a way to test out the program. Barely three months later, a deluge of politically electrified posts from across the spectrum exploded surrounding the January 6th, 2021 attack on the United States Capitol. Facebook elected to suspend Mr. Trump's account indefinitely, followed quickly by his permanent ban from fellow social media giant Twitter. A few weeks later, after the inauguration of current president Joe Biden, Facebook opted to submit Mr. Trump's case to the board for review. According to Politico, Mr. Trump has submitted an appeal, and the case is currently open to the community for comment, with the board opting to extend the comment period due to high interest. The board is set to rule on the case in April of this year (Politico).

I find Facebook's Oversight Board to be a very interesting concept. As Kate Klonick noted in the New Yorker, there has not been anything quite like it in recent times -- at least, not that I have encountered. Given the notable influence Facebook has as a social platform and tech giant, I think the idea of separating the ethical, legal and sometimes philosophical challenges from the business and technical side of things makes some sense. Having a dedicated team of people with the knowledge to debate and deliberate on potentially objectionable or inappropriate user content, with powers, responsibilities and duties kept separate from the interests and obligations of the business of Facebook, allows decisions to be at least somewhat disentangled from shareholder obligations, business interests and a CEO who does have the unilateral power to do things like create an oversight board despite his corporate board's misgivings. This high-profile early case may be the brand-new Oversight Board's undoing, or it may be just the thing to establish it as a serious tribunal of minds deliberating some of the newest ethics and free speech questions of our time.

Klonick, K. (2021, February 12). "Inside the Making of Facebook's Supreme Court". The New Yorker.
