News & Media

What's Gone Wrong with Social Media & What Can We Do About It?


By Susan Tillotson Bunch

Last week, Harvard Kennedy School’s Shorenstein Center on Media, Politics and Public Policy published an article (Harvard Kennedy School Article, March 27, 2018) advocating for transparency by social media platforms in the form of APIs that would require the disclosure of algorithm-related data in three broad categories: all public posts (including statistics on engagement reach, numbers, demographics, trending stories, and key influencers), advertising campaigns (identities of purchasers, targeted groups, and content), and censored content (the content itself, its origins, duration, and its reach and engagement before deletion).

The authors justified each category of proposed mandatory disclosure: public posts data would reveal the “malignant actors” who “spread misinformation and manipulate users” and thus allow platforms to be held “accountable” for responding to “harmful trends in real time.” Requiring ad data would enhance “our ability to detect false advertising, political smear campaigns, and election manipulation.” Finally, public censorship data would “pressure” social media companies to delete content violating their policies more quickly, and “verify” that they aren’t “censoring content” beyond their policies.

While the goal of transparency has a superficial appeal, it rests entirely on the assumption that platform consumers must be protected from their own biases, short attention spans, and intellectual laziness. The authors argue quite convincingly that the social media platforms (tech companies at their core) achieved stunning success by adapting content to an increasingly targeted audience through algorithms that identify trends in engagement.

Conceding the human tendency to consume controversial or stimulating content over its tamer counterparts (the old adage “if it bleeds, it leads” comes to mind), the authors nevertheless decry the lack of incentive for platform algorithm engineers to “identify and prioritize social well-being,” without offering guidance on who would be tasked with such a qualitative judgment. Personalized content is a myth, they argue, because users are inherently pure-minded and would be drawn to “better” content if they had time to reflect; the algorithms, they essentially argue, exploit impulsive reactions instead.

Extending the victimization theme, the article compares social media platforms to vices (increased engagement equals addiction) and other purportedly harmful activities, such as hedge funds, to justify a platform-exclusive voluntary regulatory framework. However, the suggested approach, while no doubt advocated with the best of motives, addresses only the symptoms and not the illness.

Rather than task the platforms with protecting consumers from the consequences of their lack of curiosity, their laissez-faire attitude toward credibility, and their disinclination to read past a sensationalized, tabloid-style headline, why not suggest that readers engage in a bit of introspection? Better yet, rather than equating Facebook with a state agency that should be transparent, why not engage with the content itself? The answer to bad speech is not to chill speech; it is to add good speech to the marketplace of ideas.

Unfortunately, the proposed approach does not raise the level of discourse. It condescends to the lowest level and thereby accepts it as the standard, failing to address the accountability of citizens in informing their own decisions, whether voting, listening, clicking, “unfriending,” or adding their voices to the mix. Does that really “identify and prioritize social well-being”?

Susan Tillotson Bunch, an attorney with Thomas & LoCicero, has been designated a Certified Information Privacy Professional/United States by the International Association of Privacy Professionals. The CIPP/US designation is awarded to individuals who demonstrate, through experience, training, and testing, mastery of privacy-related laws and concepts across all areas for private-sector entities in the United States.