By Nicole Prieto, Editor-in-Chief
In her surrealistic graphic novel Temperance,[1] Cathy Malkasian posits an unusual thought experiment: Can a community enclosed in a ship of stone — buoyed along a fictional sea of fire and convinced it is surrounded by “enemies” that do not exist — thrive unperturbed for 30 years?
Accustomed to strict routine, the people of Blessedbowl do not question the impossibility of a stone ship setting sail from the lip of a grassy cliff, nor the absence of proof that great fires lick the hull of their immovable fortress. They do not question that they must wear hats at all times or that they must be wary of the full moon and birds (and the news they carry). Rather, they rely on the colorful morning dispatches reported by their absent leader’s purported daughter, Minerva. She tells them of her Pa’s war efforts against their unseen enemy and of his prophetic promises that, soon, the Blessedbowlers will join him in his final battle against their foe.
But unknown to anyone, Pa’s dispatches are lies. In tandem with her dramatic oration, they are part of Minerva’s desperate ploy to keep the fabric of her people’s lives from fraying apart in the wake of a world-shattering truth: Pa — who stole away the Blessedbowlers from distant lands through unfounded fears of a fire-stoking enemy, and who enlisted their hands in constructing the stone prison they now call home — had abandoned his nascent society on the eve of the Bowl’s completion. Only Minerva knows that he left them to die in the wake of their ruined, former lives. And it is only through her stories that they have continued on despite it.
Published in 2010, Temperance remains an apt work in the decade since, as social media has transformed the way we consume and spread news.[2] “Fake news” has not lost its accusatory power since the dust settled from the 2016 presidential elections. We continue to see headlines about the role of Russian trolls in sowing divisiveness among Americans through social media channels.[3] As major media outlets contend with public distrust,[4] online platforms abound for amplifying unvetted claims.[5] From indictments to public outcry, we have witnessed the consequences that attend the unfettered spread of instantaneous information. The question remains how to address it all moving forward — and what role the media ought to play in making headway.
Undoubtedly, the stories we tell ourselves matter; how and where we tell them can make all the difference. While fiction might not have literal answers to the social and legal challenges in our “too much information” age, it can perhaps serve as a window for approaching the problems underlying our complex relationship with media sources today.
The Meaning of “Media”
The question of what “media” means is worth posing in an era when the lines between “real” and “fake” news — and the organizing forces behind them — remain in uncomfortable proximity in online spaces. The Oxford English Dictionary gives one sense of “media” simply as “[t]he main means of mass communication, esp. newspapers, radio, and television, regarded collectively; the reporters, journalists, etc., working for organizations engaged in such communication.”[6] Likewise, “social media” refers to “websites and applications which enable users to create and share content or to participate in social networking.”[7]
By way of analogy, New Yorker writer Andrew Marantz likens “social media” to a party:
Last year, the Supreme Court heard a case about whether it was constitutional to bar registered sex offenders from using social media. In order to answer that question, the Justices had to ask another question: What is social media? In sixty minutes of oral argument, Facebook was compared to a park, a playground, an airport terminal, a polling place, and a town square.
It might be most helpful to compare a social network to a party. The party starts out small, with the hosts and a few of their friends. Then word gets out and strangers show up. People take cues from the environment.[8]
From this, it is easy to distinguish the roles of social media giants from those of traditional media outlets: The former, whether through built-in features[9] or at their users’ discretion, can collect and share news actually gathered by the latter. In effect, social media outsources journalistic responsibilities while still indirectly benefiting from reportage through increased user engagement. But while intentional disinformation is abhorrent to journalism and public trust, there are perhaps less severe implications for platforms that do not present themselves as actual “publishers.”
As TechCrunch writer Natasha Lomas points out, employing people to make editorial decisions would make social media companies “the largest media organizations in the world” — requiring not only “trained journalists to serve every market and local region they cover” but also “invit[ing] regulation as publishers.”[10] Still, Facebook’s interest in sorting fake news from genuine reporting is a public relations issue that it has an interest in redressing — particularly in light of the fallout it has faced with the collection of private user data by Cambridge Analytica.[11]
The struggle against the propagation of disinformation is not limited to social media; it implicates any content-generating platform capable of reaching an uninformed audience. The New York Times reported on YouTube’s announced plan to use Wikipedia to fact-check its video content in an effort to mitigate the impact of conspiracy theories, much to Wikipedia’s own surprise.[12] The proposal presented another thorny issue: “Can’t anyone edit Wikipedia,” writes John Herrman, “including the conspiracy theorists themselves?”[13]
YouTube relying on Wikipedia entries, in effect, would mean relying on the narratives crafted by whoever edited those pages. It would mean placing trust in whichever unvetted user decided to alter a given page — sometimes merely for humorous effect,[14] but perhaps also for malicious purposes. And much like the relationship between news reportage and social media platforms, the weight of editorial responsibility would seem to fall more on Wikipedia than YouTube itself.
Mitigating Speech
One brute-force method of mitigating the impact of false speech is deleting it entirely. Social media platforms’ removal of offending content is not exactly a violation of free speech as we understand it in the context of the First Amendment.[15] Simply put, Facebook is not the government.[16] The discretion to remove content is not only within a platform’s rights (and almost guaranteed to appear somewhere in its published policies[17]), but it may also be necessary for the platform’s long-term self-preservation. Reddit, billed as the “front page of the Internet,”[18] decided to ban severely racist or violent subreddit communities, for instance, even at the cost of user backlash and accusations of speech suppression.[19]
“We just took away the spaces where they liked to hang out,” Reddit CEO Steve Huffman told The New Yorker, “and went, ‘Let’s see if this helps.’”[20]
With respect to legal recourse, false information that leaves victims in its wake can result in civil or criminal liability, such as for torts or deceptive practices.[21] Setting aside discussions of election interference, fake news may have severe real-world consequences beyond reputational or political damage. Wired recently reported that disinformation spread through WhatsApp has carried consequences for public health in Brazil, where “rumors of fatal vaccine reactions, mercury preservatives, and government conspiracies” have interfered with vaccination efforts in the wake of a yellow fever outbreak.[22]
“WhatsApp is especially popular among middle and lower income individuals there, many of whom rely on it as their primary news consumption platform,” writes Megan Molteni. “But as the country’s health authorities scramble to contain the worst outbreak in decades, WhatsApp’s misinformation trade threatens to go from destabilizing to deadly.”[23]
Removing content or banning “bad actors” may not be catch-all solutions. At issue “are the core features of the technologies” themselves, where perhaps the only foolproof solution is either to turn those features off or to remove their availability to specific users. “Figuring out which users fall into that category is a value judgment—the type of value judgment that the libertarian ethos of tech companies has left them very reluctant to make,” writes Joshua Geltzer.[24]
Lomas criticizes social media algorithms that operate to purvey disinformation to begin with:
Social media’s filtering and sorting algorithms also crucially failed to make any distinction between information and disinformation. Which was their great existential error of judgement, as they sought to eschew editorial responsibility while simultaneously working to dominate and crush traditional media outlets which do operate within a more tightly regulated environment (and, at least in some instances, have a civic mission to truthfully inform).[25]
Calls to action for contending with the impact of fake news include increased legal regulation,[26] more internal access and content policing, and greater transparency in how social media algorithms operate.[27] Regulations proposed in Europe would demand even more than transparency from social media platforms about who purchases their advertising space, extending “collaboration between data protection authorities and other regulators to safeguard the rights and interests of individuals in the digital society.”[28]
But the fruits of those efforts, if realized, are bound to take time. For now, consumers are left with the task of evaluating for themselves the information displayed on dashes, feeds, timelines, and even private messages. This is certainly a challenge in a digital age where stepping away from our phones solely to avoid false information is hardly practical.
Perhaps one solution is to treat shared news much like suspicious spam emails, where we subconsciously know that an errant click or tap could wreak havoc on our devices if we fail to notice warning signs like blatant misspellings or suspicious sender information.[29] But with fake news, we are not contending with a momentary deception that could install ransomware; we are fighting the long-term effects of disinformation on our very psychology as human beings.
The Power of Narrative
The power of confirmation bias is overwhelming. In one Stanford study, two groups of students with opposing views about capital punishment and its effect on crime deterrence were presented with two sets of fake studies: “One provided data in support of the deterrence argument, and the other provided data that called it into question.”[30] Both groups rated the credibility of each set in accordance with their existing views, and rather than moderating, their positions on capital punishment hardened. “Those who’d started out pro-capital punishment,” writes Elizabeth Kolbert, “were now even more in favor of it; those who’d opposed it were even more hostile.”[31]
Father-daughter duo Jack and Sara Gorman, a psychiatrist and public-health specialist, respectively, “cit[ed] research suggesting that people experience genuine pleasure—a rush of dopamine—when processing information that supports their beliefs.”[32] And we do not seem amenable to accepting more “accurate information” that runs counter to our beliefs.[33]
In this respect, Malkasian’s work is informative. In Temperance, Minerva’s walled-off society subsists on her lies, closed off both from the consequences of the burning world Pa has left in his wake and from the knowledge contained in an expansive library hidden away from his destruction. She is the sole news source for her people, who have no choice but to indulge in confirmation bias amongst themselves about the exceptionalism of their society; the alternative would seem to be the anarchic collapse of everything they have built over decades of ignorance.
The resolution comes from Minerva seeking escape from Blessedbowl’s towering walls (and seeking to unveil the truth), and from the library literally breaching the fortress through the efforts of the graphic novel’s silent, observant protagonist: a wooden doll named Temperance. The book does not give us an epilogue about what happens after the vestiges of truth have unceremoniously broken down the Blessedbowlers’ walls. We are left with an intriguing situation: Minerva — the only conduit between her people and the outside world — comes face to face with resources that would let her arm herself and others with the truth. She may yet find a way to change the narrative she constructed about their world.
As Kolbert writes on the Gormans’ research: “Appealing to [our] emotions may work better, but doing so is obviously antithetical to the goal of promoting sound science. ‘The challenge that remains,’ they write toward the end of their book, ‘is to figure out how to address the tendencies that lead to false scientific belief.’”[34] Also significant is the very influence of technology on the way our brains can process information.[35]
One lesson from Temperance is that what we tell ourselves, and how we tell it, plays a major role in what we are willing to believe. As social media users, our best defense against fake news may be simply keeping the information presented to us in context — at least while regulations, and concerns over their effect on speech, remain under debate. We do not need to step away from our phones to be wary of anything purporting to be “the truth”; it behooves us to carry a degree of self-awareness about why we are seeing something (e.g., an algorithm presenting us with information in line with our interests[36]) and to have an idea of what we are already carrying into a given conversation.
Something that is “viral,” and goes unquestioned among friends and family we are unlikely to be suspicious of, should give us pause. Still, unquestioned distrust of established or reputable media sources is not the solution, either.[37] Conflating genuine reporting of the truth — or the acknowledgment of honest mistakes — with deliberate misinformation risks doing more harm than good.[38] Much as Temperance’s final resolution was the sudden availability of new sources of information to Minerva and Blessedbowl, we have unprecedented access to resources to verify or disprove purported facts.[39] The key is in not getting lost in a sea of “too much information” — or at least, too much information that plays into our preexisting biases.
Nicole Prieto is a 2018 J.D. candidate and editor-in-chief of Juris Magazine. She is the president of the Duquesne Intellectual Property Law Association, an executive articles editor for the Duquesne Law Review, and an arts and entertainment writer for The Duquesne Duke.
Sources
[1] See generally Cathy Malkasian, Temperance (2010).
[2] This is something Malkasian has also broached with her 2017 work Eartha. See generally Nicole Prieto, ‘Eartha’ impresses with modern messages, gorgeous scenery, The Duq. Duke (originally published in print Apr. 20, 2017), www.duqsm.com/eartha-impresses-modern-messages-gorgeous-scenery/ (“[Protagonist Eartha] learns that an army of plaid-wearing gangsters is exploiting [ ] City people into trading valuables for boxes of biscuits printed with dubious headlines. The City people, obsessed with ‘Biscuit News,’ become paralyzed with terror and addicted to comfort-eating the very pastries causing their distress.”).
[3] Stewart Bishop, US Unveils New Russia Sanctions For Cyberattacks, Law360 (Mar. 15, 2018), https://www.law360.com/articles/1022574/us-unveils-new-russia-sanctions-for-cyberattacks.
[4] Perhaps no less amplified by the controversial Sinclair Broadcast Group’s speech read by its anchors earlier this year, criticized for conflating reporting by established media groups with fake news. See Cynthia Littleton, Sinclair Responds to Promo Critics, Says Fake News Warnings ‘Serve No Political Agenda’, Variety (Apr. 2, 2018, 3:49 PM PT), https://variety.com/2018/tv/news/sinclair-local-news-promos-fake-news-statement-1202741905/; Brian Stelter, Sinclair’s new media-bashing promos rankle local anchors, CNN (Mar. 7, 2018, 9:44 PM ET), http://money.cnn.com/2018/03/07/media/sinclair-broadcasting-promos-media-bashing/index.html.
[5] See, e.g., David Z. Morris, How Russians Used Social Media to Boost the Trump Campaign, According to Robert Mueller’s Indictment, Fortune (Feb. 17, 2018), http://fortune.com/2018/02/17/how-russians-used-social-media-election/.
[6] Media, n.2, OED Online, Oxford University Press (Jan. 2018), http://www.oed.com/view/Entry/115635?isAdvanced=false&result=2&rskey=JeZDd1& (last visited Mar. 24, 2018).
[7] Social, adj. and n., OED Online, Oxford University Press (Jan. 2018), http://www.oed.com/view/Entry/183739?redirectedFrom=social+media#eid272386371 (last visited Mar. 25, 2018).
[8] Andrew Marantz, Reddit and the Struggle to Detoxify the Internet, The New Yorker (Mar. 19, 2018), https://www.newyorker.com/magazine/2018/03/19/reddit-and-the-struggle-to-detoxify-the-internet.
[9] See What is Trending?, Help Center, Facebook, https://www.facebook.com/help/1401671260054622 (last visited Mar. 25, 2018).
[10] Natasha Lomas, Fake news is an existential crisis for social media, TechCrunch (Feb. 18, 2018), https://techcrunch.com/2018/02/18/fake-news-is-an-existential-crisis-for-social-media/.
[11] Kevin Granville, Facebook and Cambridge Analytica: What You Need to Know as Fallout Widens, N.Y. Times (Mar. 19, 2018), https://www.nytimes.com/2018/03/19/technology/facebook-cambridge-analytica-explained.html.
[12] John Herrman, YouTube May Add to the Burdens of Humble Wikipedia, N.Y. Times (Mar. 19, 2018), https://www.nytimes.com/2018/03/19/business/media/youtube-wikipedia.html.
[13] Herrman, supra note 12.
[14] See, e.g., Katla McGlynn, The Funniest Acts Of Wikipedia Vandalism Ever, Huffington Post (Jan. 16, 2011, 10:41 AM ET), https://www.huffingtonpost.com/2010/04/06/the-funniest-acts-of-wiki_n_522077.html (last updated Apr. 24, 2014).
[15] See U.S. Const. amend. I; see also First Amendment, Legal Information Institute, Cornell Law School, https://www.law.cornell.edu/wex/first_amendment (last visited Mar. 25, 2018).
[16] See also Nick Frost, How ‘Hamilton’ Cast’s Message to Mike Pence Could Have Faced Punishment Despite First Amendment, Juris Magazine (Dec. 3, 2016), http://sites.law.duq.edu/juris/2016/12/03/how-hamilton-casts-message-to-mike-pence-could-have-faced-punishment-despite-first-amendment/.
[17] Marantz, supra note 8.
[18] Id.
[19] Id.
[20] Id.
[21] See, e.g., David O. Klein & Joshua R. Wueller, Fake News: A Legal Perspective, Klein Moynihan Turco LLP (May 1, 2017), http://www.kleinmoynihan.com/fake-news-a-legal-perspective/; NPR Staff, What Legal Recourse Do Victims Of Fake News Stories Have?, NPR (Dec. 7, 2016, 7:04 PM ET), https://www.npr.org/2016/12/07/504723649/what-legal-recourse-do-victims-of-fake-news-stories-have.
[22] Megan Molteni, When WhatsApp’s Fake News Problem Threatens Public Health, Wired (Mar. 9, 2018, 3:14 PM), https://www.wired.com/story/when-whatsapps-fake-news-problem-threatens-public-health/.
[23] Id.
[24] Joshua A. Geltzer, Bad Actors Are Using Social Media Exactly As Designed, Wired (Mar. 11, 2018, 8:00 AM), https://www.wired.com/story/bad-actors-are-using-social-media-exactly-as-designed/.
[25] Lomas, supra note 10.
[26] Allison Grande, EU Data Authority Pushes For Tighter ‘Fake News’ Regulation, Law360 (Mar. 21, 2018, 9:21 PM EDT), https://www.law360.com/articles/1024246/eu-data-authority-pushes-for-tighter-fake-news-regulation.
[27] Geltzer, supra note 24.
[28] Grande, supra note 26.
[29] See, e.g., Brien Posey, 10 tips for spotting a fishy email, TechRepublic (Oct. 15, 2015, 11:40 AM PST), https://www.techrepublic.com/blog/10-things/10-tips-for-spotting-a-phishing-email/.
[30] Elizabeth Kolbert, Why Facts Don’t Change Our Minds, The New Yorker (Feb. 27, 2017), https://www.newyorker.com/magazine/2017/02/27/why-facts-dont-change-our-minds.
[31] Id.
[32] Id.
[33] Id.
[34] Id.
[35] See generally Nicholas Carr, The Shallows: What the Internet is Doing to Our Brains (2010).
[36] AJ Agrawal, What Do Social Media Algorithms Mean For You?, Forbes (Apr. 20, 2016, 6:22PM), https://www.forbes.com/sites/ajagrawal/2016/04/20/what-do-social-media-algorithms-mean-for-you/#2389545ca515.
[37] Truth and trust took center stage at Duquesne School of Law’s recent symposium “Resurrecting Truth in American Law and Public Discourse: Shall These Bones Live?” See generally Bruce Ledewitz, The Resurrection of Trust in American Law and Public Discourse, Juris Magazine (Nov. 21, 2017), http://sites.law.duq.edu/juris/2017/11/21/the-resurrection-of-trust-in-american-law-and-public-discourse/.
[38] See generally Daniel Funke, India issued fake news guidelines to the press. Then it reversed them., Poynter (Apr. 10, 2018), http://amp.poynter.org/news/india-issued-fake-news-guidelines-press-then-it-reversed-them.
[39] Eugene Kiely & Lori Robertson, How to Spot Fake News, FactCheck.org (Nov. 18, 2016), https://www.factcheck.org/2016/11/how-to-spot-fake-news/.