Those who would give up essential liberty, to purchase a little temporary safety, deserve neither liberty nor safety. —Benjamin Franklin
In February 2004, nineteen-year-old Mark Zuckerberg launched the Harvard-only social network TheFacebook from his dorm room. A couple of days later he thought, “You know, someone needs to build a service like this for the world,” and with the help of friends, he quickly scaled it to include other colleges. Initially, he was hoping for four or five hundred users, but within a few months the site had a hundred thousand. He soon dropped out of Harvard and moved to Silicon Valley to grow the project, becoming the world’s youngest self-made billionaire by the age of twenty-three. “It’s not because of the amount of money,” he told an interviewer around that time. “For me and my colleagues, the most important thing is that we create an open information flow for people.”1
And Facebook has done so for billions the world over, helping businesses find new customers and talent, connecting far-off soldiers with loved ones back home, bringing toddlers’ first steps and words to distant grandparents, and reuniting friends after decades of silence.
When, in 2004, Zuckerberg asked “who knows where we’re going next?,” he couldn’t have imagined that, seven years later, during the Arab Spring, Facebook would carry “a cascade of messages about freedom and democracy across North Africa and the Middle East,” as one researcher put it.2 No one could have foreseen that in 2012, after Hurricane Sandy devastated the New Jersey coast, residents of the Garden State would use Facebook to persuade local legends Bruce Springsteen and Jon Bon Jovi to play a benefit concert. Or that in 2013, when jihadists bombed the Boston Marathon, those desperate for news from friends and family would turn to Facebook. Or that in 2014, the ALS Association would use Facebook to popularize its “ice-bucket challenge,” raising more than $220 million to fight amyotrophic lateral sclerosis.
But the “open information flow” facilitated by Facebook and other platforms has also raised an important debate about the “social responsibility” of social-media companies. That’s because, despite worries about “echo chambers,” platforms have connected people with, among others, their enemies: conservatives with “Progressives,” Christian fundamentalists with atheists, capitalists with statists, neo-Nazis with Jews. And instead of facilitating constructive dialogue, social media often seems to bring out the worst in people. It’s never been easier to hurl nasty thoughts out into the world without reflection, and doing so gives users a jolt of feedback that many find rewarding. Such platforms also provide massive audiences to those who prioritize their agendas over the truth, making them powerful tools for spreading falsehoods and lies. And given that people all over the world now rely on social media for news, some argue that these companies should do more to moderate content—while others say that content moderation is “censorship” and that they ought to back off.
Many are calling on government to enforce these views, suggesting that the future of civil society depends on massive new regulations on social-media companies. In a recent speech delivered to the Anti-Defamation League, for instance, English actor Sacha Baron Cohen eloquently summarized the case for government intervention. He argued that tech companies use algorithms that “amplify the type of content that keeps users engaged—stories that appeal to our baser instincts and that trigger outrage and fear.” He concluded that “by now it’s pretty clear, they cannot be trusted to regulate themselves,” so “as with the Industrial Revolution, it’s time for regulation and legislation to curb the greed of these hi-tech robber barons.”3
It’s no doubt true that some social-media companies (or particular individuals at these companies) have done shady, even despicable things, and some antipathy toward them is certainly reasonable. Facebook, for one, has at times been careless with user data, most notably in what became the Cambridge Analytica scandal. But much of today’s anger toward social-media companies is misplaced.
Consider the response to a massive 2018 study that reported that on Twitter, “Falsehood diffused significantly farther, faster, deeper, and more broadly than the truth in all categories of information, and the effects were more pronounced for false political news.”4 Some of social media’s detractors claimed that the study was evidence that, as Cohen charges, platforms use algorithms that amplify divisive content to keep users engaged. But that claim contradicts what the scientists who conducted the study actually concluded: “false news spreads more than the truth because humans, not robots [including Twitter’s algorithms], are more likely to spread it.” So, although people were quick to blame Twitter, the spread of false news “might have something to do with human nature,” according to the study’s lead scientist, Soroush Vosoughi of MIT.5 (“Falsehood flies, and the Truth comes limping after it,” said Jonathan Swift more than three hundred years ago. Of course people have free will and the choice to focus on truth and accuracy. But free will goes both ways, and many choose to advance false ideas, or at least ideas they don’t know to be true.)
The reaction to this study illustrates a broad trend. By and large, those who rage against social-media companies disregard the incredible value these platforms provide, and they aim blame at the wrong people. Worse, many of the proposed solutions are far more dangerous than the problems they purport to solve.
Let’s examine some of the most impassioned charges against social-media companies. . . .
1. “Mark Zuckerberg,” Wikipedia, https://en.wikipedia.org/wiki/Mark_Zuckerberg (accessed December 10, 2019).
2. Catherine O’Donnell, “New Study Quantifies Use of Social Media in Arab Spring,” University of Washington, September 12, 2011, https://www.washington.edu/news/2011/09/12/new-study-quantifies-use-of-social-media-in-arab-spring/.
3. Sacha Baron Cohen, “Read Sacha Baron Cohen’s Scathing Attack on Facebook in Full: ‘Greatest Propaganda Machine in History,’” Guardian, November 22, 2019, https://www.theguardian.com/technology/2019/nov/22/sacha-baron-cohen-facebook-propaganda.
4. Soroush Vosoughi, Deb Roy, and Sinan Aral, “The Spread of True and False News Online,” Science 359, no. 6380 (March 9, 2018), https://science.sciencemag.org/content/359/6380/1146.
5. Robinson Meyer, “The Grim Conclusions of the Largest-Ever Study of Fake News,” Atlantic, March 8, 2018, https://www.theatlantic.com/technology/archive/2018/03/largest-study-ever-fake-news-mit-twitter/555104/.
6. Popular leader Aung San Suu Kyi was held under house arrest and barred from the race.
7. Elise Thomas, “Facebook Keeps Failing in Myanmar,” ForeignPolicy.com, June 21, 2019, https://foreignpolicy.com/2019/06/21/facebook-keeps-failing-in-myanmar-zuckerberg-arakan-army-rakhine/.
8. Steve Stecklow, “Inside Facebook’s Myanmar Operation: Hatebook,” Reuters, August 15, 2018, https://www.reuters.com/investigates/special-report/myanmar-facebook-hate/.
9. Paul Mozur, “A Genocide Incited on Facebook, with Posts from Myanmar’s Military,” New York Times, October 15, 2018, https://www.nytimes.com/2018/10/15/technology/myanmar-facebook-genocide.html.
10. “Myanmar Rohingya: What You Need to Know about the Crisis,” BBC, April 24, 2018, https://www.bbc.com/news/world-asia-41566561; D. Parvaz, “One Year and 10,000 Rohingya Deaths Later, UN Accuses Myanmar of ‘Genocide,’” ThinkProgress, August 27, 2018, https://thinkprogress.org/10000-rohingya-deaths-united-nations-accuses-myanmar-genocide-f19d785bbece/.
11. Cohen, “Read Sacha Baron Cohen’s Scathing Attack on Facebook in Full.”
12. The U.S. Ninth Circuit Court describes true threats as threats that a “reasonable person” would interpret as “expression[s] of intent to inflict bodily harm upon” the person threatened. See “Threats of Violence against Individuals,” Legal Information Institute, Cornell Law School, https://www.law.cornell.edu/constitution-conan/amendment-1/threats-of-violence-against-individuals (accessed January 14, 2020).
13. “Facts about Content Review on Facebook,” Facebook, December 28, 2019, https://about.fb.com/news/2018/12/content-review-facts/.
14. Robert S. Mueller III, “Report on the Investigation into Russian Interference in the 2016 Presidential Election,” vol. 1, U.S. Department of Justice, March 2019, https://www.justice.gov/storage/report.pdf.
15. Scott Shane, “These Are the Ads Russia Bought on Facebook in 2016,” New York Times, November 1, 2017, https://www.nytimes.com/2017/11/01/us/politics/russia-2016-election-facebook.html.
16. “Get Authorized to Run Ads about Social Issues, Elections or Politics,” Facebook, https://www.facebook.com/business/help/208949576550051?id=288762101909005 (accessed January 15, 2020).
17. In fact, during the same period in which Facebook is accused of “allow[ing a] foreign power to interfere in our elections,” Zuckerberg ousted Facebook executive Palmer Luckey, a libertarian, after he donated $10,000 to a pro-Trump organization. Although the company denies Luckey was fired for his political views, a crime in the state of California, the evidence indicates that he was: While Luckey’s future at Facebook was still in question, Zuckerberg personally drafted a letter that Luckey was asked to publish as his own, which distanced him from Trump and avowed support for Gary Johnson. See Kirsten Grind and Keach Hagey, “Why Did Facebook Fire a Top Executive? Hint: It Had Something to Do with Trump,” Wall Street Journal, November 11, 2018, https://www.wsj.com/articles/why-did-facebook-fire-a-top-executive-hint-it-had-something-to-do-with-trump-1541965245.
18. Mark Zuckerberg, “A Blueprint for Content Governance and Enforcement,” Facebook, November 15, 2018, https://www.facebook.com/notes/mark-zuckerberg/a-blueprint-for-content-governance-and-enforcement/10156443129621634/.
19. See locals.com for more information.
20. Justin Amash, Twitter, June 19, 2019, https://twitter.com/justinamash/status/1141513644758437888.
21. See “Chapter Seven, Section 130: Incitement to Hatred,” German Criminal Code, translated by Dr. Michael Bohlander, https://www.gesetze-im-internet.de/englisch_stgb/englisch_stgb.html#p1246 (accessed December 11, 2019).
22. James Waterworth et al., “Germany’s Draft Network Enforcement Law Is a Threat to Freedom of Expression, Established EU Law and the Goals of the Commission’s DSM Strategy—the Commission Must Take Action,” May 22, 2017, https://edri.org/files/201705-letter-germany-network-enforcement-law.pdf.
23. “2015–16 New Year’s Eve Sexual Assaults in Germany,” Wikipedia, https://en.wikipedia.org/wiki/2015%E2%80%9316_New_Year%27s_Eve_sexual_assaults_in_Germany#Suspects (accessed December 10, 2019).
24. “Germany: Flawed Social Media Law,” Human Rights Watch, February 14, 2018, https://www.hrw.org/news/2018/02/14/germany-flawed-social-media-law.
25. Andrea Shalal, “German States Want Social Media Law Tightened,” Reuters, November 12, 2018, https://www.reuters.com/article/us-germany-hate-speech/german-states-want-social-media-law-tightened-media-idUSKCN1NH2HW.
26. Kaye writes, “While it is recognized that business enterprises also have a responsibility to respect human rights, censorship measures should not be delegated to private entities (A/HRC/17/31). States should not require the private sector to take steps that unnecessarily or disproportionately interfere with freedom of expression, whether through laws, policies or extralegal means (A/HRC/32/38).” See David Kaye, “Mandate of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression,” Office of the High Commissioner for Human Rights, June 1, 2017, https://www.ohchr.org/Documents/Issues/Opinion/Legislation/OL-DEU-1-2017.pdf.
27. Sajid Javid and Jeremy Wright, “A Summary—Online Harms White Paper,” Department for Digital, Culture, Media & Sport, April 8, 2019, https://www.gov.uk/government/consultations/online-harms-white-paper.
28. Amy Gunia, “U.K. Authorities Propose Making Social Media Executives Personally Responsible for Harmful Content,” Time, April 8, 2019, https://time.com/5565843/united-kingdom-social-media-regulations/.
29. Tom Simonite, “France Plans a Revolution to Rein in the Kings of Big Tech,” Wired, December 8, 2019, https://www.wired.com/story/france-plans-revolution-rein-kings-tech/.
30. Lydia Saad, “Americans Split on More Regulation of Big Tech,” Gallup, August 21, 2019, https://news.gallup.com/poll/265799/americans-split-regulation-big-tech.aspx.
31. Tom Wheeler, “Should Big Technology Companies Break up or Break Open?,” Brookings, April 11, 2019, https://www.brookings.edu/blog/techtank/2019/04/11/should-big-technology-companies-break-up-or-break-open/.
32. Philip Napoli, “What Would Facebook Regulation Look Like? Start with the FCC,” Wired, October 4, 2019, https://www.wired.com/story/what-would-facebook-regulation-look-like-start-with-the-fcc/.
33. Gene Kimmelman, “The Right Way to Regulate Digital Platforms,” Harvard Kennedy School, Shorenstein Center on Media, Politics, and Public Policy, September 8, 2019, https://shorensteincenter.org/the-right-way-to-regulate-digital-platforms/.
34. Megan Phelps-Roper, “I Grew up in the Westboro Baptist Church. Here’s Why I Left,” TED, February 2017, https://www.ted.com/talks/megan_phelps_roper_i_grew_up_in_the_westboro_baptist_church_here_s_why_i_left#t-21989.
35. Megan Phelps-Roper, “Sarah Sits down with an Ex-Member of the Westboro Baptist Church,” I Love You, America, YouTube, October 25, 2017, https://www.youtube.com/watch?v=EmgZgHpv8Zs.
36. Megan Phelps-Roper, Twitter, https://twitter.com/meganphelps/status/666260922915246080 (accessed January 25, 2020).