Two Unsexy Words That Could Save Future Elections, Jonathan Zittrain, 2014
The following is an article published in the New Republic (Volume 244, Issue 29) in June of 2014.
ON NOVEMBER 2, 2010, FACEBOOK'S American users were subject to an ambitious experiment in civic-engineering: Could a social network get otherwise-indolent people to cast a ballot in that day's congressional midterm elections?

The answer was yes.
The prod to nudge bystanders to the voting booths was simple. It consisted of a graphic containing a link for looking up polling places, a button to click to announce that you had voted, and the profile photos of up to six Facebook friends who had indicated they'd already done the same. With Facebook's cooperation, the political scientists who dreamed up the study planted that graphic in the newsfeeds of tens of millions of users. (Other groups of Facebook users were shown a generic get-out-the-vote message or received no voting reminder at all.) Then, in an awesome feat of data-crunching, the researchers cross-referenced their subjects' names with the day's actual voting records from precincts across the country to measure how much their voting prompt increased turnout.
Overall, users notified of their friends' voting were 0.39 percent more likely to vote than those in the control group, and any resulting decisions to cast a ballot also appeared to ripple to the behavior of close Facebook friends, even if those people hadn't received the original message. That small increase in turnout rates amounted to a lot of new votes. The researchers concluded that their Facebook graphic directly mobilized 60,000 voters and, thanks to the ripple effect, ultimately caused a total of 340,000 additional votes to be cast that day. As they point out, George W. Bush won Florida in 2000, and with it the presidency, by 537 votes -- fewer than 0.01 percent of the votes cast in that state.
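The measurement itself is simple arithmetic once the voting records are matched. A minimal sketch in Python, using invented counts rather than the study's data: compare turnout in the prompted group against turnout in the control group.

```python
# Back-of-the-envelope version of the study's measurement.
# The counts below are made up for illustration only.
def turnout_rate(voted, shown):
    return voted / shown

treatment = turnout_rate(voted=1_220_000, shown=6_000_000)
control = turnout_rate(voted=1_196_600, shown=6_000_000)

lift = (treatment - control) * 100
print(f"turnout lift: {lift:.2f} percentage points")  # 0.39
```

A per-user effect that small only matters because it is applied across tens of millions of people at once.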
Now consider a hypothetical, hotly contested future election. Suppose that Mark Zuckerberg personally favors whichever candidate you don't like. He arranges for a voting prompt to appear within the newsfeeds of tens of millions of active Facebook users -- but unlike in the 2010 experiment, the group that will not receive the message is not chosen at random. Rather, Zuckerberg makes use of the fact that Facebook "likes" can predict political views and party affiliation, even beyond the many users who proudly advertise those affiliations directly. With that knowledge, our hypothetical Zuck chooses not to spice the feeds of users unsympathetic to his views. Such machinations then flip the outcome of our hypothetical election. Should the law constrain this kind of behavior?
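How can "likes" give away politics even when a user never states an affiliation? A minimal sketch, using a toy dataset and a standard off-the-shelf classifier -- nothing resembling Facebook's actual models: each user's likes become a binary feature vector, a model is fit on users whose affiliation is known, and everyone else is scored.

```python
# Toy illustration of predicting party affiliation from "likes".
# Hypothetical data; not Facebook's real features or model.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Rows = users, columns = pages; 1 means the user liked that page.
likes = np.array([
    [1, 0, 1, 0],
    [1, 1, 0, 0],
    [0, 0, 1, 1],
    [0, 1, 0, 1],
])
# Affiliations (0 or 1) for the users who advertise them.
party = np.array([0, 0, 1, 1])

model = LogisticRegression().fit(likes, party)

# Score a user who has never stated an affiliation.
new_user = np.array([[1, 0, 1, 0]])
print(model.predict_proba(new_user))  # probability of each party
```

With millions of users and tens of thousands of pages, the same basic approach yields affiliation guesses good enough to target -- or to skip -- a voting prompt.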
The scenario imagined above is an example of digital gerrymandering. All sorts of factors contribute to what Facebook or Twitter presents in a feed, or what Google or Bing shows us in search results. Our expectation is that those intermediaries will provide open conduits to others' content and that the variables in their processes just help yield the information we find most relevant. (In that spirit, we expect that advertiser-sponsored links and posts will be clearly labeled so as to make them easy to distinguish from the regular ones.) Digital gerrymandering occurs when a site instead distributes information in a manner that serves its own ideological agenda. This is possible on any service that personalizes what users see or the order in which they see it, and it's increasingly easy to effect.
There are plenty of reasons to regard digital gerrymandering as such a toxic exercise that no right-thinking company would attempt it. But none of these businesses actually promises neutrality in its proprietary algorithms, whatever that would mean in practical terms. And they have already shown themselves willing to leverage their awesome platforms to attempt to influence policy. In January 2012, for example, Google blacked out its home page "doodle" as a protest against the pending Stop Online Piracy Act (SOPA), said by its opponents (myself among them) to facilitate censorship. The altered logo linked to an official blog entry importuning Google users to petition Congress; SOPA was ultimately tabled, just as Google and many others had wanted. A social-media or search company looking to take the next step and attempt to create a favorable outcome in an election would certainly have the means.
So what's stopping that from happening? The most important fail-safe is the threat that a significant number of users, outraged by a betrayal of trust, would adopt alternative services, hurting the offending company's revenue and reputation. But while a propagandistic Google doodle or similarly ideological alteration to a common home page lies in plain view, newsfeeds and search results have no baseline. They can be subtly tweaked without hazarding the same backlash. Indeed, in our get-out-the-vote hypothetical, the people with the most cause for complaint are those who won't be fed the prompt and may never know it existed. Not only that, but the disclosure policies of social networks and search engines already state that the companies reserve the right to season their newsfeeds and search results however they like. An effort to sway turnout could thus be construed as covered by the existing agreements, requiring no special notice to users.
At the same time, passing new laws to prevent digital gerrymandering would be ill-advised. People may be due the benefits of a democratic electoral process. But in the United States, content curators appropriately have a First Amendment right to present their content as they see fit. Meddling with how a company gives information to its users, especially when no one's arguing that the information in question is false, is asking for trouble. (That's one reason why the European Court of Justice got it wrong when it opened the door to people censoring the search-engine results for their names, validating a so-called "right to be forgotten.")
There's a better solution available: enticing Web companies entrusted with personal data and preferences to act as "information fiduciaries." Champions of the concept include Jack Balkin of Yale Law School, who sees a precedent in the way that lawyers and doctors obtain sensitive information about their clients and patients -- and are then not allowed to use that knowledge for outside purposes. Balkin asks, "Should we treat certain online businesses, because of their importance to people's lives, and the degree of trust and confidence that people inevitably must place in these businesses, in the same way?"
As things stand, Web companies are simply bound to follow their own privacy policies, however flimsy. Information fiduciaries would have to do more. For example, they might be required to keep automatic audit trails reflecting when the personal data of their users is shared with another company, or is used in a new way. (Interestingly, the kind of ledger that crypto-currencies like Bitcoin use to track the movement of money could be adapted to this function.) They would provide a way for users to toggle search results or newsfeeds to see how that content would appear without the influence of reams of personal data -- that is, non-personalized.
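To make the ledger idea concrete, here is a minimal sketch of a tamper-evident audit trail in Python. The class and field names are invented for illustration, and a real deployment would look very different; the borrowed idea is simply the hash chain, in which each data-sharing event is bound to the previous one so that after-the-fact alterations are detectable.

```python
# Sketch of a tamper-evident audit trail for data-sharing events,
# adapting the hash-chained-ledger idea the article mentions.
# All names are illustrative, not any company's real system.
import hashlib
import json
import time

class AuditTrail:
    def __init__(self):
        self.entries = []

    def record(self, user_id, recipient, purpose):
        # Chain each entry to its predecessor's hash.
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        body = {
            "user": user_id,
            "shared_with": recipient,
            "purpose": purpose,
            "time": time.time(),
            "prev": prev_hash,
        }
        body["hash"] = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append(body)

    def verify(self):
        # Recompute every hash; any altered entry breaks the chain.
        prev = "0" * 64
        for entry in self.entries:
            expected = dict(entry)
            stored = expected.pop("hash")
            recomputed = hashlib.sha256(
                json.dumps(expected, sort_keys=True).encode()
            ).hexdigest()
            if stored != recomputed or expected["prev"] != prev:
                return False
            prev = stored
        return True

trail = AuditTrail()
trail.record("alice", "ad-network-x", "targeted advertising")
print(trail.verify())  # True unless an entry was altered afterward
```

The chaining is what matters: a fiduciary could not quietly rewrite its history of disclosures without breaking every subsequent hash.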
And, most important, information fiduciaries would forswear any formulas of personalization derived from their own ideological goals. Such a system could be voluntary, in the way that businesspeople who make suggestions on buying and selling stocks and bonds can elect between careers as investment advisers or brokers: the "advisers" owe duties not to put their own interests above those of their clients, while the "brokers" have no such duty, even as they -- confusingly -- can go by such titles as financial adviser, financial consultant, wealth manager, and registered representative. (If someone's telling you how to handle your nest egg, you might ask flat out whether he or she is your fiduciary and walk swiftly to the exit if the answer is no.)
Constructed correctly, the duties of the information fiduciary would be limited enough for the Facebooks and Googles of the world, while meaningful enough to the people who rely on the services, that the intermediaries could be induced to opt into them. To provide further incentive, the government could offer tax breaks or certain legal immunities for those willing to step up toward an enhanced duty to their users. My search results and newsfeed might still end up different from yours based on our political leanings, but only because the algorithm is trying to give me what I want -- the way that an investment adviser may recommend stocks to the reckless and bonds to the sedate -- and never because the search engine or social network is trying to covertly pick election winners.
Four decades ago, another emerging technology had Americans worried about how it might be manipulating them. In 1974, amid a panic over the possibility of subliminal messages in TV advertisements, the Federal Communications Commission strictly forbade that kind of communication. There was a foundation for the move; historically, broadcasters have accepted a burden of evenhandedness in exchange for licenses to use the public airwaves. The same duty of audience protection ought to be brought to today's dominant medium. As more and more of what shapes our views and behaviors comes from inscrutable, artificial-intelligence-driven processes, the worst-case scenarios should be placed off limits in ways that don't trip over into restrictions on free speech. Our information intermediaries can keep their sauces secret, inevitably advantaging some sources of content and disadvantaging others, while still agreeing that some ingredients are poison -- and must be off the table.
What is "toxic" (line 50) about digital gerrymandering?