Discussion Paper 2: The Impact of Social Media Platforms in Elections

The Impact of Platforms in Elections and Potential Regulatory Solutions

Over the last several years, a public discourse has emerged about the effects of the new digital information sphere and its use (and abuse) by individuals, political actors and those who would seek to interfere with democratic electoral processes the world over.

Drawing on Elections Canada's observations of the 2019 general election,54 as well as our review of the legislation and understanding of the advice of experts and evidence from other jurisdictions, the following sections consider these effects in terms of their impact on three critical components of the electoral ecosystem: transparency of messaging; access to reliable information related to the electoral process; and trust in the electoral process. To facilitate discussion, we provide examples of potential avenues for regulation throughout the text.

Transparency

Transparency of the electoral process promotes the accountability of political actors. This happens in two ways:

  1. Transparency permits regulatory and enforcement bodies to determine whether people are breaking the law and to hold them accountable by using the available compliance and enforcement tools.
  2. Transparency also allows citizens to hold political actors accountable in the court of public opinion and at the ballot box.55

Transparency of the source of advertising

The CEA requirements for taglines and ad registries both provide information to electors about who is responsible for ads they may see on platforms. It should be noted, however, that these transparency requirements apply only to advertising according to the CEA definition.

In the context of the Internet, the differentiation between what constitutes advertising—and is thus subject to the tagline and registry requirements—and what does not is set out in Elections Canada Interpretation Note 2015-04, "Election Advertising on the Internet."56 In short, where a message on the Internet has—or would normally have—a placement cost, it constitutes advertising, but where there is no placement cost (such as when a user uploads a YouTube video for free to their channel), it does not constitute advertising.57

The important point is that the current transparency requirements in the CEA apply only to a subset of the messages placed on the Internet, and this only during the election or pre-election period.58

Transparency of the content of advertising

In addition to transparency in terms of who is responsible for advertising, the CEA contains certain requirements having to do with the transparency of the ad content.

When most public messages took the form of widely broadcast television and radio ads, political ads tended to be seen by large audiences. Now, platforms allow political actors to microtarget advertising within and across a number of platforms, meaning that some of their messages are seen by very small audiences. The new CEA provision requiring the creation of ad registries was introduced to address this new reality.

Even with the various ad registries created for the 2019 general election, however, the public had no easy way to search across registries to see all the ads run by a given political actor across all platforms, or to see all the ads targeted to a particular audience. Individuals and civil society organizations seeking to review all the ads run by an advertiser are required to find and look through each platform's ad registry—including platforms they may never have heard of. Further, the public still has no practical way to see all the non-advertising content that political entities distribute through various platforms: organic messages that do not cost money to produce (e.g. a Facebook post on a candidate's page), or messages that do cost money to produce but do not meet the CEA definition of advertising (e.g. a YouTube video).

Experts have proposed stricter requirements for ad registries, such as having to offer data that is easily searchable and machine readable.59 Some experts propose that parties themselves should maintain registries of their ads on all platforms, or indeed of all their messaging.60 Others argue for the creation of a centralized database for all political ads from all sources.61
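
To make these proposals concrete, the following is a minimal sketch, in Python, of what a machine-readable, searchable ad registry record might look like. The schema is an illustrative assumption, not one drawn from the CEA or from any platform's actual registry; every field name and value is hypothetical.

```python
import json
from dataclasses import asdict, dataclass, field
from typing import List

@dataclass
class AdRegistryEntry:
    """Hypothetical record for a machine-readable political ad registry."""
    ad_id: str
    advertiser: str                # political entity responsible for the ad
    platform: str                  # where the ad ran
    first_shown: str               # ISO 8601 dates keep records sortable
    last_shown: str
    spend_cad: float               # disclosed placement cost
    impressions: int
    targeting_criteria: List[str] = field(default_factory=list)
    ad_text: str = ""

# Structured records can be filtered across platforms with ordinary tooling,
# instead of being read page by page in each platform's own interface.
registry = [
    AdRegistryEntry("a-001", "Example Party", "PlatformA", "2019-09-15",
                    "2019-09-20", 1250.00, 48000,
                    ["age 18-34", "region: Ontario"], "Vote for change."),
    AdRegistryEntry("a-002", "Example Party", "PlatformB", "2019-09-16",
                    "2019-09-21", 300.00, 9000,
                    ["interest: environment"], "Read our climate plan."),
]

# Example query: every ad run by one advertiser, across all platforms.
hits = [asdict(e) for e in registry if e.advertiser == "Example Party"]
print(json.dumps(hits, indent=2))
```

Published in a structured form like this, records from different platforms could be merged and queried together, which is essentially what proposals for a centralized database envision.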

Transparency of ad targeting criteria

The CEA obligation for ad registries does not include a requirement for platforms to disclose advertisers' targeting criteria. Mandatory disclosure of this information, which some experts have argued for,62 could improve citizens' understanding of the way parties approach the electorate and how they promote themselves to people with different perceived interests.

Transparency about how platforms moderate and curate content

Platforms' practices related to moderation and curation remain largely unknown to users. Because each user has a unique personalized timeline, and because much organic content cannot be found by searching the respective platform, it is difficult to identify trends and anomalies in the content users see, or to measure its effects.

Platforms regularly shut down inauthentic accounts and remove content that does not meet their policies; in some cases, they announce takedowns of major information operations.63 Yet because platforms do not give detailed information about moderation or curation decisions, it is difficult to know what decisions platforms are making, the basis for those decisions, and whether platforms are applying their rules and policies fairly and consistently.

In considering the impact of platforms on access to reliable information, many experts point to recommendation algorithms as an area deserving greater scrutiny.64 One study conducted during the 2019 general election found that after a user clicked on a single post critical of a candidate, one platform's algorithm delivered dozens of posts in the same vein, filled with disinformation, conspiracy theories and inflammatory memes.65 Experts have proposed various measures to increase platforms' accountability for algorithms. Some, such as the Algorithmic Accountability Act introduced in the United States Congress, propose government oversight of algorithms, including reviews of training data (the data an algorithm "learns" from), design bias and discriminatory outcomes.66
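
As a rough illustration of what a review of discriminatory outcomes could involve, the sketch below compares how often a hypothetical recommender surfaces flagged content to two user groups. The log, group names and disparity threshold are all invented for illustration; an audit under a scheme like the Algorithmic Accountability Act would be far more extensive.

```python
from collections import defaultdict

# Hypothetical recommendation log: (user_group, recommended_item_was_flagged).
# A real audit would draw on a platform's actual recommendation records.
log = [
    ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False),
]

def exposure_rates(events):
    """Share of recommendations, per group, that surfaced flagged content."""
    shown = defaultdict(int)
    total = defaultdict(int)
    for group, flagged in events:
        total[group] += 1
        shown[group] += int(flagged)
    return {group: shown[group] / total[group] for group in total}

rates = exposure_rates(log)
print(rates)  # {'group_a': 0.666..., 'group_b': 0.333...}

# Flag the system for human review if exposure rates diverge by more than an
# (entirely illustrative) threshold.
THRESHOLD = 0.2
print("review needed:", max(rates.values()) - min(rates.values()) > THRESHOLD)
```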

The challenge of achieving meaningful transparency

Some experts point to challenges in achieving enough transparency to deliver democratic outcomes. In the 2016 US presidential election, for example, the Trump campaign ran 50,000 to 60,000 concurrent ads each day,67 making public scrutiny practically impossible. In Canada, spending limits68 reduce the number of ads, but there are likely still too many to allow for meaningful scrutiny: a study of all political and partisan advertising in the ad registry of a single platform, Facebook, during the 2019 general election period found 44,725 ads.69 The notion that transparency leads to accountability presumes that civil society has the capacity to review and critically evaluate tens or hundreds of thousands of ads.

This challenge may suggest that transparency alone is not enough to preserve the objectives of the CEA in this area. As well as challenges created by the sheer volume of content, there are challenges related to its quality. These are discussed in the next section.

Questions to consider:

  • What changes, if any, should be made to the CEA's existing ad registry requirements?
    • Should the registries be expanded to include content that is not an ad, such as organic posts?
    • Should registries be required to provide other kinds of metadata beyond who posted the ad, such as its cost and/or targeting criteria?
    • Who should be responsible for maintaining them? Why?
  • What regulation, if any, should there be around the targeting of political ads?
  • Should the use of algorithms in data-driven digital advertising be regulated? If so, how?

Civic literacy and fact-checking interventions

Many governments, civil society groups and social media platforms have launched digital, news or civic literacy programs, some with a fact-checking component.

Platforms regularly announce or adapt their fact-checking initiatives and policies in response to particular events such as elections or the COVID-19 pandemic. At the time of writing, one notable initiative among many is Facebook's "fact-checking program," which involves collaboration with contracted fact checkers. In some cases, this involves reducing the reach of certain types of content deemed inaccurate and/or displaying accurate content on topics that are often presented inaccurately.70 Twitter has considered applying clear warning labels to harmful misinformation posted by public figures, while Reddit relies on volunteer moderators to fact-check content.71

In addition, many governments fund research and civic journalism to address the "substantial decline in civic and accountability journalism ... [and the undermining of] the century-old business model of newsroom journalism"72 attributed to the rise of content-aggregating and digital advertising platforms. Canada's federal government has funded digital, news and civic literacy programming.73

While literacy and fact-checking initiatives go some distance toward addressing the threat of disinformation and misinformation online, critics have noted that such efforts may not be enough. They argue that those who would benefit most from civic literacy lessons are least likely to receive them.74 Likewise, many users who are exposed to inaccurate information are unlikely to see the relevant fact-checked version. Further, such individual interventions do not address the structural factors that fundamentally affect what information users see and their ability to assess its veracity.75 Lastly, platforms' fact-checking efforts rely on volunteers and partnerships with third parties, some of whose commitment to accurate information has been called into question. For example, one platform faced criticism for granting fact-checker status to a site accused of spreading slanted and inaccurate information.76

Access to reliable information related to the electoral process

To participate meaningfully in the electoral process, voters need access to reliable information. They need to be able to access accurate information about when, where and ways to register and vote, as well as reliable information about their political options so they can meaningfully weigh their decision and cast their vote.

Reliable information about voting

To be able to exercise their democratic right to vote, voters need at a minimum to know how to do so. This requires accessing accurate and reliable information about when, where and ways to register and vote. Part of Elections Canada's mission is to ensure that Canadians can exercise their democratic right to vote; having accurate information about registration and voting, and ensuring that this information is properly communicated, is at the core of the agency's function.

Malign actors who want to suppress votes may spread inaccurate information about the voting process. Such information is also sometimes spread by well-meaning people who have no intention of suppressing votes.

In contrast to traditional media, outside observers cannot see everything that circulates through social media, given the scale, speed and targeting of individual users that those platforms make possible. This means that false or misleading information about how to vote can spread widely on social media, undetected by entities concerned with ensuring that voters can exercise their democratic right.

During the 2019 general election, Elections Canada monitored closely for inaccurate information on the voting process. We detected few instances, and those we did see seemed to have little reach. Most such content appeared to be posted by Canadians making honest mistakes; a few posts made by media or candidates were later corrected by their authors.

On election day Elections Canada saw, on a few platforms, variations of a post stating that supporters of some parties should vote at a later date. While they may have been intended as a joke, these posts had the potential to suppress votes, particularly among those already facing barriers to voting. The agency flagged the posts to the platforms for their consideration and removal, given that such posts violate platforms' community standards.77

In recent months, in the context of elections in other jurisdictions, notably the United States, many platforms have made further commitments to monitor for and remove inaccurate information about when, where and ways to register and vote.78 We may see continued success in limiting this type of inaccurate information if all platforms monitor proactively for it and remove it.

However, the move toward closed and private social media spaces presents challenges for detecting harmful information spreading online. The abundance of languages spoken in Canada likewise presents challenges for monitoring public spaces online. Both mean that harmful information could spread widely without being detected and addressed.

During the 2019 general election and since then, Elections Canada has observed that discourse on social media platforms related to voting often focuses on the perceived integrity of the electoral process itself. Some of this discourse includes false or misleading claims that the election process has been interfered with or otherwise lacks integrity. Unreliable information on the integrity of the electoral process can harm public trust in the process and confidence in the results it delivers. That topic is discussed in more detail in the section on trust below.

Private social media spaces and messaging services

As concerns about privacy and the use of social media platforms have grown, there has been a shift to more private and group-oriented communication online, such as WhatsApp or Facebook Groups, as well as growth in the use of ephemeral (short-lived) communications such as those offered by Instagram Stories and Snapchat.79 Canadians are increasingly sharing information in closed social media groups and in private messaging apps.80

Critics argue that the shift to closed spaces and private communications presents challenges for social media platforms because they may not have the capacity to monitor for content breaking "the social network's rules against hate speech, harassment and other ills" that could spread in these spaces.81 Nor can outside observers have access to these private spaces to know whether policies are being enforced.

Apart from monitoring, platforms could make, and are making, other interventions in private messaging. For instance, in April 2020, WhatsApp introduced new limits on the forwarding of viral messages to reduce the spread of potentially harmful, inaccurate or misleading information in private conversations.82 This action was taken in response to misinformation about the novel coronavirus spreading on the platform. WhatsApp began experimenting with forwarding limits in 2018, after rumours that spread virally on the platform were linked to mob violence in India.83
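
The reported behaviour of such a limit can be expressed as a simple rule attached to message metadata: a message that has already been forwarded many times may be passed on to only one chat at a time. The sketch below models that rule in Python; the class, function and threshold of five are illustrative (the threshold reflects news coverage of the April 2020 change), not WhatsApp's actual implementation.

```python
from dataclasses import dataclass
from typing import List

# Chain length reported in coverage of the April 2020 change; illustrative.
HIGHLY_FORWARDED = 5

@dataclass
class Message:
    text: str
    forward_count: int = 0  # times the message has already been forwarded on

def forward(message: Message, recipient_chats: List[str]) -> bool:
    """Apply the forwarding rule before sending to the requested chats."""
    if message.forward_count >= HIGHLY_FORWARDED and len(recipient_chats) > 1:
        # "Viral" messages may go to only one chat at a time, which slows
        # their spread without blocking them outright.
        return False
    message.forward_count += 1
    for chat in recipient_chats:
        print(f"forwarded to {chat}: {message.text!r}")
    return True

viral = Message("Unverified claim ...", forward_count=5)
print(forward(viral, ["chat1", "chat2"]))  # False: limited to one chat at a time
print(forward(viral, ["chat1"]))           # True: a single forward is allowed
```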

Reliable information about voters' options

For voters to be able to make a considered voting decision, it is important for them to know the range of political options on offer and to have reliable information they can use to make their choices. For voters to be able to know, as fully as possible, what their options are, political actors need a fair opportunity to make their platforms known and a level playing field on which to compete.

The CEA contains various provisions that support the objective of fairness. Some of these provisions address the uneven impact that can result from the use of money: spending limits and refunds for a portion of some parties' and candidates' election expenses, for example, reduce the potential for political actors with more money to dominate the electoral discourse.

The CEA also includes measures to ensure that electors have the opportunity to hear from a range of candidates and parties. For example, the CEA accords all candidates and parties access to the lists of electors so they can campaign84 and allows candidates access to multi-resident buildings and public places for canvassing.85 It also provides free broadcasting time and requires paid broadcasting time to be provided at low and equitable rates to ensure that all registered political parties have an opportunity to make their platforms known to Canadians.86 These provisions benefit voters by helping to ensure that they know and can evaluate their options so they can make an informed choice.

Malign actors can use social media platforms to impede voters' ability to make a considered choice by spreading false or misleading information about opponents or by using deceptive tactics to suppress voices or divert electors' attention.

Malign actors can misrepresent their opponents and their policies to dissuade voters from supporting them. Elections Canada did not monitor for this type of inaccurate information during the 2019 general election, since the agency's role is not to assess the accuracy of political claims. (We note that political entities tend to monitor their opponents' campaigns and hold one another accountable for honesty and integrity.)

While the CEA has narrow prohibitions on certain types of false statements about candidates and people associated with parties, and on using a pretence or contrivance to influence electors, regulating how political actors describe their opponents can be difficult, as it risks limiting legitimate political expression. That said, the volume and speed at which false claims can spread on social media pose a challenge that did not exist with traditional media.87

In extreme cases, malign actors use false identities, bots or paid trolls to "game" platforms' organic content and ad algorithms, artificially amplifying or suppressing voices to achieve strategic goals.88 Intelligence agencies, researchers and platforms themselves continue to detect coordinated digital interference in elections by a range of actors, including states.89 While platforms have added authentication requirements for users and political advertisers, malign actors have found ways to circumvent them by hiding behind fabricated or stolen identities or rented accounts.90 Platforms continue to remove billions of fake accounts each year; estimates of fake regular monthly users on popular platforms range from 5% to 20%.91

Harassment of political actors and voters on social media can also affect candidates' and voters' participation in the electoral process and voters' ability to hear from a diversity of candidates. There were indicators of networked online harassment during the 2019 general election. Researchers say such harassment can deny some politicians—disproportionately, women and people of colour—an equitable opportunity to inform voters of their policy platforms: their messages may be drowned out, or they may be driven offline or out of a race92 as a result of threats to their safety. To illustrate the vitriolic messages that female politicians receive, a member of Quebec's National Assembly read into the record some messages she had received; they included "kill yourself" and "if I were your son or daughter, I would be ashamed of you."93

In the 2016 US presidential election, malign actors posing as African-Americans worked actively to depress that community's vote through suggestions that voting does not matter and that no candidate had their interests at heart.94 This is an example of a foreign state–sponsored effort to affect or narrow voters' choices by influencing which voting option they choose and why.

Malign actors can also use platforms to deliberately "flood the zone" with dubious and contradictory content: "[g]iven our finite attention, flooding social media with junk is a way to suppress the free exchange of ideas."95 "Flooding the zone" means that voters who are on social media may end up being distracted or diverted away from relevant content that contributes to making informed decisions.96

Research shows that the architecture of the platforms may impede a fair hearing of diverse viewpoints by reinforcing polarization and affecting the information voters consume. Choices made by users and algorithms create a spiral that feeds one-dimensionality and manifests itself in online social networks that are divided along ideological lines.97 Viewpoints can go unchallenged and spread further among users in "filter bubbles"—in which "algorithms inadvertently amplify ideological segregation by automatically recommending content an individual is likely to agree with"98—and "echo chambers," online environments where members are largely exposed only to opinions confirming their own.99
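
A toy model makes this feedback loop concrete: if a recommender simply ranks items by closeness to what a user has engaged with before, the user's feed settles into one ideological neighbourhood and opposing viewpoints never surface. Everything in the sketch below, from the item names to the stance scores, is invented for illustration.

```python
# Each item carries an invented "stance" score in [-1, 1].
items = {
    "far_left_post": -0.9, "left_post": -0.4, "centre_post": 0.0,
    "right_post": 0.4, "far_right_post": 0.9,
}

def recommend(history, catalogue, k=2):
    """Return the k items whose stance is closest to the user's average so far."""
    avg = sum(history) / len(history)
    return sorted(catalogue, key=lambda name: abs(catalogue[name] - avg))[:k]

history = [0.4]  # the user engaged once with a mildly right-leaning post
for step in range(3):
    picks = recommend(history, items)
    print(step, picks)               # the same side of the spectrum every time
    history.append(items[picks[0]])  # clicking the top pick closes the loop
```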

Questions to consider:

  • Should there be regulation to require all digital and social media platforms to delete inaccurate or misleading content on where, when and ways to vote? If so, what sort of regulation?
  • How should Elections Canada work with other stakeholders (platforms, regulated actors, civil society groups, researchers) who may be involved in this field?
  • Should digital and social media platforms have legal obligations to report publicly on any accounts or content they have removed? If so, exactly what information should they be required to report publicly?
  • Should further measures be taken by social media platforms to reduce the spread of potentially harmful inaccurate and/or misleading information in private spaces?
  • What are the risks for elections administrators such as Elections Canada in using digital and social media platforms to reach electors? What mitigating measures could be adopted to manage these risks?

Trust in the electoral process

The final element of the electoral ecosystem affected by digital and social media platforms is trust. Without trust in the electoral process, electors are less likely to cast a ballot, believe the results are true or consider election winners legitimate.100 One of the primary objectives of the CEA is to ensure that Canadians have trust in the results of elections and the process that led to those results. Canadians may not succeed in electing the representatives or government they all want, but it is nonetheless crucially important that they have trust in the process by which those representatives and that government are elected.

The CEA seeks to contribute to this environment of trust by creating an open and public process for elections according to clear and consistently applied rules. A primary role of the Chief Electoral Officer, stemming from the creation of this office a century ago, is to standardize the application of the rules for federal elections across the country, which contributes to reinforcing the trust of Canadians in the electoral process.

The 2019 general election was Canada's "most online" election, with social media platforms offering a means for millions of Canadians to voice their support for or opposition to candidates or parties.101 Inaccurate and misleading content online can reduce citizens' trust in the electoral process.

Canadians generally have a high level of trust in Elections Canada and the electoral process.102 However, recent research shows that Canadians' trust in governments, media, leaders, corporations and even not-for-profit organizations is declining.103 During and since the 2019 general election, Elections Canada observed expressions of distrust in the election process and its outcome. Given that the erosion of trust occurs over time, these early indications are cause for concern.

While scrutiny and criticism of Elections Canada's administration of elections are legitimate and welcome, a distinction must be made between unsupported expressions of distrust on the one hand and constructive feedback on the other. The agency benefits from users who contact it through social media to deliver feedback on its services. The agency does not consider such feedback to have the potential to negatively affect the public's trust in the electoral process or election results.

That said, unsupported expressions of distrust containing false or misleading information or based on incorrect assumptions have the potential to harm trust in the electoral process. Elections Canada observed such posts on social media falling into three categories:

  • posts deliberately misrepresenting or exaggerating real but isolated incidents
  • posts containing outright fabrications
  • posts seemingly based on confusion about the electoral process

A notable example in the second category was a fabricated story posted on the first day of advance polls, in which a user alleged that they had witnessed ballot tampering ("smudging" of pencil marks) and "chaos" at a Toronto-area polling place. Elections Canada immediately looked into the matter and determined that the allegations had no basis in fact. In the days that followed, the agency observed the original story and many variants of it spreading across several platforms. Elections Canada staff learned anecdotally that the fabricated story was also being shared on a private messaging app and in person at community gathering places. Later, a journalist covered the false allegation,104 ultimately debunking it but nevertheless illustrating the wide reach of social media. The incident demonstrated how quickly and easily fabricated information can spread among platforms and to other channels.

In the third category, the agency detected posts alleging that Elections Canada must not have counted votes cast at advance polls, since it did not provide voting results right after advance polls closed; some users evidently did not realize that these ballots are counted and reported on election night.

The posts cited above expressed distrust based on specific concerns noted by users. Elections Canada also saw expressions of distrust that were more vague; in these posts, users made general accusations that the agency is "corrupt" or that the election process is "rigged."

While Elections Canada kept a close eye on these sorts of expressions of distrust featuring false or misleading information about the integrity of the process, the agency did not flag these posts to the social media platforms, as the posts did not violate the platforms' policies at the time.

Social media posts such as these—which are not based in fact—raise concern as they can harm trust in the electoral process. Some experts examining the general decline in trust in institutions seen in many Western democracies have posited that social media platform use contributes to this trend by reducing citizens' access to reliable information and increasing their access to inflammatory content.105

Questions to consider:

  • What additional role, if any, should Elections Canada play to build trust in elections and democracy? What role should other actors play in this area?

Footnotes

Footnote 54 We rely in part on the monitoring that Elections Canada staff did leading up to, during and after the 2019 election. Our team searched keywords on open platforms to find inaccurate information about the electoral process, particularly about when, where and ways to register and vote, feedback on our services, misuse of Elections Canada's identity and indicators of incidents, such as power outages, that could impede election delivery. Our focus was on finding items that could impact electors' access to the vote so we could remedy the situation quickly through operational and/or communications responses. We were not looking for CEA violations.

Footnote 55 Philippe C. Schmitter and Terry Lynn Karl, "What Democracy Is... and Is Not," Journal of Democracy (Summer 1991): 8. https://www.ned.org/docs/Philippe-C-Schmitter-and-Terry-Lynn-Karl-What-Democracy-is-and-Is-Not.pdf

Footnote 56 Elections Canada, "Election Advertising on the Internet," Written Opinions, Guidelines and Interpretation Notes: 2015-04, July 2015. https://elections.ca/content.aspx?section=res&dir=gui%2Fapp%2F2015-04&document=index&lang=e

Footnote 57 Ibid.

Footnote 58 This issue is canvassed in greater detail in Elections Canada's companion discussion paper:

Political Communications in the Digital Age, Discussion Paper 1: The Regulation of Political Communications Under the Canada Elections Act​, Elections Canada, May 2020.

Footnote 59 Mozilla, "Facebook and Google: This Is What an Effective Ad Archive API Looks Like," Press release, March 27, 2019. https://blog.mozilla.org/blog/2019/03/27/facebook-and-google-this-is-what-an-effective-ad-archive-api-looks-like/

Footnote 60 Netherlands Ministry of the Interior, "Response to the Motion for Complete Transparency in the Buyers of Political Advertisements on Facebook," Government of the Netherlands, 2019, as cited in Paddy Leerssen, Jeff Ausloos, Brahim Zarouali, Natali Helberger and Claes H. de Vreese, "Platform Ad Archives: Promises and Pitfalls," Internet Policy Review 8, no. 4 (2019): 3. Elections Canada maintains a repository of communication materials it publishes, including on social media. During the election period, the repository received 163,000 visits. See https://www.elections.ca/content.aspx?section=res&dir=pca&document=index&lang=e/

Footnote 61 Solon Barocas, "The Price of Precision: Voter Microtargeting and Its Potential Harms to the Democratic Process," PLEAD '12: Proceedings of the First Edition Workshop on Politics, Elections, and Data (November 2012): 31–36. https://www.researchgate.net/publication/266653119_The_price_of_precision_Voter_microtargeting_and_its_potential_harms_to_the_democratic_process/

Footnote 62 Mozilla, "Facebook and Google: This Is What an Effective Ad Archive API Looks Like," Press release, March 27, 2019. https://blog.mozilla.org/blog/2019/03/27/facebook-and-google-this-is-what-an-effective-ad-archive-api-looks-like/

Footnote 63 See, for example, Nathaniel Gleicher, "Removing Coordinated Inauthentic Behavior from Georgia, Vietnam and the US," Facebook, December 20, 2019. https://about.fb.com/news/2019/12/removing-coordinated-inauthentic-behavior-from-georgia-vietnam-and-the-us/; Makena Kelly, "Facebook and Twitter Shutter Pro-Trump Network Reaching 55 Million Accounts," The Verge, December 20, 2019, https://www.theverge.com/2019/12/20/21031823/facebook-twitter-trump-network-epoch-times-inauthentic-behavior/

Footnote 64 Standing Committee on Access to Information, Privacy and Ethics, "Democracy Under Threat: Risks and Solutions in the Era of Disinformation and Data Monopoly," Report of the Standing Committee on Access to Information, Privacy and Ethics, December 2018, 42nd Parliament, 1st Session, 39–40. https://www.ourcommons.ca/DocumentViewer/en/42-1/ETHI/report-17/

Footnote 65 Kanishk Karan and John Gray, "Trudeaus and Trudeaun'ts: Memes Have an Impact during Canadian Elections," DFRLab, November 19, 2019. https://medium.com/dfrlab/trudeaus-and-trudeaunts-memes-have-an-impact-during-canadian-elections-4c842574dedc/

Footnote 66 Taylor Owen, "The Case for Platform Governance," Centre for International Governance Innovation Papers Series, Paper No. 231, November 4, 2019, 13. https://www.cigionline.org/publications/case-platform-governance/. Interestingly, the Government of Canada has issued a Directive on Automated Decision-Making, which applies to the use of artificial intelligence in service delivery decision making by Government of Canada entities. The objective of the Directive is that automated decision systems are used by the government in a way that "leads to efficient, accurate, consistent, and interpretable decisions made pursuant to Canadian law." The Directive does not apply to agents of Parliament, including Elections Canada. Government of Canada, "Directive on Automated Decision-Making," Treasury Board Secretariat, 2019. https://www.tbs-sct.gc.ca/pol/doc-eng.aspx?id=32592/

Footnote 67 Jacquelyn Burkell and Priscilla M. Regan, "Voter Preferences, Voter Manipulation, Voter Analytics: Policy Options for Less Surveillance and More Autonomy," Internet Policy Review 8, no. 4 (2019): 7.

Footnote 68 CEA, infra, ss. 350(1)(b), 429.1 and 430(1). Ad registries were created by Bell Media, CBC/Radio-Canada, Facebook, The Globe and Mail, Postmedia and Rogers.

Footnote 69 Damon McCoy, "Facebook Advertising during the Canadian Federal Election," presented at Digital Ecosystem Research Challenge, Ottawa, Ontario, February 21, 2020. For more information on this study, see https://www.digitalecosystem.ca/report/

Footnote 70 Facebook, "Hard Questions: What's Facebook's Strategy for Stopping False News?" May 23, 2018. https://about.fb.com/news/2018/05/hard-questions-false-news/

Footnote 71 Shirin Ghaffary, "Twitter Is Considering Warning Users When Politicians Post Misleading Tweets," Vox, February 20, 2020. https://www.vox.com/recode/2020/2/20/21146039/twitter-misleading-tweets-label-misinformation-social-media-2020-bernie-sanders/; Lulu Garcia Navarro, "Managing Misinformation on Reddit," NPR, December 8, 2019. https://www.npr.org/2019/12/08/786039738/managing-misinformation-on-reddit/

Footnote 72 Nushin Rashidian et al., "Friend and Foe: The Platform Press at the Heart of Journalism," Tow Center for Digital Journalism, 2018, and Robert W. McChesney and Victor Pickard, eds., Will the Last Reporter Please Turn Out the Lights: The Collapse of Journalism and What Can Be Done to Fix It, New York, NY: The New Press, 2011, quoted in Taylor Owen, "The Case for Platform Governance," CIGI Papers No. 231 (November 2019), 11. https://www.cigionline.org/sites/default/files/documents/Paper%20no.231web.pdf

Footnote 73 Rachel Aiello, "Feds unveil plan to tackle fake news, interference in 2019 election," CTV News, February 27, 2019. https://www.ctvnews.ca/politics/feds-unveil-plan-to-tackle-fake-news-interference-in-2019-election-1.4274273/; Government of Canada, "Digital Citizen Initiative," August 30, 2019. https://www.canada.ca/en/canadian-heritage/services/online-disinformation.html/

Footnote 74 Samarth Bansal, "The Patchwork of Policy Working to Fend off Misinformation," Centre for International Governance Innovation, October 4, 2019. https://www.cigionline.org/articles/patchwork-policy-working-fend-misinformation/. Media literacy is closely correlated with income/education, meaning that those who "need" it most are least likely to get it; see Samara Centre for Democracy, "Investing in Canadians' Civic Literacy: An Answer to Fake News and Disinformation," January 30, 2019. https://www.samaracanada.com/docs/default-source/reports/investing-in-canadians-civic-literacy-by-the-samara-centre-for-democracy.pdf?sfvrsn=66f2072f_4/

Footnote 75 Samara Centre for Democracy, "Investing in Canadians' Civic Literacy: An Answer to Fake News and Disinformation," The Samara Centre for Democracy, January 30, 2019, 3, 8. https://www.samaracanada.com/docs/default-source/reports/investing-in-canadians-civic-literacy-by-the-samara-centre-for-democracy.pdf?sfvrsn=66f2072f_4/

Footnote 76 Judd Legum, "The Daily Caller Uses Status as Facebook Fact-checker to Boost Trump," Popular Information, March 3, 2020. https://popular.info/p/the-daily-caller-uses-status-as-facebook/

Footnote 77 In a few instances, Elections Canada flagged inaccurate posts to social media platforms for their consideration. Between August and October 31, the agency flagged 28 instances of inaccurate information and/or impersonation on social media platforms. The platforms reviewed the posts in light of their terms of service and in some cases removed the posts or accounts.

Footnote 78 Many announcements are in anticipation of and in response to activities related to elections, notably the 2020 US presidential election. See, for example, Facebook's October 2019 announcement about the steps the platform is taking "to help protect" the 2020 US elections: https://about.fb.com/news/2019/10/update-on-election-integrity-efforts/; for Twitter, see https://about.twitter.com/en_us/advocacy/elections-integrity.html#us-elections/

Footnote 79 Sara Fischer, "Privacy Concerns Push People to Private, Group-based Platforms," Axios, February 12, 2019. https://www.axios.com/privacy-concerns-push-people-to-private-group-platforms-f8561226-4cb4-4def-b11a-031edf590dba.html/

Footnote 80 J. Clement, "Social Networking in Canada – Statistics & Facts," Statista.com, September 18, 2019. https://www.statista.com/topics/2729/social-networking-in-canada/ and https://www.statista.com/forecasts/998512/social-media-activities-in-canada. As a response to increased scrutiny following the misuse of Facebook data by Cambridge Analytica, the platform has begun touting "private" groups as the future of social networking on Facebook—however, such groups can concentrate misinformation. See Elizabeth Dwoskin, "Facebook says private groups are its future. Some are hubs for misinformation and hate," The Washington Post, July 5, 2019. https://www.washingtonpost.com/technology/2019/07/05/facebook-says-private-groups-are-its-future-some-are-hubs-misinformation-hate/; Kurt Wagner, "This Could Be the Beginning of the End for Facebook's Social Network," Vox.com, March 7, 2019. https://www.vox.com/2019/3/7/18254298/facebook-private-messaging-zuckerberg-questions-social-network-dying/; Nic Newman, Richard Fletcher, Antonis Kalogeropoulos and Rasmus Kleis Nielsen, "Reuters Institute Digital News Report 2019," Reuters Institute for the Study of Journalism, 2019, 18. https://reutersinstitute.politics.ox.ac.uk/sites/default/files/inline-files/DNR_2019_FINAL.pdf

Footnote 81 Elizabeth Dwoskin, "Facebook says private groups are its future. Some are hubs for misinformation and hate," The Washington Post, July 5, 2019. https://www.washingtonpost.com/technology/2019/07/05/facebook-says-private-groups-are-its-future-some-are-hubs-misinformation-hate/

Footnote 82 Casey Newton, "WhatsApp puts new limits on the forwarding of viral messages," The Verge, April 7, 2020. https://www.theverge.com/2020/4/7/21211371/whatsapp-message-forwarding-limits-misinformation-coronavirus-india/

Footnote 83 Ibid.

Footnote 84 CEA, infra, s. 93.

Footnote 85 CEA, infra, ss. 81 and 81.1.

Footnote 86 CEA, infra, ss. 335–348.

Footnote 87 Hunt Allcott and Matthew Gentzkow, "Social Media and Fake News in the 2016 Election," Journal of Economic Perspectives 31, no. 2 (2017): 211–35.

Footnote 88 Communications Security Establishment, "Cyber Threats to Canada's Democratic Process" (2017), 5, 28. https://cyber.gc.ca/sites/default/files/publications/cse-cyber-threat-assessment-e.pdf; Communications Security Establishment, "2019 Update: Cyber Threats to Canada's Democratic Process" (2019), 9, 13. https://cyber.gc.ca/sites/default/files/publications/tdp-2019-report_e.pdf

Footnote 89 Ibid.

Footnote 90 Michael Schwirtz and Sheera Frenkel, "In Ukraine, Russia tests a new Facebook tactic in election tampering," The New York Times, March 29, 2019. https://www.nytimes.com/2019/03/29/world/europe/ukraine-russia-election-tampering-propaganda.html/

Footnote 91 In November 2019, Facebook announced that it had removed 5.4 billion fake accounts in 2019, an increase over the 3.3 billion fake accounts removed in 2018. In mid-2019, Facebook estimated that 5% of its current active monthly users are fake; however, outside observers estimate that figure to be much higher—potentially as high as 20%. It has been estimated that 8% or 9% of Instagram accounts are fake. From May to July 2018, Twitter shut down approximately 70 million fake or suspicious accounts. At the time, this represented approximately 15% of Twitter's total user base. Matt Davis, "Billions of Fake Accounts: Who's Messaging You on Facebook?" Big Think, November 20, 2019. https://bigthink.com/politics-current-affairs/facebook-banned-accounts/; Reed Albergotti and Sarah Kuranda, "Instagram's Growing Bot Problem," The Information, July 18, 2018. https://www.theinformation.com/articles/instagrams-growing-bot-problem/; Instascreener, "Quantifying Fake Accounts and Inauthentic Engagements on Instagram," Instascreener, June 2019. https://instascreener.com/blog/quantifying-fake-accounts-inauthentic-engagements/; BBC News, "Twitter 'shuts down millions of fake accounts,'" July 9, 2018. https://www.bbc.com/news/technology-44682354/

Footnote 92 For example, Kim Weaver, an Iowa Democrat, dropped out of a US House of Representatives race in 2017, citing threats to her safety as one of the primary reasons. See Molly Longman and Jason Noble, "Kim Weaver withdraws her candidacy in Iowa's 4th District race for Congress," Des Moines Register, June 3, 2017. https://www.desmoinesregister.com/story/news/2017/06/03/kim-weaver-withdraws-her-candidacy-iowas-4th-district-race-congress/368389001/

Footnote 93 Janie Gosselin, « Violence en ligne contre les femmes : des messages crus lus à Québec », La Presse, November 28, 2019. https://www.lapresse.ca/actualites/politique/201911/28/01-5251538-violence-en-ligne-contre-les-femmes-des-messages-crus-lus-a-quebec.php/

Footnote 94 Scott Shane and Sheera Frenkel, "Russian 2016 influence operation targeted African-Americans on social media," The New York Times, December 17, 2018; Paul M. Barrett, "Tackling Domestic Disinformation: What the Social Media Companies Need to Do," NYU Stern Center for Business and Human Rights, February 22, 2019, 5. https://issuu.com/nyusterncenterforbusinessandhumanri/docs/nyu_domestic_disinformation_digital/

Footnote 95 Paul M. Barrett, "Tackling Domestic Disinformation: What the Social Media Companies Need to Do," NYU Stern Center for Business and Human Rights, February 22, 2019. https://issuu.com/nyusterncenterforbusinessandhumanri/docs/nyu_domestic_disinformation_digital/

Footnote 96 Ibid.

Footnote 97 Balázs Bodó, Natalie Helberger, Sarah Eskens and Judith Möller, "Interest in Diversity: The Role of User Attitudes, Algorithmic Feedback Loops, and Policy in News Personalization," Digital Journalism 7, no. 2 (2019): 206–29, at 207, as cited in Jacquelyn Burkell and Priscilla M. Regan, "Voter Preferences, Voter Manipulation, Voter Analytics: Policy Options for Less Surveillance and More Autonomy," Internet Policy Review 8, no. 4 (2019): 6; Hunt Allcott and Matthew Gentzkow, "Social Media and Fake News in the 2016 Election," Journal of Economic Perspectives 31, no. 2 (2017): 211–35, at 221.

Footnote 98 Eli Pariser, The Filter Bubble: What the Internet Is Hiding from You (London: Penguin, 2011), as cited in Seth Flaxman, Sharad Goel and Justin M. Rao, "Filter Bubbles, Echo Chambers, and Online News Consumption," Public Opinion Quarterly 80 (2016): 298–320, at 299.

Footnote 99 C. Thi Nguyen, "Echo Chambers and Epistemic Bubbles," Episteme (2018): 1–21; Cass R. Sunstein, Republic 2.0 (Princeton, NJ: Princeton University Press, 2009) and Eli Pariser, The Filter Bubble: What the Internet Is Hiding from You (London: Penguin, 2011), as cited in Seth Flaxman, Sharad Goel and Justin M. Rao, "Filter Bubbles, Echo Chambers, and Online News Consumption," Public Opinion Quarterly 80 (2016): 298–320, at 299. It is worth noting that "filter bubbles" and "echo chambers" are contested concepts that are not settled in the scholarship.

Footnote 100 Jeffrey A. Karp, Alessandro Nai and Pippa Norris, "Dial 'F' for Fraud: Explaining Citizens' Suspicions about Elections," Electoral Studies 53, no. 2 (2018): 11–19, at 17; Sarah Birch, "Perceptions of Electoral Fairness and Voter Turnout," Comparative Political Studies 43, no. 12 (December 2010): 1601–22, at 1615–16.

Footnote 101 Tiffany Lizée, "Social media plays major role in 2019 federal election," Global News, October 21, 2019. https://globalnews.ca/news/6060008/social-media-federal-election/

Footnote 102 Following the 2015 federal election, 92% of surveyed voters felt that Elections Canada ran the election fairly, and 92% had a high level of trust in the accuracy of the results (Elections Canada, "Survey of Electors Following the 42nd General Election." https://www.elections.ca/content.aspx?section=res&dir=rec/eval/pes2015/surv&document=p11&lang=e). The level of trust in elections in Canada (71%) is higher than in other OECD countries, including the United Kingdom (65%), Denmark (55%), France (54%) and the United States (37%). Gallup, "Gallup World Poll" (2017), cited in Keith Neuman, "Canadians' Confidence in National Institutions Steady," Policy Options (2018). https://policyoptions.irpp.org/magazines/august-2018/canadians-confidence-in-national-institutions-steady/

Footnote 103 Proof Inc., "CanTrust Index 2019." https://www.getproof.com/thinking/the-proof-cantrust-index/. Between 2016 and 2019, Canadians' trust in news media declined by 14%, trust in governments declined by 4% and trust in leaders declined by 10%. In 2019, only half of Canadians found the electoral system to be "fair and representative."

Footnote 104 Ashley Burke, "Elections Canada tried to beat back 'implausible' online rumours about pencils spoiling ballots," CBC News, November 9, 2019. https://www.cbc.ca/news/politics/disinformation-pencil-smudging-ballot-election-2019-1.5353018/

Footnote 105 Jennifer Kavanagh and Michael D. Rich, Truth Decay: An Initial Exploration of the Diminishing Role of Facts and Analysis in American Public Life (Santa Monica, CA: RAND Corporation, 2018). https://www.rand.org/pubs/research_reports/RR2314.html/; Uri Friedman, "Trust Is Collapsing in America," The Atlantic, January 21, 2018. https://www.theatlantic.com/international/archive/2018/01/trust-trump-america-world/550964/