Hussman’s Barrett and Kreiss join Stanford expert panel on tech and 2020
by Barbara Wiedemann
UNC Hussman School of Journalism and Media Associate Professor Daniel Kreiss and Ph.D. student Bridget Barrett were two of six panelists at a December conference, “Digital Technology, Social Media and the 2020 Presidential Election,” held by Stanford University’s Cyber Policy Center (CPC) to explore the effect of digital technologies on the 2020 election. Participants from major research centers and non-profits reviewed their most important takeaways from the 2020 election. From disinformation to supporting local election officials during a pandemic to political advertising, the day covered the biggest issues in digital politics and gave the audience a sense of the complexities of running a secure election.
The Hussman political communication experts addressed a virtual audience of 250 from around the country. Kreiss and Barrett are also affiliated with UNC-Chapel Hill’s Center for Information, Technology, and Public Life (CITAP). They directed attendees to their research at citapdigitalpolitics.com for details of their findings.
CPC Executive Director Kelly Born hosted the conference at a brisk pace, allowing each presenter to share research findings followed by a short Q&A session.
An in-depth panel discussion with high-level representatives from Google, Facebook, Twitter and TikTok closed the conference.
Nathaniel Persily, CPC’s faculty co-director and the James B. McClatchy Professor of Law at Stanford Law School, began with his work with healthyelections.org, an academically-grounded effort to help local election officials adapt to the pandemic and provide a safe and secure 2020 Presidential election.
“We forget where we were last spring,” he reminded attendees, “when it looked like the chance of a meltdown existed.”
“Election officials really stepped up,” said Persily, noting that with $400 million from Congress and generous philanthropic money to sustain and buttress that spending, the U.S. was able to handle the most significant change in national election infrastructure in the shortest period of time ever. He cited an unprecedented level of early voting, minimal problems with voting by mail, very few polling place problems, and record-high turnout overall. The resulting paradox: a very successful national election across 10,000 different election jurisdictions, which nevertheless resulted in more than 400 legal cases challenging nearly every aspect of the election process.
A question about future federal elections brought the focus back to funding.
“We cannot have a bake sale for democracy every four years,” said Persily. “We can’t have private philanthropy kicking in half a billion dollars all told.”
Erika Franklin Fowler, director of the Wesleyan Media Project, helps track and analyze political advertising in real-time during elections. She reviewed a blizzard of recent data her team has amassed by tracking political advertising spending in the 2020 election, reminding participants that the $14 billion in overall spending on U.S. elections this year dwarfs the roughly $6 billion spent in 2012 and 2016.
Fowler and her research team documented fewer attack ads and more promotional ads this cycle than in recent prior elections. Surprisingly to some, her project observed more negative ads on television than on digital media in 2020, and more negative ads from congressional campaigns than from the presidential campaigns.
Her team continues to dig into the data for more granular insights — for example, that the presidential campaigns approached Spanish-language advertising on Facebook differently: the Trump campaign used it heavily in mid-October, while the Biden campaign used it more as the election approached.
Laura Edelson launched the NYU Ad Observatory with Associate Professor Damon McCoy. The team builds tools to collect and archive online political advertising data and makes that data available publicly — with a focus on Facebook ads.
“Our focus is on this [false or misleading political ads] as a cybersecurity problem to solve,” said Edelson.
The project, which uses user-submitted data in addition to the data Facebook provides to the public, has met with pushback from Facebook, which recently threatened legal action if the team does not stop its work.
The researchers found that the vast majority of digital political ad spending reflects normal behavior by honest actors, but they also saw a great deal of content denigrating mail-in voting (which Facebook prohibited in the second week of October) and the democratic system as a whole, which was also banned but tougher to track.
“It was coming from institutions and offices with a lot of historical trust,” said Edelson, like President Trump and other candidates. Facebook’s policy of carving out candidates’ ability to run false content in ads exacerbated the problem, she said.
The problems pointed the way to what Edelson called “some easy wins for the future.”
“We really need universal ad transparency,” she recommended, for other platforms and for Facebook, via a third party or government source. “We really do need enhanced regulation of targeting and disclosure,” she added, pointing to regulations of other mediums, like broadcast television for example, as a model. “And we see many cases where there are good policies in place but they don’t appear to be enforced or they are limiting and don’t prioritize the interests of users in not being subjected to content that promotes false beliefs.”
Later, she added, “Citizen science projects, where users can contribute ads that they’ve observed, are going to be a path forward in the meantime.”
Tiana Epps-Johnson is the founder and executive director of the Center for Technology and Civic Life (CTCL) in Chicago, which made news in September for receiving $350 million in donations from Dr. Priscilla Chan and Mark Zuckerberg to shore up safe and reliable voting in the 2020 elections.
Epps-Johnson was on the front lines of helping local election officials in the midst of chronic underfunding and the threat of a global pandemic.
In response, CTCL developed a 3-part cybersecurity training tailored to those officials, which was purchased by the US Election Assistance Commission for nationwide participants. The group also created a 4-course series of communications workshops (“how to improve your election website” for example). Additionally, CTCL produced a 12-webinar series to provide COVID-19 information to local officials.
Epps-Johnson pointed out the discrepancy between the $400 million appropriation from Congress for the 2020 elections and the $4 billion experts estimated would be needed to ensure a successful pandemic-era election. In response, her group launched the CTCL COVID-19 response grant program. The grants provided things like mail processing equipment, thermometers, drop boxes, ballot retrieval teams, and ADA improvements such as ramps.
“What was expected to be an election catastrophe actually defied expectations,” Epps-Johnson said, calling out the efforts of election officials, activists and people stepping up to be first-time poll workers. “We saw an overwhelmingly smooth process at the polls.”
In closing, Epps-Johnson noted a lack of partisan divide amongst the local election officials CTCL worked with this year.
“I was working with Republicans and Democrats and folks all across the political spectrum. There is power in that. The field of election officials are not divided. They understand that as a whole to run good elections, they need to be funded in a different and more sustainable way.”
UNC Hussman Associate Professor Daniel Kreiss is a principal investigator at UNC-Chapel Hill’s Center for Information, Technology, and Public Life (CITAP). UNC Hussman Ph.D. student and Park Fellow Bridget Barrett is research lead for the digital political advertising project. The duo talked about their work tracking how each of the different platforms manage policies around speech issues, and the democratic trade-offs that these different policies present.
What should platforms be solving for, Kreiss asked, when they are developing policies around defending democratic norms? And what should platforms be held responsible for when we’re talking about evidence of democratic decay?
The degrading of constitutional norms and the weakening of norms around electoral fairness and partisan cooperation are key and compelling threats to democracy, he posited, but platforms are not directly responsible for them. He argued that while social media platforms may not play a primary role, they are indirectly responsible for enabling an erosion of tolerance among political rivals, particularly around ensuring free and fair elections; efforts to undermine the legitimacy of the political opposition; and the growing polarization in America, an “us vs. them” attitude.
Barrett talked through policy changes made by platforms recently and how UNC Hussman researchers think these changes address the platforms’ role in what she called democratic backsliding.
She summarized how platforms expanded election delegitimization policies in 2020: by giving users easy ways to find accurate information, by labeling false or misleading information, and by creating new rules that disallowed delegitimizing the election process.
In discussing platforms’ COVID misinformation policies, Barrett pointed out that research shows democracies suffer when trust in institutions suffers, including trust in public health authorities. She argued that the platforms’ execution on COVID-19 misinformation was so problematic that it arguably made the situation worse.
Regarding platform policies on political ads, Barrett discussed Twitter’s blanket ban on political ads and Facebook’s and Google’s post-election bans. “That’s a policy update that, by striving to remain neutral, solved for very few problems,” she said.
Asked about next steps in light of the 2020 election, Kreiss noted that “It helps to be clear-eyed about what platforms are solving for. And to think clearly about the possible consequences.”
Kreiss directed participants to CITAP’s work online about platforms’ speech policies; a summary and analysis of platforms’ stated political ethics (written with three other faculty from across Carolina); and platforms’ political advertising capabilities and policies.
Alex Stamos and Renee DiResta presented initial findings from the Stanford Internet Observatory’s Election Integrity Partnership on election disinformation trends and content in 2020.
“Our objective is to detect and mitigate attempts to deter voting or delegitimize election results,” said Stamos, noting that their goal was to have real-time impact.
He outlined the partnership’s government, civil society, platform and media partners. He noted how the partnership gathered data in real time across five four-hour shifts in the weeks before and after the election to pull together evidence of electoral procedural interference; participation interference; fraud; and the delegitimization of election results — the latter being the majority of what they saw in their 2020 election data. They identified more than 1,000 disinformation themes, archiving content that was then shared with platforms.
“Often one key influencer would have the power to make [disinformation] go viral,” said DiResta, using Fox News broadcaster Tucker Carlson’s discussion of ‘Color Revolutions’ as an example of how conspiracy theories were spread. She noted that in today’s information ecosystem, it is often hard to separate the two types of actors most actively sharing misinformation about the 2020 elections: influencers and hyper-partisan media.
EIP saw real demand for the disinformation they documented. DiResta observed that once a trusted influencer had shared misinformation, ordinary people acted as something of a digital army, actively participating in spreading it.
“This has implications for moderation and responses,” she noted.
DiResta outlined some of the tactics and strategies seen in 2020, reiterating that in the 2020 election, foreign actors were less effective than “blue check” influencers.
Stamos articulated some tactics that platforms might consider in light of their data. One suggestion was to apply misinformation rules to high-profile actors more, rather than less, aggressively. Across all platforms, he noted, the rules were not fairly applied to people with the largest audiences in 2020. He urged platforms to set fair rules and follow up with a progression of punishments, to motivate users to stop pushing disinformation or to remove them from a trending disinformation campaign.
“This wasn’t an algorithm problem. It wasn’t an artificial amplification problem. It wasn’t a bot problem,” said Stamos. “It’s not a fake account problem.”
“If you have 1 million viewers on your YouTube stream, you are no longer a normal person who has a free expression right,” said Stamos. “You are a cable channel. The companies have to take more editorial responsibility for that.”
The conference closed with a robust panel discussion with participants Nathaniel Gleicher, head of cybersecurity policy at Facebook; Clement Wolf, global public policy lead for information integrity at Google; Yoel Roth, director of site integrity at Twitter; and Eric Han, head of safety at TikTok – well worth a listen at 2:20 into the conference.