Arguments

Saving Democracy in the Digital Age

Big tech poses an existential threat to our democracy; the era of self-regulation must end.

By Clara Hendrickson

Tagged: Democracy, Facebook, Russia, technology, Trump Administration

The political consensus that has long shielded big tech from regulatory scrutiny is beginning to erode. A bombshell New York Times investigation recently revealed that Facebook failed to disclose Russian interference on its platform when it became aware of such activity. The investigation also found that, in an effort to protect itself from public scrutiny, Facebook hired a Republican-linked opposition research firm to discredit the social media giant’s critics and competitors. These revelations have led Democrats to call for a Justice Department investigation and bolstered calls for new regulations to rein in the power of big tech. Republicans, for their part, have long worried about an anti-conservative bias in Silicon Valley, and just a couple of months ago the Department of Justice considered opening a probe into whether tech platforms have violated consumer protection or antitrust laws. The era of self-regulation seems to be coming to an end for big tech. Legislative action in the next Congress could include greater transparency requirements for online political advertisements and new user data protection measures.

But a number of challenges that cannot easily be met with regulatory action or enforcement (should lawmakers eventually pursue it) will continue to plague American democracy in the digital age. Hacking attempts by foreign actors show no signs of abating, raising questions about the use of hacked material in political campaigns. Fake news continues to find its way onto social media platforms, and the technology to crack down on disinformation lags behind what one might hope. Confronting these challenges will require creative problem-solving.

Hacking attempts targeted campaigns ahead of the midterm elections, including Missouri Senator Claire McCaskill’s reelection bid, yet few campaigns have invested since the 2016 election in cybersecurity safeguards such as email security and website intrusion monitoring. Perhaps more troubling, while the Democratic Congressional Campaign Committee (DCCC) has vowed not to use stolen or hacked material in its campaigns, its Republican counterpart, the National Republican Congressional Committee (NRCC), has refused to do the same. That refusal doesn’t mean individual Republicans haven’t made the commitment on their own. One Republican political strategist recalls deciding not to use hacked material to help his candidate win: “When news broke that this material had likely been stolen by a foreign actor, we immediately said, ‘We’re not going to use it.’” Unfortunately, such cases have been few and far between.

Unless the political incentives confronting candidates on the campaign trail are disrupted, candidates will be tempted to use hacked material against their opponents, whatever the cost to democracy. The DCCC’s cyber pledge committing the party to refrain from using stolen or hacked material in campaigns is a step in the right direction, but it is weakened by the fact that the agreement does not bind both parties. And as The Atlantic’s Natasha Bertrand reports, “experts fear that the continued use of hacked documents by campaigns only encourages cybercriminals to keep meddling in U.S. elections.” Circulating hacked material as part of a campaign strategy not only leaves the country vulnerable to foreign interference; it also encourages voters’ own bad habits. If current and aspiring politicians don’t care about the source of the information they disseminate, why should American voters care about vetting the truthfulness of the news stories they come across online that confirm their strongly held worldviews and disparage their political opponents?

Meanwhile, it is also unclear whether such a pledge could ever be codified into campaign law. An attempt to outlaw the use of hacked information in campaigns could be vulnerable to free speech arguments like the one Donald Trump’s campaign lawyers are currently making in a lawsuit brought by DNC staffers hacked during the 2016 election.

Counteracting fake news, like counteracting the threat of foreign interference, will require new policies fit for democracy in the twenty-first century, not just technological advances, which are in any case not nearly as sophisticated as some might expect. Speaking before Congress earlier this year, Mark Zuckerberg discussed plans to use artificial intelligence to detect fake news, but as Facebook’s own data scientist has admitted, “A.I. cannot fundamentally tell what’s true or false—this is a skill much better suited to humans.” Improving automated fake-news detection requires “a fundamentally new A.I. paradigm,” psychology and neural science professor Gary Marcus and computer science professor Ernest Davis recently wrote in The New York Times. A.I. currently relies on keyword associations to detect fake news, and while computer programmers may one day develop an A.I. that can unpack the key concepts and ideas contained in news stories, such an advance remains a long way off.

So, while the algorithmic detection of fake news remains underdeveloped, social media sites will have to rely on more humans vetting content on their platforms—a responsibility Facebook has, until now, mostly outsourced to its users. But something else can help over the long run: mandating a digital literacy curriculum as an essential part of a much-needed civic education. Sam Wineburg, the author of a 2016 Stanford report on youth digital literacy, argues that students should be taught, starting in middle school, basic fact-checking skills for evaluating websites and news stories, “the equivalent of looking in the rearview mirror when switching lanes on the highway.” Some states are beginning to consider the idea, and while a few have provided resources for such an education, fewer still have taken steps to truly incorporate media literacy into their curricula. California State Senator Bill Dodd, for example, recently introduced a bill that would have mandated media literacy education as part of the state’s curriculum, but it failed to pass the legislature last year. Instead, the state passed a law that will make media literacy resources available to teachers online, along with teacher development opportunities. Without a mandate to actually teach digital literacy, however, those opportunities may go underutilized. It’s a step in the right direction, but it’s still not enough if we are to truly confront one of the gravest challenges democracy faces in the digital age.

As we’ve seen, the past two years have put into clear view the need for regulatory action against big tech. In addition to reining in the power of today’s dominant platforms, the tech age also calls for leaders committed to protecting our democracy from foreign interference, even when hacked information presents a tempting opportunity to quash a political opponent. Similarly, American citizens should be equipped, through a true civics education, with the skills needed to confirm the veracity of the information they come across, and with the knowledge that discourse in a democracy requires preserving a collective commitment to truth. Ultimately, though, we must keep in mind the limitations of these tactics, as the persuasive power of today’s digital technologies originates in our own appetite for political extremism. As Zeynep Tufekci writes, “Our own domestic political polarization feeds into the basic business model of companies like Facebook and YouTube.” While policy alone cannot eliminate the internal divisions that exacerbate the threat digital technologies pose to democracy, regulatory action can disrupt the market incentive to divide us further and help counter the culture that has made promoting such polarization so profitable.

Clara Hendrickson is a Research Analyst in Governance Studies at the Brookings Institution. The opinions expressed in this piece are her own.