
Our Data, Ourselves

How to stop tech firms from monopolizing our personal information.

John Tomac illustration for Foreign Policy

Concentrated in a few hands, big data is a threat to democracy. Social media companies and political data-mining firms such as Cambridge Analytica have built their businesses by manipulating public life using personal data. Their work has helped heighten ethnic tensions, revive nationalism, intensify political conflict, and even produce new political crises in countries around the world — all while weakening public trust in journalism, voting systems, and electoral outcomes.

Such crises are symptoms of a deeper problem: the effective monopoly that a handful of technology firms have gained over a wealth of information relevant to public life. Fixing the situation requires putting the public back in charge of its data.

Democracy has long been predicated on, and reinforced by, social institutions that carefully collect information about public life and collective needs. Today, however, a handful of technology companies have far exceeded the data-gathering capacity of all other kinds of organizations. These private firms have collected and stored detailed data on every user’s attitudes, aspirations, and behaviors, and they use that information to serve their bottom line. Social media platforms are deliberately designed to exploit the common predilection for selective exposure — the tendency to favor information that confirms pre-existing views — to reinforce messaging from advertising clients, lobbyists, political campaign managers, and even foreign governments.

There are two ways to protect democracy from the challenge posed by tech companies’ dominance over socially valuable data. The first option is for governments to regulate content on an unprecedented scale. That would oblige public regulators to either review all social media content to judge its appropriateness or provide clear signals to private firms — whether the social media companies themselves or third parties — to perform such content reviews. But the problem with both scenarios is that they would create massive new censorship mechanisms that would further threaten democratic culture.

Far preferable would be market regulations that guide firms on how and when they can profit from information about individuals. Such regulations would put the public back in charge of a valuable collective resource while still allowing citizens to express themselves individually by deciding what to do with their data. To get there, policymakers should focus on five basic reforms, all of which would put public institutions back into the flow of data now dominated by private firms.

First, governments should require mandatory reporting about the ultimate beneficiaries of data. That means, when queried, technology firms should be required to clearly report to users which advertisers, data miners, and political consultants have made use of information about them. Your Facebook app or your smart refrigerator should be required to reveal, on request, the list of third parties benefiting from the information the device is collecting. The trail of data should be fully, and clearly, mapped out for users so that if a data-mining firm aggregates users’ data and then sells it on to a political party, the users could still identify the ultimate beneficiary.

Second, regulations should require social media platforms to facilitate data donation, empowering users to actively identify the civic groups, political parties, or medical researchers they want to support by sharing their data with them. In freeing data from private actors, governments could create an opportunity for civic expression by allowing citizens to share it with whichever organizations and causes they want to support — not just the ones that can afford to buy it, as is the case today.

The third reform is related to the second: Software and information infrastructure companies should be obliged to tithe for the public good. Ten percent of ads on social media platforms should be reserved for public service announcements, and 10 percent of all user data should flow, in a secure way, to public health researchers, civic groups, professional journalists, educators, and public science agencies. Such a system would allow many kinds of advocacy groups and public agencies, beyond Facebook’s private clients, to use existing data to understand and find solutions for public problems.

Fourth, the nonprofit rule on data needs to be expanded. Most democracies have rules that prevent firms from profiting from the sale of certain kinds of public data. In many U.S. states, for example, data-mining firms can’t profit from the sale of voter registration data, which public agencies collect. This rule needs to be extended to a wider range of socially valuable data now gathered by technology companies, such as places of employment. Such classes of information could then be passed to public agencies, thus creating a broader set of data in the public domain.

Fifth, public agencies should conduct regular audits of social media algorithms and other automated systems that citizens now rely on for information. Technology companies will call these algorithms proprietary, but public agencies currently audit everything from video gambling machines to financial trading algorithms, all in ways that don’t violate intellectual property.

Users should have access to clear explanations of the algorithms that determine what news and advertisements they are exposed to, and those explanations should be confirmed by regular public audits. Moreover, all ads, not just political ones, need to be archived for potential use by public investigators. Audits of today’s technology would also put the designers of new technologies — such as artificial intelligence — on notice that their own algorithms will one day be under scrutiny.

Little of this need be wishful thinking. Restoring public access to social information wouldn’t require legislators to pass a raft of new laws, since most democracies have the public science agencies, libraries, and privacy czars needed to effectively administer large collections of public information. Competition regulators in the European Union and United States may already have the authority to set mandatory guidelines for any technology company with a business model that relies on controlling vast stores of publicly valuable data. Europe’s General Data Protection Regulation, which has boldly asserted an individual right to control data since going into effect in May, is an important start. It is already having a global impact, as many technology firms find it easier to implement a platformwide response than to adjust particular features for users based in Europe.

Tech firms might claim that such demands would infringe on their economic rights as private enterprises. But contrary to such suggestions, it’s entirely fair to regulate the operations (if not the content) of tech firms because the platforms they control have become the fundamental infrastructure for public life. They are a common carrier for our political culture, much the same way the post office, newspaper empires, and television and radio broadcasters conveyed politics in past decades while being regulated to varying degrees.

In democracies, citizens expect media companies, journalists, and civic groups to have some public duties, often enforced through the law. Social media and data-mining firms have evaded those responsibilities until now, hoarding public data with little public oversight. Strengthening democracy will require putting socially valuable data back to work for the public good.

This article originally appeared in the July 2018 issue of Foreign Policy magazine.

Philip N. Howard teaches at Oxford University’s Balliol College and is the director of the Oxford Internet Institute. He writes about information politics and international affairs, and is the author of eight books, including New Media Campaigns and the Managed Citizen and Pax Technica: How the Internet of Things May Set Us Free or Lock Us Up. He has won multiple book awards and was named a Global Thinker by Foreign Policy in 2017. Twitter: @pnhoward
