
Sex, drugs, and self-harm: Where 20 years of child online protection law went wrong

June 13, 2019 at 8:00 a.m. EDT
Many tech companies have sidestepped COPPA, the Children’s Online Privacy Protection Act, a 1998 law that aims to protect children’s privacy online. (Washington Post illustration/iStock/Getty Images)

Two decades after Congress tried to wall off the worst of the Internet in hopes of protecting the privacy and innocence of children, the ramparts lie in ruins.

Sex, drugs, violence, hate speech, conspiracy theories and blunt talk about suicide rarely are more than a few clicks away. Even when children are viewing benign content, they face aggressive forms of data collection that allow tech companies to gather the names, locations and interests of young users.

The federal government’s efforts to thwart these rising threats have been weakened by court rulings, uneven enforcement and the relentless pace of technological change. Surveys show that four out of five American preteens use some form of social media; YouTube is the most popular, but Instagram, Facebook and Snapchat also are widely used — even though all four services officially prohibit users younger than 13.

Other popular online offerings — such as the game Fortnite, which has proved to be so engrossing to preteen boys that parents worry about addiction — maintain they are “not directed” at children. But the services also don’t ask users how old they are. This tactic, lawyers say, helps the companies sidestep the Children’s Online Privacy Protection Act, a 1998 law known by the acronym COPPA that restricts the tracking and targeting of those younger than 13 but requires “actual knowledge” of a user’s age as a key trigger to enforcement.

Consumer and privacy advocates have alleged rampant COPPA violations by leading technology companies, including in a highly detailed 59-page complaint against YouTube last year. Even when federal authorities take action, the penalties typically come many years after the violations began and make little dent in corporate profit margins, the advocates say.

“We’ve got a crisis of enforcement,” said Josh Golin, executive director of the Campaign for a Commercial Free Childhood, an advocacy group based in Boston. “Right now we are incentivizing companies to not know that children are on their sites. . . . They’ve literally been rewarded for pretending that there are no children on their sites.”

As researchers and consumer advocates spotlight the weaknesses of federal protections for children, some members of Congress are pushing to toughen the federal privacy law and to impose legal restrictions on what can be shown to children online. But such efforts are struggling to advance in a Congress consumed by partisan battles.

Safeguards by industry, which for years argued that “self-regulation” was an effective alternative to government’s heavy hand, also have proved weak. Even content that nearly everyone agrees should be off-limits to children, such as pornography and sites celebrating drug, tobacco and alcohol consumption, can be seen by underage users who enter fake birth dates or tap online buttons that allow them to claim to be adults. Rarely do sites or apps employ systems that routinely verify ages.

This leaves parents with few choices for helping kids navigate an online world in which the borders between sites for adults and those for children have all but disappeared. Short of round-the-clock vigilance — in play rooms, on school buses, wherever children gather with their ever-present mobile devices — there are few effective ways to shield them from corporate data collection or from encountering content traditionally kept from young eyes.

“There has been a complete and utter failure to protect children in this entire society by the Washington infrastructure,” said James Steyer, chief executive of Common Sense Media, a San Francisco-based advocacy group pushing for several new measures to make the Internet safer for children. “It’s a disgrace. And the losers have been children, parents and their families.”

Shocking discoveries

One children’s website, roman-numerals.org, featured educational games, cartoon characters in togas and a decidedly adult advertisement along the bottom, a Princeton researcher recently found. In the ad a dark-haired woman in a low-cut dress smiled warmly just above the words “Ashley Madison,” with a link to the online dating service whose slogan is “Life is short. Have an affair.”

Minutes later, on a different children’s math site, another Ashley Madison ad appeared, only this time the woman’s hair was curlier and the dress more revealing.

Delivering the ads on both occasions was Google, the world’s largest digital advertising company, which acknowledged in a statement that the ads violated company policy.

“I was shocked,” said Gunes Acar, the researcher for Princeton’s Center for Information Technology Policy who discovered the ads. “I definitely was not expecting to find that on a child-directed site.”


The shocks kept coming during the several weeks last winter that Acar spent reviewing the ads on children’s websites.

Acar was studying the effectiveness of COPPA, which was once hailed as a landmark in protecting kids online. But recent research by Acar and others has demonstrated significant limits in the reach and enforcement of COPPA, suggesting that the law has been overrun by the very industry it was supposed to regulate.

In addition to the Ashley Madison ads, Acar’s survey of websites labeled as “child-directed” found a Google ad for a dating service featuring Qatari women and another touting pictures of “Hot Survivor Contestants.” Some ads served by Google offered downloads that included malicious software. Another Google ad caused his computer to emit a high-pitched alarm as a robotic voice announced an “Important security message” and urged him to call a number for tech support — all signs of a likely online scam.

All of these ads complied with COPPA, meaning they didn’t track or target children. But the law also had another apparent effect, one not intended by its creators: By barring personalized advertising, COPPA can prompt advertising companies to deliver a hodgepodge of untargeted ads on children’s sites, resulting in a mix that can be curiously adult in nature.

Acar’s survey involved repeatedly visiting children’s websites while he was not signed into any Google service, so that he could see what advertising appeared. He also collected Google’s explanations of why it displayed certain ads, to make sure that the factors weren’t particular to his browsing history or anything else that might indicate an adult user. Google’s explanations of the Ashley Madison ads, for example, indicated that they were not personalized to Acar but were displayed for general reasons, such as his location, in Princeton, N.J., and the time of day.
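Acar’s setup can be approximated with a few lines of browser automation. The sketch below illustrates only the general technique (loading a child-directed page in a fresh, signed-out browser profile and recording which third-party ad frames are served), not his actual tooling; the Playwright library, the example URL and the shortlist of ad-serving hosts are assumptions made for illustration.

```python
# Illustrative sketch, not Acar's actual tools: visit a child-directed page in a
# clean, signed-out browser profile and list the third-party ad frames it loads.
# The URL and the AD_HOSTS shortlist below are assumptions for demonstration.
from playwright.sync_api import sync_playwright

AD_HOSTS = ("doubleclick.net", "googlesyndication.com")  # illustrative shortlist


def list_ad_frames(url: str) -> list[str]:
    with sync_playwright() as p:
        browser = p.chromium.launch()
        context = browser.new_context()  # fresh profile: no cookies, no sign-in
        page = context.new_page()
        page.goto(url, wait_until="networkidle")
        frames = [f.url for f in page.frames if any(h in f.url for h in AD_HOSTS)]
        browser.close()
    return frames


if __name__ == "__main__":
    # Hypothetical child-directed site used as a stand-in.
    for frame_url in list_ad_frames("https://kids-math-example.test"):
        print("ad frame:", frame_url)
```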

Google said that it has policies against ads delivering malicious software and that it does not allow adult advertising on children’s sites.

“Our policies prohibit serving personalized ads to children. We also restrict the kind of ad content that is served on primarily child-directed sites and in this case, our systems didn’t work as intended and we have removed those ads,” Google spokeswoman Shannon Newberry said.

John Barth, director of digital marketing for IXL Learning, which owns the children’s math sites that showed the Ashley Madison ads, wrote in an email that the sites are flagged to Google as “child-directed” and run ads that comply with COPPA. When provided with images of the ads that Acar found, Barth said, “This is concerning. . . . I plan to investigate this issue.” The sites are no longer online.

Ashley Madison, meanwhile, expressed frustration that some of its ads had reached audiences unlikely to be in the market for its core service of helping people find sexual relationships outside their marriages.

“Unless there’s been a sudden surge of affairs at recess, this is not in our interest,” said Paul Keable, chief strategy officer for Ashley Madison’s parent company, ruby Inc.

Prohibitions have little effect

COPPA passed with bipartisan support in 1998 before much of today’s digital ecosystem existed. Google had just been founded, but many staples of contemporary online life — the iPhone, YouTube, Facebook — had not yet been invented.

The law sought to protect children from websites that targeted them, gathered their personal data and delivered advertising based on that information. This became illegal under COPPA, unless parents specifically permitted it. Few did.

But the legislation’s sponsors, who negotiated against powerful industry interests while seeking support in Congress, agreed to a key loophole: So long as online sites didn’t explicitly target children and didn’t have “actual knowledge” that a particular user was younger than 13, COPPA’s restrictions didn’t apply.

“We saw COPPA as a first step,” said Kathryn Montgomery, a retired American University professor who lobbied for the legislation in 1998 and now is research director for the Center for Digital Democracy. “But it’s very alarming to see that this industry, particularly those who are targeting children and making a lot of money on children, is not taking into consideration the welfare of children.”

A similarly named law, the Child Online Protection Act, also passed in 1998, backed mainly by political conservatives as part of a broader campaign against pornography. The measure sought to keep children from seeing content deemed “harmful to minors” but immediately came under successful legal attack from civil liberties groups, which argued that it infringed on constitutionally protected speech. The law never took effect.

That left COPPA, with its focus on privacy alone, as the primary federal statute seeking to protect children online.

The early years of COPPA enforcement actions by the Federal Trade Commission focused on companies that knew children were using their services, often because the sites asked users for their ages. Ohio Art Co., the maker of Etch a Sketch, for example, reached a $35,000 settlement with the FTC in 2002 over allegations that its website was collecting personal data on children, including their birth dates. Sony BMG Music settled a $1 million case with the FTC in 2008 for allegedly collecting data, including birth dates, on young visitors to the fan sites of pop artists.

Lawyers for technology companies began warning their clients to avoid requesting such information and, instead, to write terms of service prohibiting users under the age of 13, legal experts say. Some of the online services that surveys show are most popular among younger children, including YouTube, Instagram and Snapchat, officially bar them from their platforms.

Children’s advocates say such prohibitions, often explained in fine print rarely read by young users or their parents, do little to keep kids off these services, an assertion backed by survey data.

Common Sense, the advocacy group led by Steyer, found in 2017 that 83 percent of children ages 10 to 12 use at least one social media site. Their favorites were YouTube, Facebook, Instagram and Snapchat — all of which have terms of service barring users that young. A poll last year by the Pew Research Center echoed the finding, showing that even among kids 11 or younger, 81 percent had watched YouTube at least once and 34 percent did so regularly, according to their parents.

Many of the most popular channels on YouTube, as ranked by research firm Social Blade, appear to be directed toward kids, by featuring nursery rhymes, simple cartoons or videos of children playing games or unwrapping packages with toys inside. Such issues, which have been evident for several years on YouTube, are at the heart of last year’s COPPA complaint to the FTC.

Enforcement is slow, inconsistent

Epic Games, maker of the popular online game Fortnite, has taken a legal position similar to that of the social media sites. It does not specifically prohibit users younger than 13 but says that it does “not direct” its games toward children or “intentionally collect personal information from children,” according to the company’s privacy policy. Many parents, however, say that children in elementary and middle schools spend hours a day playing the game, which collects a range of personal data from its users.

“COPPA doesn’t work,” said Marc Groman, a former White House and Federal Trade Commission lawyer who explored the law’s many problems in a podcast he co-hosts, called “Their Own Devices,” on children and technology. “The fact is it doesn’t protect children under 13 in many circumstances.”

Epic and Snapchat declined to comment. Newberry, the Google spokeswoman, said of YouTube, which is a subsidiary of Google, “Parents want their children to be safe online and we’re committed to providing tools and safeguards to help them.”

Instagram spokeswoman Stephanie Otway said: “People under the age of 13 are not allowed to use Instagram. When we find an underage account, we will restrict access to that account and ask the account holder to prove their age. If they are unable to do so, we will delete the account from Instagram.”

Instagram declined to say how many accounts it has deleted for this reason.

Even when COPPA does apply, enforcement of the law often is slow and inconsistent, consumer advocates say.

The FTC in February fined the maker of the popular lip-syncing app TikTok $5.7 million, a record for COPPA violations, settling allegations that the app had collected personal data on children illegally since 2014 under a previous name, Musical.ly. But critics of COPPA enforcement said the agency moved much too slowly to discourage other companies from doing the same. TikTok’s Chinese parent company, ByteDance, which acquired Musical.ly in 2017, is now one of the richest start-ups in the world.

The FTC, which enforces COPPA, defended its record and said the law has been successful in protecting children from collection of their personal data and targeted advertising. The agency has settled allegations of COPPA violations in 30 cases and used its rulemaking power to update enforcement of the law in 2013 by expanding the definition of personal data to include videos, photos or voice clips.

“We think that the statute is effective, and we think that our enforcement efforts are effective,” said Andrew Smith, head of the FTC’s Bureau of Consumer Protection.

A push to strengthen

Sen. Edward J. Markey (D-Mass.), one of the original sponsors of COPPA, has proposed a bill this year that would strengthen it. His COPPA update, co-sponsored by Sen. Josh Hawley (R-Mo.), would set a broader standard requiring compliance if there is substantial evidence that children are using a website or app. The bill also would bring the United States closer to the children’s privacy standards in the European Union by raising the age of those covered by COPPA to include anyone younger than 16. European law prohibits collecting personal data from children younger than 16 in most cases.

“We believe that parents in the United States want the same protection for their children as Europeans want for their children,” Markey said.

He also is writing a bill that would implement new standards for children’s online content, echoing previous generations’ rules for kids’ shows on broadcast television. Markey said the portability of mobile devices makes it harder than ever for parents to monitor what their children are watching — or receiving through advertising.

Although Markey predicted bipartisan support for his bills, they already are generating resistance from some in the technology industry, whose lobbying corps is among the largest and best funded in Washington.

Markey’s legislative push, they warn, will inhibit innovation and push developers toward offering fewer free services online, limiting access to poorer families.

“I think it will push us further in this direction of limiting the availability of services and apps [for] kids,” said Daniel Castro, the vice president at the Information Technology & Innovation Foundation, whose board includes Apple and Microsoft executives. “It’s going to raise the cost of compliance. It’s going to make it so you have more paid apps.”


‘The violations are rampant’

When researchers from the University of California at Berkeley tested 5,855 apps marketed to children and their parents on the “Designed for Families” portion of Google’s Play store, they found that 57 percent showed signs that they may be violating COPPA, including its limits on collecting the personal data of young users. The researchers published their findings in a peer-reviewed journal last year and furnished the list of apps to Google, hoping it would address the claims.

A year later, many of the identified apps still are available on the Play store in the “Designed for Families” section, the researchers say. New Mexico Attorney General Hector Balderas (D) has sued Google in federal court for alleged COPPA violations, including for not addressing problems the researchers found.

Google also was among the companies whose online trackers were probably collecting children’s data. Independent developers installed trackers from Google and other advertising companies into their apps, allowing those companies to collect such data as users’ locations and interests, based on what apps they used or what websites they visited.

This helped app makers understand their audiences better while also providing the data necessary to attract more lucrative targeted ads. But on children’s apps, this kind of tracking probably violated COPPA, said Serge Egelman, one of the researchers and director of the Berkeley Laboratory for Usable and Experimental Security.
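The kind of check such an audit performs can be sketched simply, though the Berkeley study itself relied on instrumenting apps and observing their real network traffic. The sketch below assumes a plain-text log of an app’s outgoing request URLs and a hand-picked list of advertising and analytics hosts; both are illustrative assumptions, not part of the researchers’ method.

```python
# Illustrative sketch only: flag requests in a captured network log that go to
# known advertising or analytics hosts. The one-URL-per-line log format and the
# TRACKER_HOSTS list are assumptions, not the Berkeley researchers' actual setup.
from urllib.parse import urlparse

TRACKER_HOSTS = {"doubleclick.net", "googleadservices.com", "graph.facebook.com"}


def flag_tracker_requests(log_path: str) -> list[str]:
    flagged = []
    with open(log_path) as log:
        for line in log:
            host = urlparse(line.strip()).hostname or ""
            if any(host == t or host.endswith("." + t) for t in TRACKER_HOSTS):
                flagged.append(line.strip())
    return flagged
```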

“The platforms have an incentive to not investigate,” Egelman said. “The violations are rampant.”


Google has argued in its response to the New Mexico attorney general’s lawsuit that “the app developer bears sole responsibility for ensuring COPPA compliance.” But the FTC also has broad authority to investigate and institute enforcement actions when companies engage in “deceptive practices.”

Angela J. Campbell, a law professor at Georgetown University who represents a coalition of consumer advocates and privacy groups that filed an FTC complaint in December against Google for alleged COPPA violations, argues that the company should not characterize apps as family friendly when company executives have been alerted to apparent privacy intrusions.

“They are deceiving the public in presenting these apps as being appropriate for children,” Campbell said.

Newberry, the Google spokeswoman, said, “Parents want their children to be safe online and we work hard to protect them.” She added that the “Designed for Families” section of the Play store requires developers to have privacy policies and to adhere to privacy laws, including COPPA. The Play store added rules last month requiring app developers to declare the ages of their intended audiences and to ensure that apps geared toward adults aren’t designed in ways that “unintentionally” appeal to children.

Newberry did not comment on Egelman’s research, on the complaint to the FTC alleging deceptive practices or on why apps Egelman had identified as probably violating COPPA remained in the Play store’s “Designed for Families” section.

Facebook also collected data from hundreds of apps marketed to children and their parents, the researchers found. When they alerted the company that app makers appeared to be using Facebook tracking technology to collect information on children, Facebook contacted the app makers about the apparent COPPA violations.

But in dozens of cases, according to Egelman, apps that appear to be aimed at a children’s audience still are sending personal data to Facebook. That includes games that have cartoonish graphics and, in some cases, the word “kids” in titles or descriptions.

Facebook said it has limited power to evaluate potential COPPA violations when other companies use its tracking technology.

"We require all developers engaging with our platform to be COPPA compliant. We followed up on the researcher’s report and suspended all apps that did not guarantee their compliance. In order to protect a competitive developer ecosystem, the law does not give large companies the authority to determine COPPA compliance,” said Antigone Davis, Facebook’s head of global safety.

The state of COPPA enforcement is such that even the federal government has struggled to consistently comply. Researchers from the Danish privacy compliance company Cookiebot discovered recently that a site for the U.S. Government Publishing Office aimed at children as young as 4 appears to have violated the law’s prohibition on tracking children.

The site — called Ben’s Guide and featuring a cartoon image of Founding Father Benjamin Franklin — has games and lessons about U.S. history and government explicitly aimed at schoolchildren. It also had a Google data tracker on it that collected information about what devices children were using and what other websites they visited, Cookiebot found.

The Government Publishing Office placed the Google tracker on the site but did not collect the resulting personal data; the data went to Google.
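A rough version of this kind of check can be sketched by fetching a page and listing the third-party analytics scripts it references. This is not Cookiebot’s scanner; the host list, the regex-based parsing and the example address are simplifications assumed for illustration.

```python
# Illustrative sketch, not Cookiebot's scanner: fetch a page and list <script>
# tags that load code from third-party analytics or advertising hosts.
# ANALYTICS_HOSTS and the example URL are assumptions for demonstration.
import re

import requests

ANALYTICS_HOSTS = ("google-analytics.com", "googletagmanager.com", "doubleclick.net")


def find_tracking_scripts(url: str) -> list[str]:
    html = requests.get(url, timeout=10).text
    srcs = re.findall(r'<script[^>]+src=["\']([^"\']+)', html, flags=re.IGNORECASE)
    return [s for s in srcs if any(h in s for h in ANALYTICS_HOSTS)]


if __name__ == "__main__":
    for src in find_tracking_scripts("https://example-kids-gov-site.test"):
        print("third-party script:", src)
```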

“We do our best to ensure compliance with COPPA and do periodic reviews of the software,” said Gary Somerset, a spokesman for the Government Publishing Office. “If there are any system modifications needed to improve upon security or privacy, our technical personnel will correct.”

Google initially declined to comment. But after this story appeared online, Newberry, the company spokeswoman, said of the tracker on the Ben’s Guide site, “This is a service that’s set up by a webmaster or publisher, and they must take steps to deploy the tag on each page they want to measure.”

Tony Romm, Alice Crites, Julie Tate and Emily Guskin contributed to this report.
