A 13-year-old isn’t legally old enough to drive, smoke, open a checking account or walk unaccompanied into an R-rated movie. A 13-year-old brain is at least a decade away from fully developing, yet the law says it’s a fine age to establish a vibrant online life, complete with personal data collection, exposure to explicit content and a frighteningly high chance of stumbling across sexual predators.
The law is the Children’s Online Privacy Protection Act of 1998, and in the words of Jim Steyer, founder of Common Sense Media, it’s “hopelessly outdated.” Congress needs to fix it.
The legislation was born in the infancy of the online world, drafted by lawmakers who saw the need to protect minors on the new digital frontier. COPPA, as the acronym goes, ensures that personal data — including names, ages, preferences and addresses — for children 12 years and younger is protected. For online companies, simply restricting their services to those 13 and up was easier than navigating parental permissions for younger children.
It doesn’t take a lawyer to spot the loophole. Scheming preteens or parents who don’t care for arbitrary age limits just input a false birthday while signing up for online services, and off they go. Surveys show at least half of youths younger than 13 use social media in some form, and the average age for signing up for a social media account is 12.6. “It’s led to millions of kids lying about their age online,” Steyer says.
The effect is injurious. Thirteen was never intended to be a de facto age of consent for the digital world. But, as the Wall Street Journal notes, parents have now raised a generation thinking the COPPA age limit functions in the same way as a movie rating — some apps and websites may have more mature content, they say, therefore 13 is an appropriate age to access them.
But that’s not true. Reddit, YouTube and Instagram open worlds of content unmoored from age restrictions. Snapchat is on the National Center on Sexual Exploitation’s Dirty Dozen watch list for its sometimes salacious “Discover” page. The Deseret News has highlighted incidents of sextortion where anonymous predators find minors on apps and threaten them with humiliation in exchange for cash or explicit photos. Some cases end in suicide.
And even if children play it safe, it’s hard to avoid adult-themed ads popping up where they shouldn’t. The Washington Post reports ads for Ashley Madison, an online dating site for adults seeking an affair, appeared on a string of children’s websites. Google, the ads’ distributor, acknowledged they violated the company’s policies.
A comprehensive solution would involve several parts. First, parents need more transparency about the apps their children access. In a Senate Judiciary Committee hearing on Tuesday, Utah Sen. Mike Lee agreed with media watchdogs that app descriptions are too generic and parental controls are too convoluted. A spokesman for The Church of Jesus Christ of Latter-day Saints, which owns this paper, also supported the idea.
But it would be foolish to rely only on app developers and app stores to communicate hazards for youths. Congress needs to fill the gaps.
Lawmakers should raise the COPPA age limit to something more reflective of society’s treatment of teenagers, say 16 — the age we entrust teens with a vehicle. It would be hopeless to expect 14- or 15-year-olds to suddenly quit social media, but it would update the framework under which noncompliant companies could be punished. Additionally, lawmakers should close loopholes allowing companies to collect data on minors who use a false date of birth.
The judicial system, too, needs to be more consistent in applying the law and challenging violators.
And ultimately, it’s parents who must bear the burden of monitoring their children and making prudent decisions about access to digital content.
Safeguarding children from unintended consequences has been a consistent focus of these pages, and the internet is no exception. Kids aren’t ready to be adults, and they shouldn’t have to navigate the online world as if they were.