Regulating minors’ access to social‑media platforms has become one of the most active—and contested—areas of technology policy in the United States. Legislators see rising youth screen‑time, sextortion schemes, and algorithm‑driven “infinite scroll” feeds as threats to children’s well‑being, while parents struggle to monitor online activity that increasingly shapes their kids’ social lives. The accompanying video features Professors Leah Plunkett of Harvard Law School and Naomi Cahn of the University of Virginia School of Law, who outline where the law now stands and where it may be headed.
States are functioning as laboratories. Florida and Utah have enacted the most sweeping measures: complete bans on accounts for children under 14 and parental‑consent requirements for 14‑ and 15‑year‑olds, coupled with mandatory deletion of existing underage accounts. Mississippi's similar statute survived an initial injunction challenge in April 2025. Other jurisdictions pursue narrower interventions: New York's "SAFE for Kids Act" would bar so‑called "addictive feeds" for minors absent verified parental approval, while Colorado requires pop‑up prompts after an hour of continuous use and promotes digital‑literacy instruction in public schools.
Most of these laws turn on age verification and parental permission, but the mechanics vary. Indiana authorizes third‑party credential checks (e.g., driver's‑license scans) to gate "adult‑oriented" sites, whereas Florida relies on private enforcement and a new civil remedy for minors. Design‑based rules are appearing as well: limits on late‑night push notifications, default disabling of autoplay, and algorithmic controls intended to keep self‑harm or bullying content out of personalized feeds. Each approach must reconcile three sets of interests: children's First Amendment rights, parents' constitutional authority to direct their children's upbringing, and the public interest in child safety.
That balance is already being tested in court. The Supreme Court's Reno v. ACLU decision (1997) struck down parts of the Communications Decency Act for burdening adult speech online; today's challenges make similar overbreadth arguments. Because any speech restriction aimed at minors can incidentally sweep in adults, courts apply strict scrutiny, asking whether a measure serves a compelling government interest and is narrowly tailored to it. Age‑gating schemes also raise privacy concerns: a requirement to upload a government ID at every log‑in could chill lawful adult activity.
At the federal level, the Children's Online Privacy Protection Act (COPPA), enacted in 1998 and effective in 2000, remains the only comprehensive statute. COPPA bars operators from collecting personal information from children under 13 without "verifiable parental consent," but it covers neither teens nor passively gathered behavioral data, and it lets platforms plead ignorance of a user's age. The FTC issued a long‑awaited rule update in the spring of 2025, but Congress has yet to pass broader proposals such as the Kids Online Safety Act (KOSA) or the Protecting Kids on Social Media Act. Several state bills explicitly cite the "federal vacuum" as a reason to act.
Legislatures are also addressing children as content creators. Illinois, Minnesota, Utah, and California have adopted "digital Coogan" statutes that require a percentage of influencer income to be set aside in blocked trust accounts and, in some states, give minors a right to have videos featuring them deleted when they reach adulthood. A Uniform Law Commission drafting committee, on which Professor Plunkett serves as reporter, is preparing a model act to harmonize financial and privacy protections for child digital entertainers nationwide.
For now, the regulatory landscape remains fluid. More lawsuits are certain, and the Supreme Court is likely to be asked whether these laws impermissibly burden speech or whether child‑protection goals justify narrowly drawn restraints. At the same time, some platforms are voluntarily introducing stricter teen privacy settings, bedtime pauses on notifications, and easier‑to‑use parental dashboards, a sign that business practices may evolve alongside, or in place of, formal mandates. Whatever the eventual shape of federal or state rules, the debate has shifted: protecting children online is no longer treated as a niche concern but as a mainstream legal issue that touches constitutional doctrine, family autonomy, and the design choices of the world's largest technology firms.