
A bipartisan Senate bill would require pop-up mental health warnings every time a minor opens TikTok, Instagram, or YouTube. Minnesota's version kicks in July 1.
The Stop the Scroll Act, sponsored by US Senators Katie Britt and John Fetterman, passed the Senate Commerce Committee on April 14. If it becomes law, social media platforms would have to display a pop-up warning each time a user under 18 opens the app, and the user would have to acknowledge the warning before continuing to scroll. The Surgeon General would determine the exact wording, and the FTC would handle enforcement.
The idea is not new, but it has fresh momentum. Minnesota already passed a similar law that takes effect on July 1. New York enacted its own version in December, specifically targeting features like infinite scroll, autoplay, and algorithmic feeds on platforms like TikTok, Instagram, YouTube, and X (Twitter). Ohio, Colorado, and California have their own versions too.
How the labels would work
The federal bill would require a warning label designed by the Surgeon General to appear in a pop-up format every time a minor accesses a social media platform. It would include links to mental health resources, including the 988 Suicide and Crisis Lifeline. Platforms would not be allowed to hide the warning in their terms of service or let users permanently dismiss it.
Minnesota's version is slightly different. The warning shows up for all users, not just minors, and it stays visible until you either exit the app or acknowledge it and choose to proceed.
New York's law is more specific about what triggers the label. It targets platforms that offer addictive feeds, autoplay, infinite scroll, or like counts. A platform that buries or obscures the warning faces a penalty of up to $5,000.
Why this is happening now
The numbers are hard to ignore. According to the U.S. Surgeon General's 2023 advisory on social media and youth mental health, about 95% of teens aged 13 to 17 use social media. The Surgeon General's advisory also found that teens spending more than three hours a day on social media are twice as likely to experience symptoms of depression and anxiety. And a 2025 Pew survey found that 45% of teenagers themselves now say they spend too much time on these platforms.
But will warning labels actually help? I think the intent behind these bills is decent enough. But I’m not sure a pop-up warning is going to keep a 14-year-old from opening TikTok. We’ve had warnings on cigarette packs for decades, and while they have helped with awareness, it took actual restrictions on purchasing and advertising to move the needle.
I don’t love the idea of outright bans or censorship when it comes to the internet. But when you look at the data, unfettered access to social media is clearly not healthy for teenagers. At some point, something like age-gated restrictions or time limits baked into the platforms themselves might do more than a label that kids will just tap through without reading. Australia is already trying an under-16 ban. I think the US will end up somewhere in the middle, though what that middle ground looks like remains to be seen.
Read More: Deepfake scams are flooding social media. This tool catches them.
[Image credits: Bastian Riccardi/Pexels]