By Dina Alexander, MS
Over the past 15 years, child-safety experts and advocates have urged big tech companies to build safer platforms and design apps with kids in mind from the start. We’ve asked for accurate age ratings, meaningful warnings, stronger parental oversight, and real accountability. Yet despite polished safety messaging, children still encounter porn, self-harm content, predatory chats, and manipulative in-app purchases across games, video platforms, and chat apps.
Some lawmakers have tried to create “social media bills” or “sensitive content bills” to protect kids. While these efforts are often well-intentioned, social media companies have some of the most powerful legal teams in the world and frequently challenge such laws under the First Amendment. In many cases, courts side with the platforms, leaving states with limited tools to address the real harms children experience online.
Even before they are passed, these platform-by-platform laws face massive opposition from Big Tech trade groups and lobbyists, and they rarely address the broader concerns parents and teachers actually have.
But parents are now fed up: tired of waiting, and tired of hearing false promises, manipulations, and glossy public-relations campaigns labeled “child safety.” Many of us are genuinely worried about what this digital environment means for our children’s future.
Out of that desperation and frustration, parents are demanding a stronger, broader, and more common-sense approach to protecting our children’s mental health, one that puts children’s wellbeing ahead of corporate interests and finally gives families the transparency and protections they deserve.
After years working in child online-safety advocacy, I’ve come to see that the real leverage point is the app stores themselves: the gatekeepers that decide what reaches our children’s devices (Sekulow & Parsons, 2026).
That’s why the App Store Accountability Act matters.
What parents are up against
The gatekeepers set the rules. Apple and Google control which apps appear, how they’re rated, and how purchases work. Because they profit from downloads and in-app spending, teens often gain access to apps with ratings that don’t reflect real risks.
Ratings and warnings are inconsistent. Many apps labeled for 13+ still include mature content or risky features, and ratings can change without parents ever being notified.
Controls are hard to manage. Tools like Screen Time or “Ask to Buy” only work if every setting is perfect, and kids quickly learn workarounds. Setup can take 20 or more steps, and even then many parents report the controls still aren’t configured correctly.
And the risks go far beyond social media. Games, livestreaming platforms, chat communities, and even productivity apps can expose kids to strangers or harmful content.
Why app-store accountability works
Regulating one app at a time is like chasing shadows. App stores are the front door to everything. When safety rules apply at the store level, every app has to meet a higher standard before it reaches families.
What the App Store Accountability Act typically requires
Independent, consistent age ratings: Clear, audited age ratings and content descriptions with visible risk labels. Parents are notified when an app’s rating, data practices, or core features change.
Default parental oversight for minors: Verified parent approval before minors download apps or make purchases, with simple tools to manage approvals across devices.
Protection from manipulative design: Limits on dark patterns, confusing subscription flows, and high-pressure monetization aimed at kids, plus clear refund rights for unauthorized purchases.
Developer accountability: Verified developer identities, clear contact points, and escalating consequences for repeat violations involving minors.
Why “social-media-only” bills fall short
Laws targeting individual platforms face First Amendment challenges and intense lobbying, and even when they survive, they cover only a slice of the apps children actually use. Games, livestreaming services, and chat communities slip through entirely. Store-level rules apply one consistent standard to everything a child can download.
Will this break innovation?
No. Safety baselines don’t stop creativity. They simply require honest labels, parental consent for minors, and transparent purchase systems (standards responsible developers already follow).
Consistent enforcement actually helps high-quality apps compete fairly.
What parents and child safety advocates can do now
Ask your legislators to support the App Store Accountability Act in your state or at the federal level. Share real experiences with misleading ratings or surprise subscriptions. And keep using practical tools at home, but remind policymakers that families shouldn’t have to do this alone.
Protecting kids is best done at the gatekeeper level. When app stores are accountable, every app a child can download must meet higher, audited standards. That’s a smarter and more durable path forward, and it finally puts children’s wellbeing ahead of app-store profits.
References:
McKay, M. (2026, January 8). Grok and the gatekeepers who looked away. Digital Childhood Alliance. https://www.digitalchildhoodalliance.org
Sekulow, J., & Parsons, D. (2026, March 2). The gateway to tech is the app store – that’s where reform must begin. American Center for Law & Justice. https://aclj.org/free-speech/the-gateway-to-tech-is-the-app-store-thats-where-reform-must-begin
