Age Verification for What?
Surface-level similarities between age verification laws can conceal deeper differences.
I’m not the first to say it, but “the think tank ecosystem can be an echo chamber.” Depending on which echo chamber you prefer, you can easily find think-tank experts who reflexively support—or reflexively oppose—age verification. I don’t fit into either group; then again, I’m an engineer, not a think-tank expert. I supported age verification for porn sites, but I also spotted a critical flaw in age verification laws for social media.
When the Free Speech Coalition sued to block Texas’s age verification law for porn sites, the Supreme Court upheld that law in Free Speech Coalition v. Paxton (2025). When NetChoice sued to block Mississippi’s age verification law for social media, though, Justice Kavanaugh—who was part of the majority for FSC—wrote a brief opinion suggesting that this law was unconstitutional.
Now, states have passed age verification laws for app stores: the App Store Accountability Act. And more court battles are inevitable. Who will win this time? I’ve read quite a few polished policy pieces that feel more like a pep rally for one side; a mock battle is more my style. In a court battle, my role would be a technical expert, so I’ve written a mock expert declaration in opposition to this law.1
1. When NetChoice sued to block Mississippi’s age verification law for social media, the Fifth Circuit remarked, “This case continues our struggle with the interface of law and the rapidly changing universe of technology.”
2. In this case, comparisons to Free Speech Coalition v. Paxton (2025) are inevitable. Such comparisons, though, again bring this Court back to “the interface of law and the rapidly changing universe of technology.” On the technical side, what similarities—or differences—would be found in the factual records for both cases?
3. On the surface, the factual records seem similar. Age verification technology today would be similar to (if not better than) age verification technology at the time of FSC. The factual record is incomplete, however, unless it also considers the requirements of the age verification law.
4. Compare two software features: finding nearby restaurants and finding nearby friends. They seem similar on the surface, but restaurants rarely change their location, while friends change theirs constantly. Unlike software for nearby restaurants, software to find nearby friends has this requirement: it must process over 100,000 location updates per second.2 Likewise, the App Store Accountability Act (“the Act”) imposes requirements that differ materially from those of the law in FSC (“H.B. 1181”).
Comparing the Basic Requirements of Both Laws
NOTE: Since there are multiple versions of the App Store Accountability Act, I’ve focused on the requirements that are common to most, if not all, versions of this act.
5. Based on my understanding of H.B. 1181, it contains these requirements:
Pornographic sites must block users who are under 18.
Pornographic sites must verify each user’s age category: under 18, or 18 and older.
6. Based on my understanding of the Act, it contains these requirements:
App stores may allow users who are under 18, but they must obtain parental consent for each app that an underage user wants to download.
App stores must verify each user’s age category: under 13, 13 to 15, 16 to 17, or 18 and older.
App stores must verify parental consent for minors.
Under 18 vs. Granular Age Categories
7. Facial age estimation is a modern method of age verification. Here is how it works: a webcam captures a short video (or an image) of the user’s face, and an algorithm uses that video to estimate the person’s age. Once the estimate is produced, the video can be deleted, alleviating privacy concerns. This is known as a biometric method of age verification.
8. Suppose that an engineer were designing an age verification system for pornographic sites; the system only needs to place each user into one of two age categories: under 18, or 18 and older.
9. An engineer might first consider facial age estimation. This solution would produce the correct age category for most users, though a secondary method would be needed for 18- to 20-year-olds. In its whitepaper on facial age estimation, Yoti reports that it can reliably determine that 13- to 17-year-olds are under 21, with 99.3% accuracy.
10. According to the Age Verification Providers Association’s amicus brief in FSC, “while biometric age verification cannot perfectly identify a user’s age, it effectively waves in the vast majority of users who are well over 18, leaving potential doubts only as to those between 18 and 21.”
11. But suppose that the requirements are changed: the system must now place each user into one of four age categories: under 13, 13 to 15, 16 to 17, or 18 and older.
12. Facial age estimation is not perfect. If the margin of error is ±1 year, a user estimated to be 15 could actually be 14 or 16; 16 is a different age category. Per Yoti’s whitepaper, their facial age estimation has a mean absolute error of 1.1 years for 13- to 17-year-olds.3 For many minors, the margin of error would cross the boundaries of an age category.
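To make the boundary problem concrete, here is a minimal sketch (in TypeScript) of how an estimate with an error band maps onto the Act’s four age categories. The ±1-year band is an illustrative round number, not Yoti’s published figure; the category boundaries follow the Act as described above.

```typescript
// Illustrative sketch only: the error band is a hypothetical round number,
// not a vendor's published figure.
type Category = "under 13" | "13 to 15" | "16 to 17" | "18 and older";

function category(age: number): Category {
  if (age < 13) return "under 13";
  if (age <= 15) return "13 to 15";
  if (age <= 17) return "16 to 17";
  return "18 and older";
}

// Every category a user could fall into, given an estimate ± an error in whole years.
function possibleCategories(estimate: number, errorYears: number): Set<Category> {
  const result = new Set<Category>();
  for (let age = Math.max(estimate - errorYears, 0); age <= estimate + errorYears; age++) {
    result.add(category(age));
  }
  return result;
}

console.log(possibleCategories(15, 1)); // two categories: "13 to 15" and "16 to 17"
console.log(possibleCategories(40, 1)); // one category: "18 and older"
```

For a binary under-18 rule, the same error band only matters near the single boundary at 18; with four categories, it matters at three boundaries, and many minors sit within a year of one of them.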
13. In the offline world, people often verify their age with a driver’s license, so an engineer could try that option next. One age category, though, is 13 to 15. Many users in that category do not have a driver’s license (or even a learner’s permit).
Blocking Underage Users vs. Allowing Underage Users with Parental Consent
14. Suppose that a new requirement is added: the system must verify parental consent for users who are under 18. The key challenge here is verifying the parental relationship between an adult and a minor. (Once that relationship is established, verifying parental consent is straightforward.4)
15. Facial age estimation can only verify an age, not a parental relationship. Likewise, a government ID can only verify an identity and an age, not a parental relationship.5
16. A birth certificate, by contrast, does list a parent, but it can only verify who had custody at the time of birth. Custody can change: a child may be adopted, parents may divorce, a minor may enter foster care, or a minor may be emancipated.
17. In his testimony for Arkansas’s age verification law, Tony Allen of the Age Verification Providers Association observed that the “biggest challenge” is establishing the parental relationship: “It’s easy to say that this person who is giving the consent is, let’s say, in their 40s, versus the person that’s asking for the consent being under 18. But actually establishing that that is a parent or a legal guardian, that’s the challenge with those processes.”
18. Age is a self-contained fact. I understand that a parental relationship, though, is a government-dependent fact; for example, the government decides who gets custody in a divorce, or whether an adult can adopt a minor. When an engineer designs a system that verifies a parental relationship, navigating custody law goes beyond what engineering alone can solve.
The Scope and Scale of the Act
19. Verifying age or parental consent is a means to an end. And I understand that the requirements of both laws differ as to their ends (not just their means). The scope of the Act is not limited to pornographic apps.
Assessing the Scope and Scale
20. I understand that this Court is bound by Moody v. NetChoice (2024), which requires a two-step facial analysis: “The first step in the proper facial analysis is to assess the state laws’ scope.” Based on my understanding of the Act, the scope includes (but may not be limited to) every app that is downloaded from Apple and Google’s app stores.
21. The scale here is defined by both volume and variety. In terms of volume, Apple reported in 2022 that its app store had nearly 1.8 million apps. Estimates indicate that Google’s app store also has millions of apps. In terms of variety, Wired published a story in 2010 about an Apple slogan “so catchy that it's endlessly parroted by the media”: “There’s an app for that.” One TV ad featured an app to check snow conditions on the mountain, an app to count the calories in your lunch, and an app to find where you parked your car.
22. Per Moody, “The next order of business is to decide which of the laws’ applications violate the First Amendment, and to measure them against the rest.” In terms of volume, analyzing millions of apps on an individual basis could go beyond this Court’s capabilities.
23. In terms of variety, while many pornographic sites exist, one can make generalizations that apply broadly to all pornographic sites. An app ecosystem where apps can do just about anything, however, is also an ecosystem that resists meaningful generalizations.
24. It may be possible to anecdotally identify apps that are inappropriate for minors. But even assuming arguendo that the Act is constitutional as applied to these apps, the plural of anecdote is not data. Measuring an ecosystem containing millions of different apps—including both constitutional and unconstitutional applications—is a very different task.
Managing Complexity and Scale
25. At 65 MPH, can a car drive 300 miles from St. Louis to Chicago in 4.5 hours? Consider real-world conditions: traffic, red lights, and roads with speed limits under 65 MPH. The answer is no. Even under ideal conditions—no traffic, no red lights, and speed limits of 65 MPH or greater—the drive would take over 4.5 hours (300 miles / 65 MPH ≈ 4.62 hours).
26. Does the Act’s plainly legitimate sweep include social media apps? This Court may have to confront such constitutional questions not just for social media apps, but for many types of apps. Here, this Court could apply Moody’s framework under ideal conditions for the State—assuming arguendo that the Act’s legitimate sweep includes social media apps.
27. Suppose that—even under ideal conditions for the State—the number of unconstitutional applications grows beyond what Moody allows. In that case, this Court need not consider real-world conditions, such as deciding whether the Act is constitutional as applied to social media apps.
28. Still, analyzing this ecosystem can be difficult without concrete examples. Social media—which relies on user-generated content—is one example. This ecosystem also includes apps that rely on curated third-party content, such as the collection of movies on the Netflix app, and apps that rely on first-party content, such as The New York Times app.
29. Apps that rely on user-generated content do not just include social media apps. They also include user-generated encyclopedias, such as Wikipedia’s app, and user-generated reviews, such as Yelp’s app.
30. This ecosystem includes non-violent video games such as Solitaire, Candy Crush, and Angry Birds. It includes Bible apps and other religious apps. And it includes an app to check snow conditions on the mountain, an app to count the calories in your lunch, and an app to find where you parked your car.
Sharing Age Data with Apps: Cybersecurity and Privacy Risks
31. Based on my understanding of the Act, it contains these requirements:
App stores must share the (verified) age category of a user with apps. For minors, app stores must also share the parental consent status.
Apps must verify the age category and parental consent for users, using the data provided by app stores.
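To make these requirements concrete, here is a rough sketch of the handoff as I read the Act. The field and function names are hypothetical; the Act does not prescribe a specific data format.

```typescript
// Hypothetical shape of the signal an app store would pass to an app under the Act.
interface AgeSignal {
  ageCategory: "under 13" | "13 to 15" | "16 to 17" | "18 and older";
  parentalConsent: boolean | null; // consent status for minors; null for adults
}

// App-side check: the app relies on the store-provided signal
// rather than verifying age itself.
function userMayProceed(signal: AgeSignal): boolean {
  return signal.ageCategory === "18 and older" || signal.parentalConsent === true;
}
```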
The “Zero Trust” Mindset
32. A core cybersecurity challenge is the defender’s dilemma: attackers can strike from anywhere using any method. If hackers are trying to penetrate a corporate network, they could exploit a security vulnerability, or they could embed malware in a PDF and email it to any employee. For the attackers to succeed, the defenders only need to fail once.
33. If the corporate network is a castle, attackers often infiltrate this castle. John Reed Stark, former chief of the SEC’s Office of Internet Enforcement, once said, “Cybersecurity is an oxymoron and a data breach is inevitable.” Thus, the industry developed a “zero trust” paradigm: don’t assume it’s safe inside the castle.
34. If the app store is a castle, both Apple and Google have an app review process to ensure the land inside their castle is safe. This process is not perfect. In 2024, a fraudulent app impersonating LastPass, a popular password manager, was discovered in Apple’s app store. In 2025, the Tea Dating Advice app suffered a data breach that exposed users’ selfies and driver’s licenses.
35. The defender’s dilemma also applies to app stores, but Apple and Google must defend a landscape with millions of different apps, a landscape much larger than a corporate network. And inevitably, some bad apps find ways to infiltrate the castle. Thus, the same “zero trust” principle applies: don’t trust an app just because it’s inside the castle walls of the app store.
Best Practices for Sensitive Data
36. For age data, a useful frame of reference is location data; both are sensitive data. The McDonald’s app wants your location to find nearby restaurants, but data brokers (companies that collect personal data and sell it) also want your location. And parents may object if a browser broadcasts their kid’s location to every site they visit. Thus, the tech industry converged on this principle: do not share location data without the user’s permission.
37. For websites, the World Wide Web Consortium (W3C), a standards body, published a geolocation standard that “requires express permission from an end-user before any location data is shared.” For apps, iOS, Apple’s operating system for iPhones, only shares location with an app with the user’s permission. Likewise, Android, Google’s operating system for smartphones, only shares location with an app with the user’s permission.
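On the web, that permission requirement is visible in the API itself. The snippet below uses the standard Geolocation API; the browser will not hand any coordinates to the page until the user grants permission at the prompt.

```typescript
// Standard browser Geolocation API: calling getCurrentPosition triggers a
// permission prompt, and the page receives coordinates only if the user allows it.
navigator.geolocation.getCurrentPosition(
  (position) => {
    console.log("granted:", position.coords.latitude, position.coords.longitude);
  },
  (error) => {
    console.log("denied or unavailable:", error.message);
  },
);
```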
38. A similar use/abuse duality exists for age data. Apps can use it to enforce age restrictions. However, data brokers also want this data, and predators could use it to target children. In February 2025, Apple announced plans to share the age category of child accounts with apps.6 However, it would do so “if and only if parents decide to allow this information to be shared, and they can also disable sharing if they change their mind.”
39. Compared to Apple’s approach, the Act lacks a key privacy protection: it forces app stores to share age data with apps without a parent’s permission.
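To illustrate the difference, here is a minimal sketch of the two designs. The names and data shapes are hypothetical (this is not Apple’s or Google’s actual API); the point is only that one design shares nothing by default, while the other shares the age category with every app.

```typescript
// Hypothetical sketch; not Apple's or Google's actual API.
interface ChildAccount {
  ageCategory: "under 13" | "13 to 15" | "16 to 17";
  sharingAllowedFor: Set<string>; // app IDs a parent has approved for age sharing
}

// Apple's announced approach, as I understand it: share only if a parent opted in.
function ageCategoryWithParentalGate(account: ChildAccount, appId: string): string | null {
  return account.sharingAllowedFor.has(appId) ? account.ageCategory : null;
}

// The Act, as I read it: the verified age category is shared with every app.
function ageCategoryUnderTheAct(account: ChildAccount): string {
  return account.ageCategory;
}
```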
40. From a parent’s perspective, an app developer can reasonably be characterized as a stranger. And not every app developer may work for a reputable tech company. For a parent, the decision to let a child use an app is distinct from the decision to share a child’s personal data—including age data—with an app.
41. Age data can also be combined with other data that the app has already collected. For example, when the Act provides age data to apps that use location data, it introduces this risk: it tells strangers which users are kids, and where those kids are located.
Comparing Incentive Structures to Protect Privacy
42. Based on my understanding, the Act also forbids apps from sharing age data with unaffiliated third parties. In other words, the Act forces app stores to share age data with apps without asking a parent for permission, but then tells apps not to misuse this data.7
43. If an app misuses this data, someone must catch them. Here, app stores and law enforcement face a vast, ever-changing landscape with all three Vs of Big Data: volume, variety, and velocity. For volume and variety, millions of different apps exist. For velocity, many app developers regularly update their apps; app stores must review a steady stream of updates.
44. Even if an app is caught, the app developer, for example, could live in China and have ties to the Chinese military.8 In August 2025, security researchers at Comparitech found 10 VPN (virtual private network) apps that communicated with Russian domains and 6 that communicated with Chinese domains. Earlier research from the Tech Transparency Project traced multiple VPN apps back to QiHoo 360, a Chinese military company.
45. In FSC, I understand that the Court concluded that pornographic websites would “have every incentive to assure users of their privacy.” Here, though, it would not be prudent to assume that millions of different apps would all be incentivized to protect privacy—especially in the case of Chinese or Russian apps.
Data Minimization
46. Another privacy principle is to only collect or share data where there is a specific, legitimate purpose for using that data. (Nonetheless, collecting or sharing a large amount of data still presents privacy risks, even with a specific, legitimate purpose.)
47. The first such purpose is denying access to an app. But this purpose can be accomplished without sharing age data with apps. If the burden of verifying age and parental consent is shifted to the operating system (OS), the OS can block an app from starting if a parent blocks that app.9,10
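A minimal sketch of that alternative: the operating system enforces the parent’s decision at launch time, so no age data needs to reach the app at all. The names below are hypothetical; this is not iOS’s or Android’s actual interface.

```typescript
// Hypothetical sketch of OS-level enforcement; not an actual iOS or Android API.
const appsBlockedByParent = new Set<string>(["com.example.chat"]); // set via parental controls

function launchApp(appId: string): void {
  if (appsBlockedByParent.has(appId)) {
    // The OS refuses to start the app; no age data ever reaches the app.
    throw new Error(`${appId} is blocked by a parent`);
  }
  console.log(`launching ${appId}`);
}
```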
48. Second, apps could build child safety features that depend on an age category. But not all apps may build such features. Even for those that do, Apple or Google may want to first verify that the app has a specific, legitimate purpose to use an age category. The Act, by contrast, forces app stores to unconditionally share an age category with every app.
49. There exists, however, a third purpose. I understand that some child safety regulations are triggered when a site or app has actual knowledge of a user’s age, such as certain provisions in the Children’s Online Privacy Protection Act.11 Such laws can create an incentive structure where sites and apps can reduce regulatory risk by avoiding actual knowledge of their users’ ages. Child safety advocates have often criticized the actual knowledge standard; some even claim it is “almost impossible to prove in a court of law.”
50. If an app verifies a user’s age, it then has actual knowledge of a user’s age.12 This is merely an idea, though. Legislators must still chart out a path to implement this idea.
51. The Act charts out this path. First, it places the burden of age verification on app stores. Then, it forces app stores to share that verified age category with every app. On the surface, this solution appears to work. Nearly every app has actual knowledge of its users’ ages.
52. Having considered how age data is used to create actual knowledge, the next step is to consider how this same data can be abused. Recall that apps should be untrusted by default; “zero trust” applies. In a world filled with hackers, data brokers, and predators, the Act forces app stores to share age data with millions of untrusted apps—without asking a parent for permission. This is a dangerous path.
Concluding Note
53. Justice Thomas—who authored the majority opinion in FSC—asked the first question at oral argument: “Can age verification systems ever be found constitutional?” I offer no opinions on the constitutionality of age verification as an idea. In the context of a case or controversy, though, I can offer my expertise concerning an implementation of that idea. A different implementation with different requirements would be evaluated differently.
Footnotes

1. Since this is a mock declaration, I’ll focus on the core content and omit formalities like a fancy bio, proper formatting, and boilerplate legal verbiage. You can assume that these could easily be added in if this was a real declaration.
2. These examples were based on the first two chapters of System Design Interview, Volume 2. In the chapter on nearby friends, the estimate was that this system would process 334,000 location updates per second.
3. The margin of error is different from the mean absolute error. But if, for example, the mean absolute error is 1 year, the margin of error for a 95% confidence interval is probably significantly larger than ±1 year. (If the errors were roughly normally distributed, a mean absolute error of 1 year would correspond to a standard deviation of about 1.25 years, and to a 95% interval of roughly ±2.5 years.)
4. Likewise, parental controls like Apple’s Family Sharing and Google’s Family Link are relatively easy to build if you assume that the person who claims to be the child’s parent is in fact the child’s parent.
5. Some have suggested matching last names or addresses on IDs. However, last names may not match if the wife kept her maiden name. Minors without a driver’s license could use a passport, but passports do not have an address. The address on a driver’s license may not match if a family moves within a state.
6. To the best of my knowledge, this is a declared age; Apple does not verify this age.
7. I understand that the Act also requires that app stores encrypt this data before it is shared with apps. Encryption, however, only prevents a third party from intercepting this data. If the app store does not trust the app—at least until a parent grants permission to share age data—encryption cannot solve that problem.
8. The app developer would also have to be identified first. This process depends in part on what information an app store collects from developers, and what steps the app store takes to verify that information. Bad actors may provide false information.
9. On Android devices, Samsung and Epic Games would no longer need to verify age or parental consent for their app stores. Only Google would need to verify age and parental consent; it owns the Android OS.
10. An operating system may also need to share this data with an app store, but managing data sharing with a few app stores is much simpler than managing data sharing with millions of different apps.
11. I understand that these provisions in COPPA also apply to sites or services that are “directed to children,” where a child is “under the age of 13.” However, a site may state that users must be at least 13 years old, but then not verify age at sign-up.
12. However, users could substitute websites for apps in some cases. For example, they could use facebook.com instead of the Facebook app.