Holding the Innocent Accountable
"Hold Big Tech accountable" is a slogan about punishing the guilty. It does not excuse punishing the innocent.
Why do you need to verify your age before you can download a Bible app? Sure, app stores could do more to protect kids, but does that really justify a bill—the App Store Accountability Act—that also “protects” you from a Bible app?
Conduct, Not Content
“Does this bill violate the First Amendment?”
When a legislator proposes the App Store Accountability Act—or really any bill to protect kids online—a small army of experts (or “experts”) will show up at their door, often armed with arguments that the bill violates the First Amendment.
The legislator may justifiably be skeptical of these hostile experts. But the experts can then add that the courts have sided with them. NetChoice (a trade association for Big Tech) has sued to block age verification laws for social media in Arkansas, Ohio, Mississippi, and Utah. In all four states, the judge sided with NetChoice.
Thus, the legislator begins to question whether their law is unconstitutional. That question is too imprecise for my tastes, though. I would instead ask more precise questions—ones that are akin to what a practicing lawyer would ask.
First, what level of scrutiny applies: rational-basis review, intermediate scrutiny, or strict scrutiny? Second, does the law survive that level of scrutiny?1
And since we’re predicting what courts would do in the real world, let’s examine a real-world case: TikTok v. Garland (2025). Here, the Supreme Court considered a law that forces TikTok’s Chinese parent company, ByteDance, to divest TikTok—and bans TikTok absent divestiture.
For many experts, their time to shine had come. On one side, Jennifer Huddleston of the Cato Institute confidently proclaimed that the law was unconstitutional under strict scrutiny. On the other side, Joel Thayer of the Digital Progress Institute confidently proclaimed that there are “no [First Amendment] concerns here,” as “[t]he bill regulates TikTok's conduct, not content.” (For what it’s worth, this engineer predicted that courts would uphold the law under intermediate scrutiny.)
Whose elaborate legal theories would survive contact with reality? When the Supreme Court handed down its unanimous decision, it upheld the law under intermediate scrutiny—a standard used when some First Amendment concerns exist.2
What about the App Store Accountability Act? Friendly experts have declared that this bill regulates conduct, not content—describing it as a bill that regulates contracts with minors, not as a content bill. I can spot the logical fallacy there.
That argument creates a false dichotomy. It suggests that we can cleanly separate laws into conduct laws—which are subject to rational-basis review—and content laws—which are subject to strict scrutiny. In the real world, however, many laws fall somewhere in between a conduct law and a content law.
The true choice courts must make is not between strict scrutiny and rational-basis review, but between strict and intermediate scrutiny. As the Supreme Court said in Moody v. NetChoice (2024), “In the usual First Amendment case, we must decide whether to apply strict or intermediate scrutiny.”
If an age verification law touches social media, it probably qualifies as the “usual” case. The App Store Accountability Act touches social media apps (and other apps).
Of course, some experts—armed with motivated reasoning—will then argue that this law is the “unusual” case. (It’s not.) Ohio already tried to claim that age verification laws are contract laws, not content laws. Judge Marbley rejected that claim: “this Court is unaware of a ‘contract exception’ to the First Amendment.”
And if these experts still won’t back down, I would then ask: what if this were the usual case? Could the act survive even intermediate scrutiny?
If the App Store Accountability Act could survive intermediate scrutiny, then perhaps there’s little harm in shooting your shot for rational-basis review. But if the act can’t survive intermediate scrutiny, then your entire legal argument depends on the courts applying rational-basis review. Engineers would call that a single point of failure.
And that single point of failure collapses the moment a court decides that this is the usual case, where rational-basis review does not apply. An act built that way is poorly engineered.
Thus, going forward, we will assume that this is the usual case where “we must decide whether to apply strict or intermediate scrutiny.”
Holding Dreamwidth Accountable
In many cases, age verification bills easily pass the legislature, but fail to survive contact with reality in the courts. Before we jump into the App Store Accountability Act, we should first study those failures in the realm of social media.
After Utah passed its age verification law for social media, NetChoice sued Utah in December 2023—and the Foundation for Individual Rights and Expression followed suit in January 2024. NetChoice had already convinced Judge Brooks to block a similar law in Arkansas in August 2023. After suing Utah, NetChoice also convinced Judge Marbley to block a similar law in Ohio in February 2024.
Needless to say, Utah legislators could not ignore the legal threat from NetChoice as they contemplated their next steps. Friendly experts, led by the Institute for Family Studies (IFS), published a coalition letter in February 2024 with this message: “Don’t back down now.” In their view, “Utah is retreating at the very moment of Big Tech’s vulnerability.”
Regarding the legal concerns, they wrote, “attempts to forecast what . . . will survive judicial review are highly speculative.” They declared that Utah’s law is “not a restriction on speech (though Big Tech lobbyists have been arguing otherwise),” as it “regulates minors’ right to contract for certain goods and services.”
(This letter was published after Judge Marbley in Ohio had written that “this Court is unaware of a ‘contract exception’ to the First Amendment.”)
Meanwhile, I sought to study the words of Judge Brooks and Judge Marbley. Even if you do not agree with them 100%, you must still respect their authority as judges. And I suspected that those laws may have been poorly engineered.
Eventually, I zeroed in on one key flaw: a bad definition of social media. The Internet contains over one billion websites; how do you accurately classify each one as “social media” or “not social media”? Speaking from experience, it’s not as easy as it seems.
As I warned in City Journal: “Legislators in both parties may want to hold Big Tech accountable, but their reforms will go for naught unless they sweat the details of their definition.” Specifically, I focused on three common anti-patterns: content-based exceptions, overinclusive definitions, and vague definitions.
In some circles, the reaction to my fixation on the definition was not exactly positive. Was this a pointless “crusade”? In other circles, the fixation was simply ignored.
Meanwhile, NetChoice was litigating social media laws in Mississippi and Texas. Both states followed a familiar pattern with their definition. The first part of the definition was overinclusive; it included sites that weren’t social media. Thus, the second part of the definition contained exceptions that would exclude those sites.
NetChoice, however, argued that some exceptions were content-based. Judge Ozerden in Mississippi and Judge Pitman in Texas both agreed, blocking those laws, respectively, in July and August 2024. That same objection—content-based exceptions—had also previously persuaded Judge Brooks and Judge Marbley.
What goes wrong when a definition is “content-based”? Content-based laws are subject to strict scrutiny (while content-neutral laws are only subject to intermediate scrutiny). And strict scrutiny tends to be “strict in theory, fatal in fact.” In short, content-based exceptions tend to be fatal in fact—and were fatal in four states.
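Since this essay keeps reaching for engineering metaphors, the scrutiny pipeline above can be sketched as code. This is a toy model for illustration only; the function names and outcome strings are my own simplifications, not legal doctrine:

```python
# Toy model of the scrutiny framework discussed above.
# All names and outcomes are illustrative simplifications, not legal advice.

def applicable_scrutiny(content_based: bool) -> str:
    """In the 'usual' First Amendment case (per Moody v. NetChoice),
    the choice is between strict and intermediate scrutiny."""
    return "strict" if content_based else "intermediate"

def likely_fate(scrutiny: str) -> str:
    # Strict scrutiny tends to be "strict in theory, fatal in fact."
    if scrutiny == "strict":
        return "likely fatal"
    # Intermediate scrutiny: survivable, but only if the law is well engineered.
    return "survivable"

# A definition with content-based exceptions draws strict scrutiny:
print(likely_fate(applicable_scrutiny(content_based=True)))  # likely fatal
```

In this caricature, the content-based exceptions in Arkansas, Ohio, Mississippi, and Texas all took the `content_based=True` branch, which is why all four laws met the same fate.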
To return to Utah: faced with NetChoice’s legal threat, legislators decided to stay the course, but they could not walk the exact same path. And although they acted before the court decisions in Mississippi and Texas came down, they nonetheless sensed that a definition with 20 exceptions would be a vulnerability, so they rewrote it.
This time, NetChoice could not convince the judge that the definition had a content-based exception. They did, however, convince the judge that the definition was overinclusive, noting that it included Dreamwidth (a blogging service). NetChoice won again, and Judge Shelby blocked Utah’s law in September 2024.
The slogan was “hold Big Tech accountable,” not “hold Dreamwidth accountable.” As Judge Shelby noted, “Dreamwidth is distinguishable in form and purpose from the likes of traditional social media platforms—say, Facebook and X.” Dreamwidth was not exactly a guilty party here. Courts will not give you a pass for punishing the innocent just because your legislation also punishes the guilty.
The Full Scope of a Law
Perhaps the time had come to reflect back and carefully study these failures before charting a path forward. The Institute for Family Studies, however, charged forward—this time with new model legislation for the App Store Accountability Act.
As we said earlier, we will assume that this is the usual case where “we must decide whether to apply strict or intermediate scrutiny.” And for the sake of argument, let’s assume that intermediate scrutiny applies.
Does the law survive intermediate scrutiny? Experts on both sides will bring out elaborate legal theories—and both would be willing to litigate them all the way to the Supreme Court. But before both sides take their legal battle that far, I propose that we first examine what the Supreme Court said the last time a battle like that took place.
Let us conduct the analysis that the Supreme Court laid out in Moody v. NetChoice (2024): “The first step in the proper facial analysis is to assess the state laws’ scope.”
In an op-ed, the IFS and others claimed that their legislation merely “requires brick-and-mortar stores to check ID for purchases of age-restricted products like cigarettes and alcohol.” This fatally flawed analogy grossly misstates the scope of their act.
The correct analogy would be checking ID before you can even enter a brick-and-mortar store, such as a Walmart or a 7-Eleven. To use Utah’s legislation as an example, age verification does not kick in when a user downloads certain apps that are harmful to minors, but when the user “creates an account with the app store provider.”3
Do you want to download a Bible app? You must verify your age—and get parental consent if you’re a minor. The Microsoft Word app? You must verify your age—and get parental consent if you’re a minor. The Dreamwidth app? You must verify your age—and get parental consent if you’re a minor.
Since defining social media was such a vexing problem, some may have thought we could sidestep it by pivoting to app stores. Not so. The purpose of that definition was to control the scope of the law; we could tune the law’s scope by tuning the definition. The pivot to app stores, however, actually broadened the scope.
Returning to Moody, “[t]he next order of business is to decide which of the laws’ applications violate the First Amendment, and to measure them against the rest.” From there, “[t]he question is whether ‘a substantial number of [the law’s] applications are unconstitutional, judged in relation to the statute’s plainly legitimate sweep.’ ”
Here, the IFS focused on the “heartland applications” of this act, such as social media. But as we look at the full scope of the law, we can find a steady stream of examples—such as a Bible app or the Microsoft Word app—where the age verification and parental consent mandates do not even survive intermediate scrutiny.4
Even if the act is constitutional for those “heartland applications,” we can also identify a substantial illegitimate sweep—rendering the act unconstitutional.
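In the same spirit, the Moody facial-challenge test can be caricatured as a ratio check. A hedged sketch: the Court's "substantial number" threshold is qualitative, so the numeric cutoff and the example classifications below are invented for illustration:

```python
# Toy sketch of Moody's facial test: weigh the law's unconstitutional
# applications against its "plainly legitimate sweep." The 0.5 cutoff is
# invented; courts apply no fixed numeric threshold.

def facially_unconstitutional(applications: dict[str, bool],
                              substantial_fraction: float = 0.5) -> bool:
    """applications maps each application of the law to whether that
    application violates the First Amendment (hypothetical inputs)."""
    violations = sum(applications.values())
    return violations / len(applications) >= substantial_fraction

# Hypothetical classifications, granting the "heartland applications" arguendo:
apps = {
    "social media app": False,    # assume constitutional, for the sake of argument
    "Bible app": True,            # fails even intermediate scrutiny
    "Microsoft Word app": True,
    "Dreamwidth app": True,
}
print(facially_unconstitutional(apps))  # 3 of 4 applications fail: True
```

The point of the sketch is structural: even granting the heartland applications, a broad enough illegitimate sweep flips the overall verdict.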
If requiring age verification for Dreamwidth rendered a law overinclusive, then imagine how overinclusive a law must be if it requires age verification for a Bible app.
Footnotes

1. But if a law can survive strict scrutiny, it is not necessary to decide which level of scrutiny applies. And if a law would not survive intermediate scrutiny (and we’ve ruled out rational-basis review), it is not necessary to decide whether intermediate or strict scrutiny applies.

2. While the Supreme Court assumed without deciding that heightened scrutiny applies (i.e., either intermediate or strict scrutiny applies), the D.C. Circuit unanimously decided that heightened scrutiny applies. In arguing that there are “no [First Amendment] concerns here,” Joel Thayer heavily relied on Arcara v. Cloud Books (1986). That argument fell flat in the D.C. Circuit: “At the outset, we reject the Government’s ambitious argument that this case is akin to Arcara v. Cloud Books, Inc., 478 U.S. 697 (1986), and does not implicate the First Amendment at all.”

3. However, Utah deserves credit for narrowing the scope to mobile devices and mobile apps; the original model legislation applied to any “general purpose computing device.” Had Utah not narrowed the scope, “sudo apt install git” would have triggered age verification; apt would qualify as an app store, and git would qualify as an app.

4. To be more precise, it fails the first part of intermediate scrutiny; I’m not aware of any “important government interest” that justifies age verification—and parental consent for minors—before you download a Bible app.