The App Store Accountability Act: A Cybersecurity Disaster
Forcing app stores to share age data with apps is a very bad idea.
The App Store Accountability Act will force app stores to share your kid’s age with EVERY app. What could possibly go wrong? A lot. This legislation is a cybersecurity disaster. It is as if we’re doing surgery without a doctor in the room.
In the annals of bad tech bills, few are as infamous as the Stop Online Piracy Act of 2011. Then-Rep. Jason Chaffetz famously said, “We’re going to do surgery on the Internet, and we haven’t had a doctor in the room tell us how we’re going to change the organs. We’re basically going to reconfigure the Internet and how it’s going to work without bringing in the nerds.”
Apps are as ubiquitous today as the Internet was back in 2011, and the App Store Accountability Act—an age verification bill for app stores—would perform surgery on not just app stores, but the entire app ecosystem. But who are the surgeons here, and do they know what they’re doing?
As part of its Family First Technology Initiative, the Institute for Family Studies (IFS) has proposed model legislation for the App Store Accountability Act. At the end of last year, bills based on this model legislation were introduced in both the House of Representatives and the Senate. Recently, South Carolina and Utah introduced bills based on this model.
But will the IFS’s proposed surgery work? Frankly, this engineer—one who has supported age verification for both social media and adult sites—would describe it as medical malpractice.
The Adversarial Mindset
Anyone can create an app—good actors and bad actors alike. The App Store Accountability Act would force app stores to share data about your child’s age with all apps—including the bad actors.
When the nerds build products, they have to apply the adversarial mindset. They have to consider not just how regular users will use their product, but also how hackers will try to abuse it. Security is a de facto requirement for software—unless you want to get hacked.
At its core, cybersecurity is about that never-ending war between attackers and defenders. And it would be fair to say that the Internet can be a very dark place—especially in the corners of the dark web where many hackers reside.
If you’re going to do surgery on the entire app ecosystem, you cannot ignore this war. You, too, must apply the adversarial mindset and consider how attackers might exploit the changes that your legislation will mandate. Security is a de facto requirement for your legislation. (And even if you add privacy protections to your law, suffice it to say that bad actors often don’t obey the law.)
Imagine that bad actors created an app that impersonated Pokémon Go. (Last year, an app impersonating LastPass, a popular password management tool, found its way into Apple’s app store.) As a parent, you give your kids permission to use this app—as you think it is the real thing, not an impostor—giving this app access to real-time location. And thanks to the App Store Accountability Act, app stores are forced to share the age category (e.g., 13-15) of every user with this app.1
In essence, we’ve given predators a database containing the real-time location of kids. And that’s only one of many examples of what could possibly go wrong. A more conventional concern would be data brokers—who don’t always act above-board—acquiring this age data and combining it with all the other information they’ve collected about your kid.
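To make concrete how little effort this would take, here is a minimal sketch in Python of what any app, benign or malicious, could do once it receives the mandated age signal alongside the location permission it already holds. The payload format, field names, and the on_install_signal handler are my own assumptions for illustration; the model legislation does not specify how the signal would be delivered.

```python
# Hypothetical sketch: combining the mandated age signal with location data.
# The JSON payload shape and function names are assumptions, not anything
# specified by the App Store Accountability Act.
import json
import sqlite3
import time

def on_install_signal(signal_json: str, location: tuple) -> None:
    """Store the age category next to data the app already collects."""
    signal = json.loads(signal_json)   # e.g. {"age_category": "13-15"}
    lat, lon = location                # any app with a location permission has this
    db = sqlite3.connect("analytics.db")
    db.execute(
        "CREATE TABLE IF NOT EXISTS users (ts REAL, age_category TEXT, lat REAL, lon REAL)"
    )
    db.execute(
        "INSERT INTO users VALUES (?, ?, ?, ?)",
        (time.time(), signal.get("age_category"), lat, lon),
    )
    db.commit()

# The app store pushes the age category at download time; the app simply joins
# it with the real-time location it was already granted.
on_install_signal('{"age_category": "13-15"}', (40.7128, -74.0060))
```

A few lines of ordinary analytics-style code is all it takes. There is no sophisticated hacking step between “the app receives the age signal” and “the age data sits in a database next to real-time location data.”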
What is stopping the attackers here? The only barrier is the app review process that occurs before an app is added to the app store. And if you believe that Apple and Google cannot be trusted to protect our kids, it does seem odd that you would trust their app review process—and would assume that this process will stop the bad actors who are trying to get their apps into the app store.
But regardless of how you view Apple and Google, this mandate to force app stores to share age data with apps is extremely unwise from a cybersecurity perspective.
Zero Trust
What level of trust should we grant to apps in the app store? From a cybersecurity perspective, experience has shown that you should default to zero trust. In the workplace context, the zero trust mindset assumes that attackers will find a way to breach your corporate network—no matter how good the defenders on your IT team are at securing it—and designs cybersecurity with that in mind. (The zero-trust mindset is not meant as an indictment of your IT team.)
When it comes to app stores, that same zero-trust default should apply. It would be reasonable to assume that—no matter how good Apple and Google’s app review process is—bad actors will find their way inside their app stores. (This is not meant as an indictment of Apple or Google.) And in a zero-trust environment, does it make sense to share age data with untrusted apps? Absolutely not.
The key architects of this model legislation, however, seem to think that it is secure because it uses encryption. In an op-ed, they wrote, “The app store could then transmit the age of the minor user to apps upon download via an anonymous, encrypted signal that indicates whether the user is age-eligible for their product, or not.” Here, it would help to explain what encryption can and cannot do.
If two parties that trust each other are communicating with each other, encryption can ensure that attackers cannot eavesdrop and intercept their communications. However, if legislation is forcing one party to send information to an untrusted party—such as forcing app stores to share age data with an untrusted app—encryption won’t solve that problem.
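To make the distinction concrete, here is a minimal sketch in Python. It assumes the “anonymous, encrypted signal” is implemented as a symmetric-encrypted payload; the payload format, the shared key, and the use of the third-party cryptography package are my own assumptions, since the model legislation does not specify a mechanism.

```python
# Minimal sketch: encryption protects data in transit, not from the recipient.
# Assumes a symmetric key shared between the app store and the app; requires
# the third-party `cryptography` package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # shared between the app store and the app
fernet = Fernet(key)

# The app store encrypts the age signal before transmitting it.
token = fernet.encrypt(b'{"age_category": "13-15"}')

# An eavesdropper who intercepts `token` without the key learns nothing.
# But the recipient (the app itself) holds the key by design:
plaintext = fernet.decrypt(token)
print(plaintext)  # b'{"age_category": "13-15"}'

# The channel was secure; the age data still ends up in the app's hands.
# If that app is a bad actor, encryption delivered the data straight to it.
```

The same holds if the signal rides over TLS or any other encrypted channel: encryption answers “can someone in the middle read this?”, not “should the party on the other end have this data at all?”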
The Defender’s Dilemma
Another dilemma that we face here is the defender’s dilemma. Defenders tend to have normal(ish) working hours, a spouse and kids, a social life outside of work, etc. Attackers tend to have way too much free time on their hands (and some may live in their mother’s basement), but they are damn good at hacking. Defenders often have to defend a large and complex landscape, and they have to be right 100% of the time. Attackers can attack from anywhere on that landscape, and they only need to be right once. Attackers often outnumber defenders.
App stores in particular have to defend a very large and complex landscape—one with all three Vs of Big Data: volume, velocity, and variety. Apple and Google each have about 2 million apps in their app stores (volume). They have to review not just the original app, but a constant stream of app updates (velocity). And the apps they review can be for just about anything (variety).
With all that in mind, how does the app review process manage to keep bad actors out of the app store? The short answer is that it doesn’t always work, and it’s not hard to find a treasure trove of stories where it failed. I say this not to dunk on Apple and Google—despite my misgivings about them—as this would be a very challenging problem for any tech company to solve.
You can certainly understand why the core assumption of “zero trust” makes sense: that some attackers will find their way inside app stores—no matter how good the app review process is.
Minimizing Surface Area
One consequence of the defender’s dilemma is that surface area matters. How large of a surface area are you asking defenders to protect? It’s clear that the IFS did not think enough about that.
The earlier example I gave—impersonating Pokémon Go—is only one of many possible paths. To combine real-time location data with age data, you don’t need to impersonate Pokémon Go; any app that relies on real-time location data is a potential target. Or, instead of impersonating an app, you could build a Trojan Horse app that looks and behaves like a normal app on the outside, but whose real purpose is to help predators.
And that’s not the only way this age data could be misused. As we mentioned earlier, a more conventional concern would be data brokers acquiring this data. A recommendation algorithm run by bad actors could use that age data to help pair kids with predators.
That’s already a very large surface area, and these are mostly examples I came up with off the top of my head. Imagine what bad actors—who have way more time than me—could come up with. Trying to poke a few holes in some of these examples is a fool’s errand, as the hackers—who are smarter than both you and me—could certainly figure out ways to patch those holes, and could also come up with additional examples.
Simply put, forcing app stores to share age data with apps is indefensible—in more ways than one.
1. Creating an API where an app could do a parental consent check with the app store would also reveal the age of a user. If the parental consent check fails, you would then know that the user is a minor. (And of course, bad actors may ignore a failed parental consent check.)
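Here is a minimal sketch of that leak in Python, assuming a hypothetical response shape for the consent-check API; the names and outcomes are illustrative, not anything specified in the model legislation.

```python
# Hypothetical sketch of a consent-check response; the enum values and
# function names are assumptions for illustration.
from enum import Enum

class ConsentResult(Enum):
    NOT_REQUIRED = "not_required"  # user is an adult; no parental consent needed
    GRANTED = "granted"            # a parent approved the request
    DENIED = "denied"              # a parent declined the request

def infer_minor_status(result: ConsentResult) -> bool:
    # The API never returns an age, but any outcome other than NOT_REQUIRED
    # means a parent had to be asked, which tells the app the user is a minor.
    return result is not ConsentResult.NOT_REQUIRED

print(infer_minor_status(ConsentResult.DENIED))        # True: a minor whose parent said no
print(infer_minor_status(ConsentResult.NOT_REQUIRED))  # False: an adult
```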