Define Social Media, Part II: Definitions
Age verification laws for social media have gone 0-5 in legal challenges. What needs to change for these laws to survive?
(If you just want to see the definition of social media, use this link to skip to the end.)
0-5. If your football team went 0-5 (be it a professional or college team), it would be time to shake things up. State legislatures that have passed age verification laws for social media have gone 0-5 in legal challenges. Fortunately, this engineer—who has also served as a tech fellow in Congress—has already been figuring out how to fix it.
In part I, we developed model findings that explained the special characteristics of social media. In part II, we’ll dive into the brass tacks of defining social media.
I. The Problem: One Billion Websites
There are over a billion websites on the Internet, and that does not even include all the apps that have online functionality. For social media (or for any online medium), how do we accurately determine which sites are social media sites and which are not?
If you draft a definition with constitutional defects, a lawyer can certainly spot them. But once the defects are spotted, they still need to be fixed, and fixing them leads back to the core technical challenge: how do you properly classify one billion websites?
A. Technical Challenges
On the surface, the obvious challenge is volume; one billion is a very big number. Beneath the surface, an even bigger challenge exists: variety. Many different types of websites exist, ranging from Facebook to Wikipedia to Netflix.
In the past, attempts to regulate technology were simpler, because the mediums of old were tightly coupled with hardware. Broadcast was defined by an over-the-air signal that an antenna would receive. Cable TV was defined by, well, cable; a cable channel would use “a portion of the electromagnetic frequency spectrum” (see 47 U.S.C. 522(4)).
Online mediums, however, are tied to software—which is much more malleable. A social media site, an app store, and an e-commerce site are all examples of software, but each piece of software yields a radically different medium. The variety of sites available is due in large part to the softness of software.
When writing a definition, including sites that are social media sites is important, but excluding sites that are not is just as important, if not more so. If we are writing an age verification law, we do not want to make people verify their age just to leave a review on Yelp.
Yelp is a “negative example,” an example of a site that is not a social media site. Negative examples will play a critical role in developing the definition. As we develop that definition, this phrase will frequently be used: “necessary but not sufficient.”
We will start with a base definition that includes all one billion websites, and we will then incrementally add pieces to that definition. When we add a new piece, this piece will often be motivated by a negative example that the definition needs to exclude. But even with this piece, we can still find other negative examples that are included in the definition. Thus, while this piece is necessary, it’s not sufficient by itself.
B. Legal Challenges
Laws regulating social media will face heightened scrutiny, but the level of scrutiny depends on whether the law is content-based or content-neutral. To make that determination, courts will look at both what the law does, and who the law applies to.
The definition of social media matters because it determines who the law applies to.
Thus far, state legislatures have gone 0-5 when it comes to writing a content-neutral definition of social media. The most common pitfall here is the exceptions.
If you write a definition with thirteen exceptions like Arkansas did, NetChoice will have an easy time convincing the judge that your definition is content-based. Additionally, a law with thirteen exceptions gives lawyers thirteen chances to shoot the law down by proving that one of those exceptions is content-based.
But in some cases, one well-placed shot is enough to bring a law down. In Mississippi, the judge ruled that an exception for “news, sports, commerce, [or] online video games” was content-based. In Ohio, the judge similarly ruled, “The exceptions to the Act for product review websites and ‘widely recognized’ media outlets, however, are easy to categorize as content based.”
Laws are not content-based, however, if they target one medium but not another. Regulations can be justified by the special characteristics of a medium (see Turner Broadcasting System v. FCC (1994)). The Internet is not a monolithic medium; it contains many distinct mediums, such as social media, search, and e-commerce.
But while a definition cannot use a content-based exception to exclude Yelp, including Yelp creates a different constitutional problem: the definition is not narrowly tailored.
In Utah, the judge criticized the state’s definition of social media because it included Dreamwidth, which is “distinguishable in form and purpose from the likes of traditional social media platforms.” (Dreamwidth is a blogging service that is similar to WordPress, Tumblr, and Medium.)
Finally, what happens if Snapchat cannot determine whether it is a social media site, according to the definition? That creates another constitutional problem: vagueness.
Sometimes, this can get technical; for example, multiple judges (though not every judge) have said that phrases such as “primary purpose” are too vague. When Arkansas’s own witnesses could not agree on what the “primary purpose” of Snapchat is, however, that effectively settled the case; the law was void for vagueness.
C. Context Matters
Before we dive into the brass tacks, there is one last important point: the definition may depend in part on the problem we’re trying to solve and the proposed solution.
As a practical example, why can’t we just use the definition in 42 U.S.C. 1862w(a)(2)?
(5) SOCIAL MEDIA PLATFORM.—The term “social media platform” means a website or internet medium that—
(A) permits a person to become a registered user, establish an account, or create a profile for the purpose of allowing users to create, share, and view user-generated content through such an account or profile;
(B) enables 1 or more users to generate content that can be viewed by other users of the medium; and
(C) primarily serves as a medium for users to interact with content generated by other users of the medium.
Even a quick look shows that many negative examples would be classified as a social media platform under this definition—which means the definition is not narrowly tailored. (Additionally, the term “primarily” might create some issues in terms of vagueness.)
So why haven’t courts struck down this definition? Look at the rest of 42 U.S.C. 1862w. This definition is used in a law that studies social media’s impact on human trafficking. But what if we use this same definition in a law that regulates social media—as opposed to a law that only studies social media? It probably won’t end well.
A couple more examples will further illustrate this point. First, Florida passed an anti-censorship law for social media (SB 7072, 2021), and it also passed an age verification law for social media (HB 3, 2024). If you compare these two laws, you will find some major differences in how each law defines “social media platform.” Since these two laws are solving two very different problems, though, it’s not surprising that their definitions of “social media platform” would be different as well.
Second, let’s look at a popular federal bill, the Kids Online Safety Act (KOSA). KOSA’s definition of “covered platform” is fairly broad and is not limited to social media. A broader definition makes sense, however, when you look at the solution KOSA is proposing. Since KOSA relies on more light-touch (yet effective) regulations, it does make sense to apply those regulations to a broader set of sites, not just social media.
So what’s the context for the definition that we’re about to create? This definition will be used for child safety legislation. Specifically, it will be used for age verification—though this definition can also be reused for other types of child safety legislation.
For an age verification bill, we will obviously need to write a pretty tight definition of social media. The impact of misclassifying Yelp as a social media site is much more severe for a law that requires age verification (as opposed to extra paperwork). However, if our definition is tight enough that it can be used in an age verification bill, this definition can probably be reused for other child safety bills as well.
II. Classifying One Billion Websites
A. Starting Point
The law is not a creative writing discipline. The goal is to clearly explain to people what the law demands of them—demands that often come with severe penalties if they are violated. In this domain, copying the work of others is good!1
If an existing legal term has a well-understood meaning, or if another law or bill has a well-built definition, just reuse it. Don’t reinvent the wheel. The question is not about you—and whether your legislative work is original. The question is about the people who must obey this law—and whether they understand what the law demands of them.
Where do we start with a definition? Let’s start by reusing the definition of “interactive computer service” from a federal law known as Section 230—the law that says platforms are generally not liable for the third-party content they host.
SOCIAL MEDIA PLATFORM.—The term “social media platform” means an interactive computer service that…
INTERACTIVE COMPUTER SERVICE.—The term “interactive computer service” has the meaning given the term in section 230(f)(2) of the Communications Act of 1934 (47 U.S.C. 230(f)(2)).
For reference, here is Section 230’s definition of “interactive computer service”:
INTERACTIVE COMPUTER SERVICE.—The term “interactive computer service” means any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions.
Despite the age of Section 230, which was passed in 1996, this definition of “interactive computer service” is still widely used today. If it’s not broken, don’t fix it.
It also has a modern advantage: it covers both websites and apps. In this definition, it does not matter whether users access Facebook via facebook.com or via the Facebook app; Facebook is an “interactive computer service” either way. (For the sake of convenience, though, we’ll use website/site as a shorthand term for “interactive computer service”—which includes both websites and apps.)
Of course, all one billion websites would qualify as an interactive computer service, so we’ll need to narrow this definition.
B. Content Moderation at Scale is Hard
Section 230 makes a distinction between first-party content and third-party content. When defining social media, that distinction is necessary but not sufficient. Facebook heavily relies on third-party content, but so does Netflix.
1. Big Data and the Three Vs
Back in 2014, Facebook reported that it generates 4 petabytes of data per day (4 petabytes = 4,000,000 gigabytes). By comparison, a typical smartphone has 64 gigabytes of storage, and one of the largest hard drives on the market offers 30 terabytes of storage (30 terabytes = 30,000 gigabytes).
It suffices to say that 4 petabytes of data won’t fit onto a single computer.
Welcome to the world of Big Data. Big Data is defined by the three Vs: volume, velocity, and variety. For example, 4 petabytes is certainly a very large volume of data, and at 4 petabytes per day, the velocity is approximately 46 gigabytes per second.
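The velocity figure is easy to verify with back-of-the-envelope arithmetic. Here is a minimal sketch in Java; the figures come straight from the paragraph above:

```java
// Back-of-the-envelope check of the velocity figure cited above.
public class VelocityCheck {
    public static void main(String[] args) {
        double petabytesPerDay = 4.0;
        double gigabytesPerDay = petabytesPerDay * 1_000_000; // 1 PB = 1,000,000 GB
        double secondsPerDay = 24 * 60 * 60;                  // 86,400 seconds
        System.out.printf("%.1f GB/s%n", gigabytesPerDay / secondsPerDay); // ~46.3 GB/s
    }
}
```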
The three Vs don’t just present technical challenges, either. They also present social challenges. When millions of users are generating content each day, that definitely makes content moderation hard, in terms of both volume and velocity. And in terms of variety, content can cover virtually any topic, and a global social media platform will have content in many different languages.
2. Daily Active Content Providers
So how do we distinguish between Netflix and Facebook? Netflix doesn’t have millions of users who produce content each day. Facebook does. Simply put, scale is the differentiator. After all, our narrative is that content moderation at scale is hard.
In the tech industry, many companies measure their daily active users and monthly active users. We can use this metric as a starting point, but we’ll need to refine it.
First, should we measure daily active users or monthly active users? The answer is daily active users. Velocity—one of our three Vs—is much easier to see at the daily level. Additionally, the monthly metric does not distinguish between a user who spends hours on Twitter/X every day and one who logs in once a week or once a month; both count as one monthly active user.
Second, we need to tweak what we measure: daily active content providers, not daily active users. If only ten users are producing content but millions of users are viewing that content, content moderation is fairly simple. Both Netflix and Facebook have millions of users, but only Facebook has millions of content providers.
3. Legislative Text
So how do we translate that to legislative text? Fortunately, we have a couple of definitions we can reuse here: the definition of “information content provider” from Section 230, and the definition of “user” from the Kids Online Safety Act (KOSA).
SOCIAL MEDIA PLATFORM.—The term “social media platform” means an interactive computer service that has averaged at least 1,000,000 daily active content providers over the previous 180 days.
DAILY ACTIVE CONTENT PROVIDERS.—The term “daily active content providers” means the number of users who serve as an information content provider during a single day.
USER.—The term “user” means, with respect to a social media platform, an individual who registers an account or creates a profile on the social media platform.
INFORMATION CONTENT PROVIDER.—The term “information content provider” has the meaning given the term in section 230(f)(3) of the Communications Act of 1934 (47 U.S.C. 230(f)(3)).
For reference, here is Section 230’s definition of “information content provider”:
INFORMATION CONTENT PROVIDER.—The term “information content provider” means any person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service.
Setting the threshold for daily active content providers is more of an art than an exact science; 100,000 daily active content providers would also be a reasonable threshold.2
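To make the metric concrete, here is a minimal sketch of how a platform might audit itself against the threshold, assuming it keeps a per-day count of information content providers (the class and method names are illustrative, not statutory):

```java
import java.util.List;

// Hypothetical self-audit against the "daily active content providers"
// threshold: average the per-day counts over the previous 180 days.
public class DacpThreshold {
    static final double THRESHOLD = 1_000_000;
    static final int WINDOW_DAYS = 180;

    // dailyCounts holds one entry per calendar day, oldest first; each entry
    // is the number of users who served as an information content provider
    // that day.
    static boolean meetsThreshold(List<Long> dailyCounts) {
        int from = Math.max(0, dailyCounts.size() - WINDOW_DAYS);
        double average = dailyCounts.subList(from, dailyCounts.size())
                .stream()
                .mapToLong(Long::longValue)
                .average()
                .orElse(0.0);
        return average >= THRESHOLD;
    }
}
```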
C. Scale is Necessary But Not Sufficient
Setting the threshold at 1,000,000 (or even 100,000) daily active content providers will filter out most of those one billion websites, but it’s not a silver bullet. We can still find many negative examples that are included in the definition. Going forward, we will pessimistically assume that scale is necessary but not sufficient.
1. Primary vs. Secondary Content
What about the comments section of the New York Times? The number of users who comment on New York Times articles is much larger than the number of writers who create those articles. And while the New York Times may not have 100,000 or 1,000,000 users commenting each day, perhaps a different site could hit that threshold—which is why we pessimistically assume that scale is necessary but not sufficient. This example will justify that pessimism: what about product reviews on Amazon?
A common mistake here is to create a narrow exception for product review sites. Courts will often rule that these narrow exceptions are content-based when, to quote an Ohio judge as an example, “a product review website is excepted, but a book or film review website, is presumably not.”
Instead, we can distinguish between “primary” content and “secondary” content. The New York Times article or the product on Amazon would be the primary content, while a comment on the article or a product review would be the secondary content. Secondary content depends on primary content; you cannot add a product review to Amazon if you don’t first have a product to review.
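In data-model terms, the dependency is easy to picture. Here is a minimal sketch with illustrative names, where secondary content always points at the primary content it depends on:

```java
// Illustrative data model: secondary content carries a reference to its
// parent; primary content stands alone.
record Content(long id, long authorId, String body, Long parentId) {
    boolean isPrimary()   { return parentId == null; } // article, product, post
    boolean isSecondary() { return parentId != null; } // comment, review, reply
}
```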
With that in mind, let’s refine the definition of “information content provider”:
INFORMATION CONTENT PROVIDER.—
(A) IN GENERAL.—The term “information content provider” has the meaning given the term in section 230(f)(3) of the Communications Act of 1934 (47 U.S.C. 230(f)(3)).
(B) SECONDARY CONTENT EXCLUDED.—The term “information content provider,” with respect to an interactive computer service, does not apply to content that depends on other content on the interactive computer service, such as comments on an article, reviews for a product, or replies to a post.
This exception is written in a content-neutral fashion; product reviews are treated no differently than book reviews or film reviews.3 (A key point here is that the “such as” clause only provides illustrative examples, not an exhaustive list of examples.4)
Additionally, we can further justify this exception with one of the three Vs: variety. When user-generated content only consists of comments for a small set of articles or reviews for products, the variety of content found on a site is much smaller.
2. Commercial Content
But even if we exclude Amazon reviews, how many people are creating or editing product listings on Amazon each day? How many users create auctions on eBay each day? Again, we pessimistically assume that scale is necessary but not sufficient.
While sites like Amazon and eBay do rely on third-party content, the design of these sites effectively ensures that this content is commercial in nature. We can capture that idea in another exception for “information content provider”:
INFORMATION CONTENT PROVIDER.—
(A) IN GENERAL.—The term “information content provider” has the meaning given the term in section 230(f)(3) of the Communications Act of 1934 (47 U.S.C. 230(f)(3)).
(B) SECONDARY CONTENT EXCLUDED.—The term “information content provider,” with respect to an interactive computer service, does not apply to content that depends on other content on the interactive computer service, such as comments on an article, reviews for a product, or replies to a post.
(C) COMMERCIAL CONTENT EXCLUDED.—The term “information content provider,” with respect to an interactive computer service, does not apply to content that is designed by the interactive computer service to facilitate commerce, such as product listings, available drivers for ridesharing, or booking information for accommodations.
This exception is content-neutral because it makes a medium-based distinction; social media and e-commerce are two fundamentally different mediums.5
For example, if we have a high-volume third-party seller that’s engaging in fraud, verifying the age of the seller would do little to stop that fraud. The regulations created by the INFORM Consumers Act are the answer here. Conversely, it would be equally nonsensical to apply INFORM to social media, forcing social media platforms to collect the bank account information, contact information, and tax ID numbers of their users.
3. Directed to a General Audience
Another interesting negative example is LinkedIn. Despite the similarities between LinkedIn and social media, LinkedIn is still intuitively different in form and purpose from social media sites. We can reasonably assume that most parents would not be too concerned if they discovered that their kid secretly created a LinkedIn account.
As before, we need to resist the temptation to exclude LinkedIn by creating a narrow exception for, e.g., professional networking sites. LinkedIn may not be the only specialized site we need to worry about; we need an exception that treats all these specialized sites the same:
SOCIAL MEDIA PLATFORM.—The term “social media platform” means an interactive computer service that—
(A) is directed to a general audience, notwithstanding whether content is delivered via text, images, audio, video, or other types of media content; and
(B) has averaged at least 1,000,000 daily active content providers over the previous 180 days.
In terms of content neutrality, this exception is similar to our exception for secondary content. First, it treats all specialized sites the same; it doesn’t privilege certain types of specialized sites.6 Second, it can be justified by one of our three Vs: variety. It suffices to say that the variety of content found on Instagram or Snapchat is vastly bigger than the variety of content found on LinkedIn.
D. The Distribution Model Matters
What about a crowdsourced encyclopedia like Wikipedia? What about email or text messaging? Applying our pessimistic assumption that scale is necessary but not sufficient, the current iteration of our definition still includes these sites.
1. A Variety of Distribution Models
Previously, we focused on how content is created. What if we focus on how content is distributed? It suffices to say that social media’s content distribution model is very different from the distribution model of Wikipedia or text messaging.
Distinguishing between sites based on their distribution model tends to be content-neutral as well. It does not discriminate based on what type of content a site contains, and it enters the territory of a medium-based distinction. Different mediums will have different distribution models; each model presents its own unique set of problems.
Thus, the third plank of our definition will focus on the distribution model:
SOCIAL MEDIA PLATFORM.—The term “social media platform” means an interactive computer service that—
(A) is directed to a general audience, notwithstanding whether content is delivered via text, images, audio, video, or other types of media content;
(B) has averaged at least 1,000,000 daily active content providers over the previous 180 days; and
(C) socially distributes content.
SOCIALLY DISTRIBUTES CONTENT.—The term “socially distributes content” means…
The challenge here lies in defining “socially distributes content.”
2. The Social Network
What is the social component of socially distributing content? As a starting point, Facebook uses your social network to distribute content to you; Wikipedia does not. Let’s incorporate the concept of a social network into our definition:
SOCIALLY DISTRIBUTES CONTENT.—The term “socially distributes content” means making decisions about which content to distribute to a user, where such decisions use the user’s social relations with other users, such as other users that the user follows or is friends with.
This definition focuses on decision-making: is this data used to make a decision? Using your social network for other purposes such as research is not covered by that definition; the social network must be used in decisions about content distribution. (Of course, social media sites may take other factors into account when making these decisions, so we only require that the social network be one of the factors used.)
This definition is also crystal-clear; it doesn’t require the courts to make judgments about what the “primary purpose” or “primary function” of a site is.
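To illustrate what “making decisions” means in practice, here is a hypothetical ranking sketch (all names are illustrative, not drawn from any real platform). The decision qualifies because the viewer’s social relations are one of the factors used:

```java
// Hypothetical sketch of a distribution decision that "uses the user's
// social relations with other users."
interface SocialGraph {
    boolean follows(long viewerId, long authorId);
    boolean isFriend(long viewerId, long authorId);
}

record Post(long authorId, long ageInMinutes) {}

class FeedRanker {
    // Scores one candidate post for one viewer. Recency is a non-social
    // factor; the follow/friend checks make the decision "social."
    double score(long viewerId, Post post, SocialGraph graph) {
        double score = 100.0 - post.ageInMinutes() / 60.0;
        if (graph.follows(viewerId, post.authorId()))  score += 10.0;
        if (graph.isFriend(viewerId, post.authorId())) score += 20.0;
        return score;
    }
}
```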
3. Who Controls Content Distribution?
Social media, however, is not the only medium that uses a social network. If one user subscribes to another user’s newsletter, that’s a social relation that’s used in content distribution; newsletters are distributed to subscribers. If many users are all part of a group chat, that’s a social relation that’s used in content distribution. (Recall that the “such as” clause only provides illustrative examples, not an exhaustive list of examples.)
The next aspect we can look at is who controls distribution. With many of the online mediums that predate social media, consumers and producers (mostly) controlled content distribution. If you send an email, you control who the recipients of that email are. If you don’t like a particular Substack newsletter, you can just unsubscribe.
Of course, this direct distribution model does have one problem: spam. But if spam were the worst of our problems on social media, states wouldn’t be trying to pass age verification laws, and Congress wouldn’t be trying to pass the Kids Online Safety Act.
Search engines typically put users in control, too. While you don’t control the results you receive, you do control the search query that you type into google.com. Moreover, search engines have an incentive to find results that are relevant to your search query.
With that in mind, we can add some exclusions to the definition:
SOCIALLY DISTRIBUTES CONTENT.—
(A) IN GENERAL.—The term “socially distributes content” means making decisions about which content to distribute to a user, where such decisions use the user’s social relations with other users, such as other users that the user follows or is friends with.
(B) DIRECT DISTRIBUTION EXCLUDED.—The term “socially distributes content” does not include, notwithstanding spam filtering, distributing the content of a user directly to recipients or subscribers, such as the recipients of an email, the members of a group chat, or the subscribers of a newsletter.
(C) SEARCH EXCLUDED.—The term “socially distributes content” does not include providing content to a user when the user deliberately and independently searches for, or specifically requests, content.
Here, the language for the search exception is based on language found in KOSA, which has a similar exception for its duty of care.
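Putting the general rule and the exclusions together, the logic resembles a simple classifier. This is a minimal sketch with illustrative names, not statutory language:

```java
// Hypothetical classifier for the exclusions in (B) and (C): first determine
// how the content reached the user, then decide whether the definition applies.
enum DeliveryPath { DIRECT, SEARCH, SOCIAL_FEED }

class DistributionClassifier {
    boolean countsAsSocialDistribution(DeliveryPath path) {
        return switch (path) {
            // (B): email recipients, group-chat members, newsletter
            // subscribers are excluded, notwithstanding spam filtering.
            case DIRECT -> false;
            // (C): content the user deliberately and independently searched
            // for, or specifically requested, is excluded.
            case SEARCH -> false;
            // Everything else falls back to the general definition in (A).
            case SOCIAL_FEED -> true;
        };
    }
}
```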
4. Engagement Data
At this point, the definition looks like it may finally be sufficient. It’s certainly hard to think of a negative example that is included in the definition.
Nonetheless, it may still be beneficial to harden this definition a bit more. For all the talk of social media addiction and kids spending hours each day on social media, the definition does not yet capture something closely tied to that problem.
In addition to looking at how social networks are used in content distribution, we can also look at how a user’s engagement with content is used in content distribution:
SOCIALLY DISTRIBUTES CONTENT.—
(A) IN GENERAL.—The term “socially distributes content” means making decisions about which content to distribute to a user, where such decisions use—
(i) the user’s social relations with other users, such as other users that the user follows or is friends with; and
(ii) the user’s engagement with, or interest in, content from other users, such as viewing, liking, reposting, or replying to content.
Instead of targeting a specific feature or a specific type of algorithm that is used to drive engagement, we target the fuel that powers these features and algorithms: engagement data.7 This approach offers a good blend of specificity and flexibility.
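Extending the earlier hypothetical ranking sketch shows how engagement data becomes a second required input to the distribution decision (the types are redeclared so this sketch stands alone; the affinity signal is an assumption, not any platform’s actual formula):

```java
// Hypothetical sketch of the revised definition: the decision must use both
// the social graph, clause (i), and engagement data, clause (ii).
interface Graph {
    boolean follows(long viewerId, long authorId);
}

interface EngagementHistory {
    // Affinity derived from past views, likes, reposts, and replies.
    double affinity(long viewerId, long authorId);
}

record Item(long authorId, long ageInMinutes) {}

class EngagementAwareRanker {
    double score(long viewerId, Item item, Graph graph, EngagementHistory history) {
        double score = 100.0 - item.ageInMinutes() / 60.0;           // non-social factor
        if (graph.follows(viewerId, item.authorId())) score += 10.0; // clause (i)
        score += 5.0 * history.affinity(viewerId, item.authorId());  // clause (ii)
        return score;
    }
}
```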
All three Vs are present. Volume comes from distributing content to millions of users. Variety comes from a distribution model that is hyper-personalized in nature. Velocity comes from the algorithms that constantly process new data about the user’s activities.
With millions of users, a tech company has limited bandwidth to address problems that affect a single user. With a hyper-personalized distribution model, though, many of the problems are also hyper-personalized in nature.
E. Liability Applies at the Top of the Stack
Even for a single site, many companies play a role in running it. Beneath the surface of a social media site, there’s a complex technical stack of infrastructure—with different companies handling different parts.
To store billions of posts, a social media site may rely on a cloud storage provider like Amazon Web Services. To ensure that hackers cannot take down that site with a DDoS (distributed denial of service) attack, the site may rely on DDoS protection from Cloudflare. And of course, an ISP like Comcast delivers that content to users.
In many cases, liability should only be applied at the “top of the stack.” Facebook should be held accountable for its actions, but liability should not apply to ISPs or the companies that provide technical infrastructure for social media sites. In addition to defining a “social media platform,” let’s also define a “social media company”:
SOCIAL MEDIA COMPANY.—
(A) IN GENERAL.—The term “social media company” means a person or entity that provides a social media platform.
(B) TECHNICAL INFRASTRUCTURE EXCLUDED.—The term “social media company” does not include a person or entity acting in its capacity as a provider of—
(i) a common carrier service subject to the Communications Act of 1934 (47 U.S.C. 151 et seq.) and all Acts amendatory thereof and supplementary thereto;
(ii) a broadband internet access service (as such term is defined for purposes of section 8.1(b) of title 47, Code of Federal Regulations, or any successor regulation); or
(iii) an interactive computer service that is used by a social media platform for the management, control, or operation of that social media platform, including for services such as web hosting, domain registration, content delivery networks, caching, security, back-end data storage, and cloud management.
The first two exceptions were copied from KOSA’s definition of “covered platform.” The third exception was largely copied from the Internet PACT Act; it’s not every day that we find a bill that mentions content delivery networks and caching.
And again, this exception is content-neutral. This is not a content-based question about what type of content the social media platform contains; it is a content-neutral question about where regulation occurs.
III. The Full Definition
Now that we have all the pieces of our definition, let’s look at the finished product, including the findings from part I; the definitions flow logically from the findings.
SEC. _. FINDINGS
The Legislature finds the following:
(1) The State has a compelling interest in protecting the physical and psychological well-being of minors.
(2) The Internet is not a monolithic medium but instead contains many distinct mediums, such as social media, search, and e-commerce.
(3) Existing measures to protect minors on social media have been insufficient for reasons including—
(A) the difficulty of content moderation at the scale of a platform with millions of providers of user-generated content;
(B) the difficulty of making subjective judgments via algorithms, such as identifying content that harms the physical or psychological well-being of minors; and
(C) limited interoperability between social media platforms and third-party child safety tools, in part due to privacy concerns about sharing user data with third parties.
(4) Social media companies have failed to control the negative impacts of the algorithms they use to distribute content for reasons including—
(A) the scale of a platform with millions of users, combined with the personalized nature of content distribution;
(B) the natural incentive of such companies to maximize engagement and time spent on their platforms; and
(C) the limited degree of control that users have over the content they receive.
(5) Limited accountability exists on social media platforms for bad actors, especially given the anonymous or hard-to-track nature of many such actors.
(6) Users frequently encounter sexually explicit material accidentally on social media.
(7) Social media platforms are accessible—
(A) from a wide variety of devices, ranging from an individual’s smartphone to a laptop at a friend’s house to a desktop in a public library; and
(B) via a variety of methods on a single device, including apps and websites.
SEC. _. DEFINITIONS
In this Act:
(1) DAILY ACTIVE CONTENT PROVIDERS.—The term “daily active content providers” means the number of users who serve as an information content provider during a single day.
(2) INFORMATION CONTENT PROVIDER.—
(A) IN GENERAL.—The term “information content provider” has the meaning given the term in section 230(f)(3) of the Communications Act of 1934 (47 U.S.C. 230(f)(3)).
(B) SECONDARY CONTENT EXCLUDED.—The term “information content provider,” with respect to an interactive computer service, does not apply to content that depends on other content on the interactive computer service, such as comments on an article, reviews for a product, or replies to a post.
(C) COMMERCIAL CONTENT EXCLUDED.—The term “information content provider,” with respect to an interactive computer service, does not apply to content that is designed by the interactive computer service to facilitate commerce, such as product listings, available drivers for ridesharing, or booking information for accommodations.
(3) INTERACTIVE COMPUTER SERVICE.—The term “interactive computer service” has the meaning given the term in section 230(f)(2) of the Communications Act of 1934 (47 U.S.C. 230(f)(2)).
(4) SOCIAL MEDIA COMPANY.—
(A) IN GENERAL.—The term “social media company” means a person or entity that provides a social media platform.
(B) TECHNICAL INFRASTRUCTURE EXCLUDED.—The term “social media company” does not include a person or entity acting in its capacity as a provider of—
(i) a common carrier service subject to the Communications Act of 1934 (47 U.S.C. 151 et seq.) and all Acts amendatory thereof and supplementary thereto;
(ii) a broadband internet access service (as such term is defined for purposes of section 8.1(b) of title 47, Code of Federal Regulations, or any successor regulation); or
(iii) an interactive computer service that is used by a social media platform for the management, control, or operation of that social media platform, including for services such as web hosting, domain registration, content delivery networks, caching, security, back-end data storage, and cloud management.
(5) SOCIAL MEDIA PLATFORM.—The term “social media platform” means an interactive computer service that—
(A) is directed to a general audience, notwithstanding whether content is delivered via text, images, audio, video, or other types of media content;
(B) has averaged at least 1,000,000 daily active content providers over the previous 180 days; and
(C) socially distributes content.
(6) SOCIALLY DISTRIBUTES CONTENT.—
(A) IN GENERAL.—The term “socially distributes content” means making decisions about which content to distribute to a user, where such decisions use—
(i) the user’s social relations with other users, such as other users that the user follows or is friends with; and
(ii) the user’s engagement with, or interest in, content from other users, such as viewing, liking, reposting, or replying to content.
(B) DIRECT DISTRIBUTION EXCLUDED.—The term “socially distributes content” does not include, notwithstanding spam filtering, distributing the content of a user directly to recipients or subscribers, such as the recipients of an email, the members of a group chat, or the subscribers of a newsletter.
(C) SEARCH EXCLUDED.—The term “socially distributes content” does not include providing content to a user when the user deliberately and independently searches for, or specifically requests, content.
(7) USER.—The term “user” means, with respect to a social media platform, an individual who registers an account or creates a profile on the social media platform.
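For the engineers in the audience, the finished definition composes like a conjunction of predicates, each necessary but not sufficient on its own. Here is a minimal sketch, with an illustrative Site interface that is not part of the legislative text:

```java
import java.util.function.Predicate;

// Hypothetical summary of the definition as a chain of predicates.
interface Site {
    boolean isDirectedToGeneralAudience();                 // paragraph (5)(A)
    double avgDailyActiveContentProviders(int windowDays); // paragraph (5)(B)
    boolean sociallyDistributesContent();                  // paragraph (5)(C)
}

class SocialMediaPlatformTest {
    static final Predicate<Site> IS_SOCIAL_MEDIA_PLATFORM =
            ((Predicate<Site>) Site::isDirectedToGeneralAudience)
            .and(s -> s.avgDailyActiveContentProviders(180) >= 1_000_000)
            .and(Site::sociallyDistributesContent);
}
```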
Of course, some legal details may need to be ironed out, but Congress and most state legislatures have a service known as Legislative Counsel, which handles the legal details of drafting legislation.
In fact, Legislative Counsel can work with things that are far less structured than draft legislative text. Part of their job is to take legislative proposals written in plain English and translate them to legislative text; this helps support a legislature whose representatives are teachers, doctors, and engineers—not just lawyers.
This practice of reusing legislative text is similar to how software engineers often reuse built-in or third-party libraries, instead of reinventing the wheel. If asked to sort a list of numbers, a Java programmer will not write their own sorting algorithm. They’ll use the Java API to sort those numbers with a single line of code: Collections.sort(numbers).
Lest anyone allege that the threshold for daily active content providers was set at 1,000,000 for nefarious content-based reasons, here’s the actual methodology: I went through the powers of 10 (1, 10, 100, 1,000, etc.) until I hit a number that was large enough: 1,000,000.
An interesting wrinkle is that we also exclude users who reply to other user-generated posts but do not create their own posts. The only “harm” here is that the definition of “daily active content providers” may slightly undercount the actual number of such providers.
The canon of construction that would apply here is the presumption of nonexclusive “include.”
Facebook is an interesting case, since it also has the Facebook Marketplace. Applying the definition here, users who only create content on the Facebook Marketplace do not count as an “information content provider”—and thus are not included in the count of “daily active content providers.” Users who create normal Facebook posts (or who create both types of content) do count as an “information content provider.”
The “notwithstanding whether content is delivered” clause clarifies that both Instagram and Twitter/X are directed to a general audience. Instagram cannot argue that it’s not directed to a general audience because it primarily relies on images, while Twitter has a broader audience since it uses text, images, and videos.
However, if the user is reporting content, blocking someone, or clicking a button that tells the platform they don’t want to see this content, that would not count as “engagement or interest.” If a site personalizes content distribution based on those signals, it wouldn’t qualify as a social media site. (And of course, this has no effect on content moderation measures that are not personalized, such as taking a post down if it violates the rules.)