February 11, 2025
Social Media: Regulatory, Legal, and Policy Considerations for the 119th Congress
Social media platforms enable users to create and share content and interact with other users’ content. A diverse set of platforms disseminate information to billions of people, and are used by most American adults. Social media operators may moderate the content on their site, promoting some posts and disallowing others. Some of their business decisions may be governed by existing federal laws, but social media platforms are not comprehensively regulated in the United States. Some lawmakers have expressed concerns about issues related to social media use, including the spread and promotion of content believed to be harmful; platforms’ restriction of lawful speech; and the lack of privacy protections. Members of the 119th Congress have already introduced bills to address some of these topics, including a bill restricting kids’ use of social media. Meanwhile, state-level regulation has faced legal challenges, as the Free Speech Clause of the U.S. Constitution’s First Amendment imposes some limits on certain regulations of social media platforms.

Existing Federal Regulation of Social Media Platforms and Certain Content

Distribution of Sexually Explicit Material
By statute, Congress has prohibited the knowing distribution of certain material in interstate or foreign commerce, including over the internet. Federal law has long criminalized the distribution of “obscene” material, a subset of pornographic content. Because sexual expression is generally protected under the First Amendment, the Supreme Court has adopted a definition of obscenity that exempts material with serious literary, artistic, political, or scientific value. Federal law also prohibits accessing or distributing child sexual abuse material (CSAM), referred to in statute as “child pornography.” Material that qualifies as obscenity or child pornography is considered “unprotected speech,” meaning the government can prohibit it, subject to certain First Amendment limits. In 2002, the Supreme Court invalidated on free speech grounds an amendment to the CSAM statute prohibiting material that “appears to” depict a minor engaged in sexual conduct, because it would have prohibited even non-obscene movies with adult actors. The case may have implications for images generated or altered with artificial intelligence.

A 2022 federal law authorizes individuals whose intimate images were disclosed without their consent to sue the disclosing party in federal court. Many cases involving these claims are in the early stages, with no reported rulings on free speech defenses as of the date of this writing. Some courts have rejected First Amendment challenges to similar state laws. Those courts ruled that while the laws restricted protected expression, they served compelling government interests without burdening too much protected speech.

Data Protection
Congress has enacted statutes that regulate data collected by certain industries or data that fall within certain categories. For example, the Gramm-Leach-Bliley Act imposes data protection obligations on financial institutions, and the Children’s Online Privacy Protection Act regulates the online collection and use of information about children younger than 13. In addition, the Federal Trade Commission sometimes brings enforcement actions alleging that companies’ data protection practices constitute “unfair or deceptive acts or practices.” Congress has not enacted a comprehensive data protection law.

Legal Protections for Hosting or Restricting Speech
The First Amendment protects the right to create, circulate, or receive content online by constraining the government’s ability to regulate this activity. The Supreme Court has also recognized a right of editorial control when private platforms choose whether or how to publish others’ speech. In addition, courts have interpreted Section 230 of the Communications Act of 1934 to bar liability for publishing, promoting, restricting, and sometimes even editing third-party content. Section 230 does not bar liability if a social media platform helps develop content, and it contains exceptions allowing certain types of lawsuits.

State Regulation of Social Media
Some states have adopted laws regulating social media platforms and online content. As discussed below, courts have enjoined (i.e., barred) enforcement of some of these laws while legal challenges to them are litigated.

Some laws have attempted to address the content hosted online. For instance, the California Age-Appropriate Design Code Act (CAADCA) requires covered sites to assess and mitigate the risk their product will expose children to harmful content. Florida and Texas have enacted laws restricting online platforms’ ability to moderate user content. Texas’s law, for example, prohibits covered platforms from censoring users based on viewpoint.

Other state laws have focused not on specific content moderation decisions but on broader questions of who can access websites and how content is delivered to users. Many of these laws are aimed at protecting children. Some states have adopted laws requiring social media sites to verify a user’s age and obtain parental consent. Other state laws require age verification only for sites with a certain amount of sexually explicit content, or limit the use of features that may be addictive or otherwise harmful.
Some states have enacted data privacy laws that apply broadly to the online collection or processing of personal data. These laws often create individual rights to limit how companies use personal data, such as a right to opt out of the use of personal data for targeted advertising.

Considerations for Congress
Past policy discussions have centered on whether and how to regulate social media platforms and the user-generated content they host and distribute. Bills in the 118th Congress would have amended Section 230, regulated platforms’ content moderation procedures, created transparency requirements, and supported third-party research of social media platforms. For example, the Kids Online Safety Act—versions of which were passed by the Senate as part of the Kids Online Safety and Privacy Act in July 2024 (S. 2073) and ordered to be reported to the House in September 2024 (H.R. 7891)—would have imposed a “duty of care” and other regulations on certain online platforms reasonably likely to be used by minors.

First Amendment Litigation
Courts have enjoined some state laws on First Amendment grounds, preventing them from going into effect. The Supreme Court weighed in on the Florida and Texas content moderation laws in Moody v. NetChoice, LLC, 144 S. Ct. 2383 (2024), holding that some applications of the laws affect platforms’ protected rights to make editorial decisions about the content they display. The Court opined that when Facebook and YouTube decide which third-party content to display and how to organize that content, they are making constitutionally protected expressive choices. Other laws limiting platforms’ ability to host or exclude speech could infringe this right of editorial control.

Apart from editorial control concerns, courts may apply heightened constitutional scrutiny to laws that target specific types of online content. This heightened scrutiny makes it more difficult for the government to establish that a challenged law is constitutional. Specifically, courts usually consider a content-based law—one that applies to speech based on its subject matter, topic, or viewpoint—to be presumptively unconstitutional. As mentioned, however, the government generally can prohibit so-called “unprotected” categories of speech such as obscenity. In January 2025, the Supreme Court heard arguments in Free Speech Coalition v. Paxton, a case involving a Texas age-verification requirement for certain websites. Because the law is aimed at protecting minors from sexually explicit content, a lower court held that it is not subject to heightened scrutiny and is constitutional. The parties challenging that ruling argue that the law unconstitutionally burdens adults’ right to access non-obscene sexual expression online.

Disclosure requirements may be subject to a different constitutional analysis. Federal appeals courts largely upheld disclosure provisions in Texas’s and Florida’s laws after evaluating them under a lower level of constitutional scrutiny that applies to commercial speech. In contrast, a different federal appeals court concluded California’s CAADCA violated the First Amendment by requiring covered businesses to report on the risk that their services expose children to harmful content. The court held this requirement reached beyond commercial speech.

Laws regulating content moderation procedures without focusing on the subject matter or ideas in that content might trigger a lower standard of constitutional review. Laws that are content neutral—that do not turn on a particular topic or viewpoint—are usually subject to a less demanding First Amendment test that is easier for the government to satisfy.

Policy Considerations
In addition to constitutional considerations, policy considerations for Congress may include (1) addressing concerns regarding social media platforms and content, such as the spread of harmful content and misinformation, and data privacy; (2) ensuring a viable consumer-focused tech sector driven by innovation and competitiveness; and (3) addressing the question of federal regulatory authority over social media platforms.

Congress may weigh a range of options to address these concerns. For example, Congress may continue to support the current mix of federal and state regulation and industry self-regulation. Congress may also exercise oversight of existing regulatory frameworks, conducting investigations and hearings on industry practices and agency enforcement. Congress might incentivize social media companies to establish voluntary or collaborative rules and standards in response to pressure from stakeholders, the public, or potential litigation. Congress may assess court opinions in litigation related to social media and determine whether to provide legislative solutions. Lastly, Congress may enact legislation that would provide specific regulatory authority to federal agencies. If Congress chooses to legislate, considerations may include the following:

• Covered Entities. Whether to cover entities operating large social media platforms (e.g., those with a certain number of active users or specific revenue thresholds), some other subset of platforms, all social media platforms, or all online platforms.

• Content Moderation. Whether to prohibit content moderation, require moderation of defined harmful content, or provide flexibility regarding the choice of moderated content. Congress might consider whether to amend Section 230, for example, by reforming liability protections for social media platforms’ content moderation practices. Congress might consider imposing transparency and accountability requirements, such as disclosing social media algorithms and content moderation practices. Congress might also address users’ rights regarding what content they see.

• Enforcement. Whether an existing agency (e.g., the Federal Trade Commission or Federal Communications Commission) or a new agency would enforce new requirements established in law. Congress might also consider whether to include a private right of action allowing lawsuits for violations of the law.

Peter J. Benson, Legislative Attorney
Valerie C. Brannon, Legislative Attorney
Victoria L. Killion, Legislative Attorney
Ling Zhu, Analyst in Telecommunications Policy
Disclaimer
This document was prepared by the Congressional Research Service (CRS). CRS serves as nonpartisan shared staff to
congressional committees and Members of Congress. It operates solely at the behest of and under the direction of Congress.
Information in a CRS Report should not be relied upon for purposes other than public understanding of information that has
been provided by CRS to Members of Congress in connection with CRS’s institutional role. CRS Reports, as a work of the
United States Government, are not subject to copyright protection in the United States. Any CRS Report may be
reproduced and distributed in its entirety without permission from CRS. However, as a CRS Report may include
copyrighted images or material from a third party, you may need to obtain the permission of the copyright holder if you
wish to copy or otherwise use copyrighted material.
https://crsreports.congress.gov | IF12904 · VERSION 1 · NEW