Connecticut 2025 Regular Session

Connecticut House Bill HB05474 Comm Sub / Analysis

Filed 03/24/2025

                     
OLR Bill Analysis 
HB 5474  
 
AN ACT CONCERNING SOCIAL MEDIA PLATFORMS AND MINORS.  
 
SUMMARY 
This bill adds protections for minors using social media platforms by (1) requiring platform owners, by January 1, 2026, to incorporate an online safety center and establish a cyberbullying policy for handling cyberbullying reports on the platform and (2) expanding the Connecticut Data Privacy Act to include additional safeguards (e.g., avoiding harm to a minor’s physical or mental health).
The bill requires controllers (entities that determine the purpose and means of processing personal data) with consumers under age 18 (minor consumers) to (1) use reasonable care to avoid causing any harm to the minor’s physical or mental health and (2) conduct a data protection assessment of any online service, product, or feature to address these harms and mitigate or eliminate the risk.
It requires any online service, product, or feature that includes direct messaging for minors to have a default setting that prevents adults from sending unsolicited communications to minors with whom they are not connected.
EFFECTIVE DATE: October 1, 2025 
§ 1 — SOCIAL MEDIA PLATFORM OWNER REQUIREMENTS
Online Safety Center  
The bill requires each social media platform owner, starting January 
1, 2026, to incorporate an online safety center into the platform. Each 
online safety center must at least give consumers who use the platform: 
1. resources for (a) preventing cyberbullying on the platform and (b) helping each consumer identify ways to obtain mental health services, including a website address or telephone number for services that treat anxiety disorders or provide suicide prevention;
2. an explanation of the platform’s mechanism for reporting 
harmful or unwanted behavior, including cyberbullying on the 
platform; and 
3. educational information about the impact that social media 
platforms have on users’ mental health. 
The bill applies to consumers who live in the state and use the social 
media platform. 
Under existing law and the bill, a “social media platform” is a public or semi-
public internet service or application that:  
1. is used by a Connecticut consumer;  
2. is primarily intended to connect and allow users to socially 
interact within the service or application; and  
3. enables a user to (a) construct a public or semi-public profile for 
signing into and using the service or application; (b) populate a 
public list of other users with whom the user shares a social 
connection within the service or application; and (c) create or post 
content seen by other users, including on message boards, in chat 
rooms, or through a landing page or main feed that also provides 
the user with content from other users. 
A social media platform does not include a public or semi-public internet service or application that:
1. exclusively provides e-mail or direct messaging;  
2. primarily consists of news, sports, entertainment, interactive 
video games, electronic commerce, or content preselected by the 
provider or for which any chat, comments, or interactive 
functionality is incidental to, directly related to, or dependent on 
providing the content; or
3. is used by and under an educational entity’s direction, including 
a learning management system or a student engagement 
program. 
Cyberbullying Policy 
The bill requires each social media platform owner, by January 1, 2026, to establish a cyberbullying policy that includes a process for handling reports of cyberbullying on the platform. Under
the bill, cyberbullying is any unwanted and aggressive behavior on a 
social media platform. 
§§ 2-4 — PROTECTIONS FOR MINORS USING SOCIAL MEDIA
The bill expands the Connecticut Data Privacy Act to (1) include 
additional factors for “heightened risk of harm,” (2) require certain 
default settings for direct messaging, and (3) explicitly prohibit design 
features that significantly increase usage. It also allows the attorney 
general to require controllers to disclose certain mitigation plans to him 
and creates an exception for educational entities for certain prohibited 
online features. 
As under existing law, these provisions apply to controllers that offer online services, products, or features to consumers whom they actually know, or willfully disregard knowing, are minors.
By law, an “online service, product, or feature” is any service, 
product, or feature provided online, but not any (1) telecommunications 
service, (2) broadband Internet access service, or (3) delivery or use of a 
physical product. 
Heightened Risk of Harm to Minors (§§ 2 & 4) 
Existing law requires a controller with minor consumers to use 
reasonable care to avoid causing any heightened risk of harm to minors 
in processing personal data. The bill broadens what constitutes a 
“heightened risk of harm to minors” to include any reasonably 
foreseeable risk of harm to the minor’s physical or mental health.  
As a result, the bill also requires controllers to perform additional 
data protection assessments for this new risk factor and make and
implement a plan to mitigate or eliminate the risk. Existing law requires 
each controller with minor consumers to (1) perform a data protection 
assessment of its online service, product, or feature to address any 
heightened risk of harm to minors that is a reasonably foreseeable result 
of offering the online service, product, or feature to minors and (2) make 
and implement a plan to mitigate or eliminate the risk. 
The bill allows the attorney general to require a controller to disclose the plan to him if it is relevant to his investigation.
Unsolicited Communications to Minors (§ 3) 
The bill prohibits controllers from offering any online service, 
product, or feature that includes direct messaging to minors unless it 
includes a default setting that prevents adults from sending unsolicited communications to minors with whom they are not connected. Under current
law, they only have to provide readily accessible and easy-to-use 
safeguards to limit the ability of adults to send unsolicited 
communications to minors with whom they are not connected. 
Features Designed to Increase Use (§ 3)  
Current law prohibits a controller from using any system design 
feature to significantly increase, sustain, or extend the use of an online 
service, product, or feature, without first getting the minor’s consent or, 
if the minor is younger than age 13, the minor’s parent or legal 
guardian’s consent. The bill prohibits this type of feature outright by removing the ability to consent to it.
Educational Exception (§ 3) 
The bill creates an exception to certain prohibitions on online services, products, or features for any service or application used by and under the direction of an educational entity, including a learning management system or a student engagement program. These prohibited actions include the direct messaging and increased usage provisions described above and providing any consent mechanism designed to substantially subvert or impair, or manipulated with the effect of substantially subverting or impairing, user autonomy, decision-making, or choice.
BACKGROUND 
Related Bill 
sHB 6857, favorably reported by the General Law Committee, among other requirements for platforms, requires a platform’s default setting to allow only users connected to the minor to view or respond to content the minor posts.
COMMITTEE ACTION 
Committee on Children 
Joint Favorable 
Yea 15 Nay 2 (03/06/2025)