Massachusetts 2025-2026 Regular Session

Massachusetts House Bill H2236 Latest Draft

Bill / Introduced Version Filed 02/27/2025

HOUSE DOCKET, NO. 2051       FILED ON: 1/15/2025
HOUSE . . . . . . . . . . . . . . . No. 2236
The Commonwealth of Massachusetts
_________________
PRESENTED BY:
Joshua Tarsky
_________________
To the Honorable Senate and House of Representatives of the Commonwealth of Massachusetts in General
Court assembled:
The undersigned legislators and/or citizens respectfully petition for the adoption of the accompanying 
resolve:
Resolve relative to children’s mental health in social media.
_______________
PETITION OF:
NAME: Joshua Tarsky     DISTRICT/ADDRESS: 13th Norfolk     DATE ADDED: 1/15/2025
HOUSE DOCKET, NO. 2051       FILED ON: 1/15/2025
HOUSE . . . . . . . . . . . . . . . No. 2236
By Representative Tarsky of Needham, a petition (accompanied by resolve, House, No. 2236) of 
Joshua Tarsky for an investigation by a special commission (including members of the General 
Court) to promote safe social media use, identify best practices for social media platforms to 
safeguard children’s mental health, and develop guidelines for safe social media use. Mental 
Health, Substance Use and Recovery.
[SIMILAR MATTER FILED IN PREVIOUS SESSION
SEE HOUSE, NO. 1986 OF 2023-2024.]
The Commonwealth of Massachusetts
_______________
In the One Hundred and Ninety-Fourth General Court
(2025-2026)
_______________
Resolve relative to children’s mental health in social media.
Resolved, that there shall be a special commission on children’s mental health and social media to investigate the risks of social media to children, recommend a legal framework for the commonwealth to promote safe social media use, identify best practices for social media platforms to safeguard children’s mental health, and develop guidelines for safe social media use.
The commission shall consist of the following persons, or their designees: the secretary of health and human services, who shall serve as chair; the commissioner of public health; the commissioner of elementary and secondary education; the attorney general; 2 members of the house of representatives, 1 of whom shall be appointed by the speaker of the house and 1 of whom shall be appointed by the house minority leader; 2 members of the senate, 1 of whom shall be appointed by the senate president and 1 of whom shall be appointed by the senate minority leader; a representative of the Massachusetts chapter of the American Academy of Pediatrics; a representative of the Children’s Mental Health Campaign; a representative of the Massachusetts School Nurse Organization; a representative of the American Civil Liberties Union of Massachusetts, Inc.; a representative of the Harvard T.H. Chan School of Public Health; and 10 persons to be appointed by the governor, 1 of whom shall be a representative of a nonprofit organization that advocates for the prevention of online harms, including cyberbullying, sexual exploitation and access to content that is harmful to children; 1 of whom shall be a representative of a research or academic institution with experience in artificial intelligence and information technology; 1 of whom shall be a representative of a behavioral health services program housed at a community hospital; 1 of whom shall have experience in addiction; 1 of whom shall have clinical experience working with children; 1 of whom shall be a representative of an organization that deals with child-targeted marketing; 1 of whom shall be a parent of a child who has experienced cyberbullying and has engaged a school system regarding cyberbullying; 1 of whom shall be a youth who has experienced cyberbullying by their peers; 1 of whom shall have experience working for a social media platform; and 1 of whom shall have experience conducting independent audits of social media platforms and social media algorithms.
The commission shall:
(i) investigate, assess, advise and report on the risk of harm or actual harms children encounter on social media platforms, including but not limited to:
(1) the effect of social media on children’s mental health, including but not limited to the promotion or exacerbation of self-harm, suicide, eating disorders, addiction and substance use disorder; physical violence, online bullying and harassment; sexual exploitation, including enticement, sex trafficking, and sexual abuse of minors and trafficking of online child sexual abuse material; promotion and marketing of narcotic drugs, tobacco products, gambling, or alcohol to a child; or predatory, unfair, or deceptive marketing practices, or other financial harms; and
(2) the use and impact on children of online design features that increase, sustain, or extend use of covered platforms, such as the automatic playing of media, the use of infinite scrolling, rewards for time spent, paid ad placement and notifications;
(ii) study, review, advise and recommend a legal framework for the commonwealth to receive and review independent algorithm audits of social media platforms likely to be accessed by children, including but not limited to:
(1) the information that a state agency may require a social media platform to submit in an independent algorithm audit, including but not limited to: (A) transparency audits; (B) system risk assessments; and (C) mitigation accounting and planning;
(2) criteria for vetting and approving professional auditors to conduct independent third-party algorithm audits of social media platforms;
(3) criteria for defining which social media platforms may reasonably be required to submit an independent algorithm audit;
(4) organizational and fiscal models that would ensure effective operations of a state agency tasked with receiving and reviewing independent algorithm audits; and
(5) definitions for key terms not already defined in the General Laws, including algorithm, social media platform, and likely to be accessed;
(iii) identify best practices social media platforms may implement to promote the health and safety of children using social media platforms, including but not limited to:
(1) acceptable standards for the mitigation or elimination of harms children may encounter on social media platforms;
(2) methods to ensure privacy in age verification;
(3) data management best practices to mitigate the unauthorized access of a child’s personal information or data; and
(4) tools and features that social media platforms may provide to prevent harm to children while using the covered platform and that ensure the privacy of children, especially the privacy of children between the ages of 13 and 18;
(iv) develop recommendations to encourage safe social media use by children, including:
(1) recommendations for parents relative to the safe use of social media platforms among children;
(2) model guidelines for school districts on the use of social media by students during school hours and how to address issues that arise inside schools due to the use of social media outside of school hours; and
(3) public awareness campaigns the department of public health may conduct to promote the above recommendations.
The commission may solicit public input through public hearings and testimony.
Not later than December 31, 2026, the commission shall submit a detailed report with its findings and recommendations, along with drafts of legislation necessary to carry out its recommendations, to the governor; the joint committee on public health; the joint committee on mental health, substance use and recovery; the joint committee on children, families and persons with disabilities; and the house and senate committees on ways and means.