Massachusetts 2025-2026 Regular Session

Massachusetts Senate Bill S51

SENATE DOCKET, NO. 1313 FILED ON: 1/16/2025
SENATE . . . . . . . . . . . . . . No. 51
The Commonwealth of Massachusetts
_________________
PRESENTED BY:
John C. Velis
_________________
To the Honorable Senate and House of Representatives of the Commonwealth of Massachusetts in General Court assembled:
The undersigned legislators and/or citizens respectfully petition for the adoption of the accompanying bill:
An Act relative to social media, algorithm accountability, and transparency.
_______________
PETITION OF:
NAME: John C. Velis
DISTRICT/ADDRESS: Hampden and Hampshire
SENATE DOCKET, NO. 1313 FILED ON: 1/16/2025
SENATE . . . . . . . . . . . . . . No. 51
By Mr. Velis, a petition (accompanied by bill, Senate, No. 51) of John C. Velis for legislation relative to social media, algorithm accountability, and transparency. Advanced Information Technology, the Internet and Cybersecurity.
The Commonwealth of Massachusetts
_______________
In the One Hundred and Ninety-Fourth General Court
(2025-2026)
_______________
An Act relative to social media, algorithm accountability, and transparency.
Be it enacted by the Senate and House of Representatives in General Court assembled, and by the authority of the same, as follows:
SECTION 1. Chapter 12 of the General Laws, as so appearing, is hereby amended by inserting after section 35 the following section:-
Section 36. (a) As used in this section the following words shall, unless the context clearly requires otherwise, have the following meanings:-
“Algorithm”, computational process that uses machine learning, natural language processing, artificial intelligence techniques, or other computational processing techniques of similar or greater complexity and that makes a decision or facilitates human decision-making with respect to users' personal information, including to determine the provision of products or services or to rank, order, promote, recommend, amplify or similarly determine the delivery or display of information to an individual. For purposes of this section, an algorithm will refer to recommendation algorithms, also known as engagement-based algorithms, which passively populate a user’s feed or experience with content without any direct action or request by the user.
“Children”, consumers under 18 years of age.
“Covered platform”, an internet website, online service, online application, or mobile application, including, but not limited to, a social media platform that conducts business in this state or that produces products or services that are accessed by residents and that during the preceding calendar year: (1) controlled or processed the personal information of not less than one hundred thousand consumers, excluding personal information controlled or processed solely for the purpose of completing a payment transaction; or (2) controlled or processed the personal information of not less than twenty-five thousand consumers and derived more than twenty-five per cent of its gross revenue from the sale of personal data.
“Consumer”, a natural person who is a Massachusetts resident, however identified, including by any unique identifier.
“Independent third-party auditor”, auditing organization that has no affiliation with a covered platform as defined by this section.
“Likely to be accessed”, reasonable expectation, based on the following factors, that a covered platform would be accessed by children: (1) the covered platform is directed to children as defined by the Children’s Online Privacy Protection Act (15 U.S.C. Sec. 6501 et seq.); (2) the covered platform is determined based on audience composition where children comprise at least 10% of its audience; (3) the covered platform is paid for advertisements on its platform that are marketed to children; (4) the covered platform is substantially similar to or the same as a covered platform that satisfies subsection (2); and (5) a significant amount of the audience of the covered platform, 10% or more, is determined, based on internal company research, to be children.
6262 34 "Process" or "processing", any operation or set of operations performed, whether by
6363 35manual or automated means, on personal information or on sets of personal information, such as
6464 36the collection, use, storage, disclosure, analysis, deletion or modification of personal
6565 37information.
6666 38 “Personal information”, information linked or reasonably linkable to an identified or
6767 39identifiable individual.
“Social media platform”, public or semipublic internet-based service or application that has users in Massachusetts and that meets both of the following criteria: (1) a substantial function of the service or application is to connect users and allow users to interact socially with each other within the service or application; provided further, that an internet-based service or application that provides email or direct messaging services shall not be considered to meet this criterion on this function alone; provided further that a service or application that is an internet search engine or website whose primary focus is e-commerce, which would include the buying, selling, or exchange of goods or services over the internet, including business-to-business, business-to-consumer, and consumer-to-consumer transactions, shall not be considered to meet this criterion on the basis of that function alone; and (2) the application allows users to: (i) construct a public or semipublic profile for purposes of signing into and using the service or application; (ii) populate a list of other users with whom an individual shares a social connection within the system; and (iii) create or post content viewable by other users, including, but not limited to, on message boards, in chat rooms, or through a landing page or main feed that presents the user with content generated by other users.
“Experts in the mental health and public policy fields”, (1) academic experts, health professionals, and members of civil society with expertise in mental health, substance use disorders, and the prevention of harms to minors; (2) representatives in academia and civil society with specific expertise in privacy and civil liberties; (3) parents and youth representation; (4) representatives of the national telecommunications and information administration, the national institute of standards and technology, the federal trade commission, the office of the attorney general of Massachusetts, and the Massachusetts executive office of health and human services; (5) state attorneys general or their designees acting in State or local government; and (6) representatives of communities of socially disadvantaged individuals as defined in section 8 of the Small Business Act, 15 U.S.C. 637.
(b) There shall be an office of social media transparency and accountability, which shall be supervised and controlled by the office of the attorney general. The office shall receive, review and maintain the reports from covered platforms, enforce this section, and adopt regulations to clarify the requirements of this section.
(c) Annually before January 1, covered platforms shall register with the office by providing: (i) a registration fee, determined by the office of the attorney general; (ii) the platform’s name; (iii) physical address; (iv) email; and (v) internet address.
(d) The office shall compile a list of approved, independent third-party auditors and shall assign independent third-party auditors to conduct algorithm risk audits of covered platforms. Risk audits shall be conducted monthly by third-party auditors, unless specified otherwise by the office. Audits and associated costs shall be paid for by covered platforms. The algorithm risk audits shall focus on harms to children, including but not limited to: (i) mental health disorders including anxiety, depression, eating disorders, substance abuse disorders, and suicidal behaviors; (ii) patterns of use that indicate or encourage addiction-like behaviors; (iii) physical violence, online bullying, and harassment of the minor; (iv) sexual exploitation and abuse; (v) promoting and marketing of narcotic drugs as defined in section 102 of the Controlled Substances Act, 21 U.S.C. 802, tobacco products, gambling, or alcohol; and (vi) predatory, unfair or deceptive marketing practices, or other financial harms.
(e) Annually before January 1, the office shall empanel an Advisory Council of experts in the mental health and public policy fields as defined in this section to review these harms and identify additional ways covered platforms cause harms to children.
(f) Annually before July 1, the office shall promulgate regulations based on the cumulation of the potential harms identified by the Advisory Council that update the specific harms that must be examined by the algorithm risk audits required under this section.
(g) Beginning on January 1, 2026, covered platforms shall annually submit transparency reports to the office containing, but not limited to: (i) assessment of whether the covered platform is likely to be accessed by children; (ii) description of the covered platform’s commercial interests in use of the platform by children; (iii) number of individuals using the covered platform reasonably believed to be children in the United States, disaggregated by the age ranges of 0-5, 6-9, 10-12, 13-15 and 16-17 years; (iv) median and mean amounts of time spent on the covered platform by children in the United States who have accessed the platform during the reporting year on a daily, weekly and monthly basis, disaggregated by the age ranges of 0-5, 6-9, 10-12, 13-15 and 16-17 years; (v) description of whether and how the covered platform uses system design features to increase, sustain, or extend use of a product or service by users, including automatic playing of media, rewards for time spent and notifications; (vi) description of whether, how and for what purpose the covered platform collects or processes personal information that may cause reasonably foreseeable risk of harm to children; (vii) total number of complaints received regarding, and the prevalence of issues related to, the harms described in section 1, disaggregated by category of harm; (viii) description of the mechanism by which the public may submit complaints, the internal processes for handling complaints, and any automated detection mechanisms for harms to children, including the rate, timeliness, and effectiveness of responses.
(h) By January 1, 2027, covered platforms shall submit preliminary reports to the office. The preliminary report must measure the incidence of each of the specific harms identified in subsection (d) that occur on the covered platform. The office must consult with independent third-party auditors and covered platforms to determine what data shall be used to produce the preliminary reports.
After a covered platform has submitted a preliminary report, the covered platform may agree that the office will consult with independent third-party auditors and the covered platform to set benchmarks the covered platform must meet to reduce the harms identified in subsection (d) on its platform, as indicated in the preliminary reports required under this section. Upon agreement, each covered platform shall thereafter produce biannual reports containing, but not limited to: (i) steps taken to mitigate harm on its platform, including implementation of any systems used to meet benchmarks; and (ii) measurements indicating the reduction in harm as a result of these systems.
In the case the covered platform has failed to meet the benchmarks, upon agreement its annual report must also contain: (1) a mitigation plan detailing changes the platform intends to make to ensure future compliance with benchmarks; and (2) a written explanation regarding the reasons the benchmarks were not met.
If a covered platform should choose not to consult with independent third-party auditors to set benchmarks it must meet to reduce the harms identified in subsection (d) on its platform, as indicated in the preliminary reports required under this subsection, the attorney general is not precluded from pursuing any other legal remedy available at law to mitigate harms.
(i) The records generated by this section shall be subject to chapter 66 of the General Laws and shall be made accessible to the public on the attorney general’s website. However, to the extent any information contained within a report required by this section is trade secret, proprietary or privileged, covered platforms may request such information be redacted from the copy of the report that is obtainable under the public records law and on the attorney general’s website. The office will conduct a confidential, in-camera review of requested redactions to determine whether the information is trade secret, proprietary or privileged information that should not be made accessible for public review. All information from the copy of the report submitted to the office, including redactions, will be maintained by a covered platform in its internal records.
(j) A covered platform shall be considered in violation of this section if it: (i) fails to register with the office; (ii) materially omits or misrepresents required information in a submitted report; or (iii) fails to timely submit a report to the office.
(1) A covered platform in violation of this section is subject to an injunction and liable for a civil penalty not to exceed $500,000 per violation, which shall be assessed and recovered in a civil action brought by the attorney general. In assessing the amount of a civil penalty pursuant to this section, the court shall consider whether the covered platform made a reasonable, good faith attempt to comply with the provisions of this section. Any penalties, fees, and expenses recovered in an action brought under this section shall be collected by the office of the attorney general with the intent that they be used to fully offset costs in connection with the enforcement of this section and to promote the positive mental health outcomes of the children of Massachusetts.
(k) If any provision of this section, or any application of such provision to any person or circumstance, is held to be unconstitutional, the remainder of this section and the application of this section to any other person or circumstance shall not be affected.