SENATE DOCKET, NO. 1313        FILED ON: 1/16/2025

SENATE . . . . . . . . . . . . . . No. 51

The Commonwealth of Massachusetts

_________________

PRESENTED BY:

John C. Velis

_________________

To the Honorable Senate and House of Representatives of the Commonwealth of Massachusetts in General Court assembled:

The undersigned legislators and/or citizens respectfully petition for the adoption of the accompanying bill:

An Act relative to social media, algorithm accountability, and transparency.

_______________

PETITION OF:

NAME: John C. Velis
DISTRICT/ADDRESS: Hampden and Hampshire
SENATE DOCKET, NO. 1313        FILED ON: 1/16/2025

SENATE . . . . . . . . . . . . . . No. 51

By Mr. Velis, a petition (accompanied by bill, Senate, No. 51) of John C. Velis for legislation relative to social media, algorithm accountability, and transparency. Advanced Information Technology, the Internet and Cybersecurity.

The Commonwealth of Massachusetts

_______________

In the One Hundred and Ninety-Fourth General Court
(2025-2026)

_______________

An Act relative to social media, algorithm accountability, and transparency.

Be it enacted by the Senate and House of Representatives in General Court assembled, and by the authority of the same, as follows:
SECTION 1. Chapter 12 of the General Laws, as so appearing, is hereby amended by inserting after section 35 the following section:-
Section 36. (a) As used in this section the following words shall, unless the context clearly requires otherwise, have the following meanings:-
“Algorithm”, computational process that uses machine learning, natural language processing, artificial intelligence techniques, or other computational processing techniques of similar or greater complexity and that makes a decision or facilitates human decision-making with respect to users’ personal information, including to determine the provision of products or services or to rank, order, promote, recommend, amplify or similarly determine the delivery or display of information to an individual. For purposes of this section, an algorithm shall refer to recommendation algorithms, also known as engagement-based algorithms, which passively populate a user’s feed or experience with content without any direct action or request by the user.
“Children”, consumers under 18 years of age.
“Covered platform”, an internet website, online service, online application, or mobile application, including, but not limited to, a social media platform, that conducts business in this state or that produces products or services that are accessed by residents and that during the preceding calendar year: (1) controlled or processed the personal information of not less than one hundred thousand consumers, excluding personal information controlled or processed solely for the purpose of completing a payment transaction; or (2) controlled or processed the personal information of not less than twenty-five thousand consumers and derived more than twenty-five per cent of its gross revenue from the sale of personal data.
“Consumer”, a natural person who is a Massachusetts resident, however identified, including by any unique identifier.
“Independent third-party auditor”, auditing organization that has no affiliation with a covered platform as defined by this section.
“Likely to be accessed”, reasonable expectation, based on the following factors, that a covered platform would be accessed by children: (1) the covered platform is directed to children as defined by the Children’s Online Privacy Protection Act (15 U.S.C. Sec. 6501 et seq.); (2) the covered platform is determined, based on audience composition, to have an audience of which children comprise at least 10%; (3) the covered platform is paid for advertisements on its platform that are marketed to children; (4) the covered platform is substantially similar to or the same as a covered platform that satisfies clause (2); and (5) a significant amount of the audience of the covered platform, 10% or more, is determined, based on internal company research, to be children.
“Process” or “processing”, any operation or set of operations performed, whether by manual or automated means, on personal information or on sets of personal information, such as the collection, use, storage, disclosure, analysis, deletion or modification of personal information.
“Personal information”, information linked or reasonably linkable to an identified or identifiable individual.
“Social media platform”, public or semipublic internet-based service or application that has users in Massachusetts and that meets both of the following criteria: (1) a substantial function of the service or application is to connect users and allow users to interact socially with each other within the service or application; provided further, that an internet-based service or application that provides email or direct messaging services shall not be considered to meet this criterion on this function alone; provided further, that a service or application that is an internet search engine or website whose primary focus is e-commerce, which would include the buying, selling, or exchange of goods or services over the internet, including business-to-business, business-to-consumer, and consumer-to-consumer transactions, shall not be considered to meet this criterion on the basis of that function alone; and (2) the application allows users to: (i) construct a public or semipublic profile for purposes of signing into and using the service or application; (ii) populate a list of other users with whom an individual shares a social connection within the system; and (iii) create or post content viewable by other users, including, but not limited to, on message boards, in chat rooms, or through a landing page or main feed that presents the user with content generated by other users.
“Experts in the mental health and public policy fields”, (1) academic experts, health professionals, and members of civil society with expertise in mental health, substance use disorders, and the prevention of harms to minors; (2) representatives in academia and civil society with specific expertise in privacy and civil liberties; (3) parents and youth representatives; (4) representatives of the national telecommunications and information administration, the national institute of standards and technology, the federal trade commission, the office of the attorney general of Massachusetts, and the Massachusetts executive office of health and human services; (5) state attorneys general or their designees acting in state or local government; and (6) representatives of communities of socially disadvantaged individuals as defined in section 8 of the Small Business Act, 15 U.S.C. 637.
(b) There shall be an office of social media transparency and accountability, which shall be supervised and controlled by the office of the attorney general. The office shall receive, review and maintain reports from covered platforms, enforce this section, and adopt regulations to clarify the requirements of this section.
(c) Annually before January 1, covered platforms shall register with the office by providing: (i) a registration fee, determined by the office of the attorney general; (ii) the platform’s name; (iii) physical address; (iv) email; and (v) internet address.
(d) The office shall compile a list of approved, independent third-party auditors and shall assign independent third-party auditors to conduct algorithm risk audits of covered platforms. Risk audits shall be conducted monthly by third-party auditors, unless specified otherwise by the office. Audits and associated costs shall be paid for by covered platforms. The algorithm risk audits shall focus on harms to children, including but not limited to: (i) mental health disorders, including anxiety, depression, eating disorders, substance abuse disorders, and suicidal behaviors; (ii) patterns of use that indicate or encourage addiction-like behaviors; (iii) physical violence, online bullying, and harassment of the minor; (iv) sexual exploitation and abuse; (v) promoting and marketing of narcotic drugs as defined in section 102 of the Controlled Substances Act, 21 U.S.C. 802, tobacco products, gambling, or alcohol; and (vi) predatory, unfair or deceptive marketing practices, or other financial harms.
(e) Annually before January 1, the office shall empanel an Advisory Council of experts in the mental health and public policy fields, as defined in this section, to review these harms and identify additional ways covered platforms cause harms to children.
(f) Annually before July 1, the office shall promulgate regulations, based on the cumulative list of potential harms identified by the Advisory Council, that update the specific harms that must be examined by the algorithm risk audits required under this section.
(g) Beginning on January 1, 2026, covered platforms shall annually submit transparency reports to the office containing, but not limited to: (i) assessment of whether the covered platform is likely to be accessed by children; (ii) description of the covered platform’s commercial interests in use of the platform by children; (iii) number of individuals using the covered platform reasonably believed to be children in the United States, disaggregated by the age ranges of 0-5, 6-9, 10-12, 13-15 and 16-17 years; (iv) median and mean amounts of time spent on the covered platform by children in the United States who have accessed the platform during the reporting year on a daily, weekly and monthly basis, disaggregated by the age ranges of 0-5, 6-9, 10-12, 13-15 and 16-17 years; (v) description of whether and how the covered platform uses system design features to increase, sustain, or extend use of a product or service by users, including automatic playing of media, rewards for time spent and notifications; (vi) description of whether, how and for what purpose the covered platform collects or processes personal information that may cause reasonably foreseeable risk of harm to children; (vii) total number of complaints received regarding, and the prevalence of issues related to, the harms described in section 1, disaggregated by category of harm; (viii) description of the mechanism by which the public may submit complaints, the internal processes for handling complaints, and any automated detection mechanisms for harms to children, including the rate, timeliness, and effectiveness of responses.
(h) By January 1, 2027, covered platforms shall submit preliminary reports to the office. The preliminary report must measure the incidence of each of the specific harms identified in subsection (d) that occur on the covered platform. The office must consult with independent third-party auditors and covered platforms to determine what data shall be used to produce the preliminary reports.
After a covered platform has submitted a preliminary report, the covered platform may agree that the office will consult with independent third-party auditors and the covered platform to set benchmarks the covered platform must meet to reduce the harms identified in subsection (d) on its platform, as indicated in the preliminary reports required under this section. Upon agreement, each covered platform shall thereafter produce biannual reports containing, but not limited to: (i) steps taken to mitigate harm on its platform, including implementation of any systems used to meet benchmarks; and (ii) measurements indicating the reduction in harm as a result of these systems.
In the case the covered platform has failed to meet the benchmarks set by agreement, its annual report must also contain: (1) a mitigation plan detailing changes the platform intends to make to ensure future compliance with benchmarks; and (2) a written explanation regarding the reasons the benchmarks were not met.
If a covered platform chooses not to consult with independent third-party auditors to set benchmarks it must meet to reduce the harms identified in subsection (d) on its platform, as indicated in the preliminary reports required under this subsection, the attorney general is not precluded from pursuing any other legal remedy available at law to mitigate harms.
(i) The records generated by this section shall be subject to chapter 66 of the General Laws and shall be made accessible to the public on the attorney general’s website. However, to the extent any information contained within a report required by this section is trade secret, proprietary or privileged, covered platforms may request that such information be redacted from the copy of the report that is obtainable under the public records law and on the attorney general’s website. The office will conduct a confidential, in-camera review of requested redactions to determine whether the information is trade secret, proprietary or privileged information that should not be made accessible for public review. All information from the copy of the report submitted to the office, including redactions, will be maintained by a covered platform in its internal records.
(j) A covered platform shall be considered in violation of this section if it: (i) fails to register with the office; (ii) materially omits or misrepresents required information in a submitted report; or (iii) fails to timely submit a report to the office.
(1) A covered platform in violation of this section is subject to an injunction and liable for a civil penalty not to exceed $500,000 per violation, which shall be assessed and recovered in a civil action brought by the attorney general. In assessing the amount of a civil penalty pursuant to this section, the court shall consider whether the covered platform made a reasonable, good faith attempt to comply with the provisions of this section. Any penalties, fees, and expenses recovered in an action brought under this section shall be collected by the office of the attorney general with the intent that they be used to fully offset costs in connection with the enforcement of this section and to promote the positive mental health outcomes of the children of Massachusetts.
(k) If any provision of this section, or any application of such provision to any person or circumstance, is held to be unconstitutional, the remainder of this section and the application of this section to any other person or circumstance shall not be affected.