Vermont 2025-2026 Regular Session

Vermont House Bill H.340

BILL AS INTRODUCED H.340
2025
Introduced by Representatives Priestley of Bradford, Arsenault of Williston, Berbeco of Winooski, Cole of Hartford, Logan of Burlington, Masland of Thetford, McGill of Bridport, Sibilia of Dover, and White of Waitsfield

Referred to Committee on

Date:

Subject: Commerce and trade; consumer protection; artificial intelligence

Statement of purpose of bill as introduced: This bill proposes to regulate developers and deployers of automated decision systems used in consequential decisions in an effort to avoid algorithmic discrimination towards consumers.

An act relating to regulating developers and deployers of certain automated decision systems

It is hereby enacted by the General Assembly of the State of Vermont:

Sec. 1. 9 V.S.A. chapter 118 is added to read:

CHAPTER 118. ARTIFICIAL INTELLIGENCE

Subchapter 1. Algorithmic Discrimination and Automated Decision Systems

§ 4193a. DEFINITIONS
As used in this subchapter:
(1)(A) “Algorithmic discrimination” means any condition in which the use of an automated decision system results in a differential treatment or impact that disfavors an individual on the basis of the individual’s actual or perceived age, color, disability, ethnicity, genetic information, immigration or citizenship status, limited proficiency in the English language, national origin, race, religion, reproductive health, sex, sexual orientation, gender identity, veteran status, or other classification protected under the laws of this State or federal law.
(B) “Algorithmic discrimination” does not include:
(i) a developer’s or deployer’s testing of the developer’s or deployer’s own automated decision system to identify, mitigate, and prevent discrimination;
(ii) expanding an applicant, customer, or participant pool to increase diversity or redress historical discrimination; or
(iii) an act or omission by or on behalf of a private club or other establishment that is not in fact open to the public, as set forth in Title II of the federal Civil Rights Act of 1964, 42 U.S.C. § 2000a(e), as amended.
(2) “Auditor” refers to an independent entity, including an individual, a nonprofit, a firm, a corporation, a partnership, a cooperative, or an association, commissioned to perform an audit.
(3)(A) “Automated decision system” means a computational process derived from machine learning, statistical modeling, data analytics, or artificial intelligence that issues an output, including a score, classification, or recommendation.
(B) “Automated decision system” does not include any software used primarily for basic computerized processes, such as antimalware, antivirus, autocorrect functions, calculators, databases, data storage, electronic communications, firewall, internet domain registration, website loading, networking, spam and robocall filtering, spellcheck tools, spreadsheets, web caching, web hosting, or any tool that relates only to nonemployment internal management affairs such as ordering office supplies or processing payments, and that do not materially affect the rights, liberties, benefits, safety, or welfare of any individual within the State.
(4) “Consequential decision” means a decision that has a material, legal, or similarly significant effect on the provision or denial to any consumer of, or the cost, terms, or availability of:
(A) educational and vocational training, including:
(i) assessment or grading, including detecting student cheating or plagiarism;
(ii) accreditation;
(iii) certification;
(iv) admissions or enrollment; and
(v) financial aid or scholarships;
(B) employment or an employment opportunity, including:
(i) pay or promotion;
(ii) hiring or termination; and
(iii) automated task allocation;
(C) housing or lodging, including long-term or short-term rentals;
(D) essential utilities, including electricity, heat, water, internet or telecommunications access, or transportation;
(E) family planning, including adoption services or reproductive services, as well as assessments related to child protection services;
(F) health care or health insurance, including mental health care, dental, or vision;
(G) financial services, including a financial service provided by a mortgage company, mortgage broker, or creditor;
(H) law enforcement activities, including the allocation of law enforcement personnel or assets, the enforcement of laws, maintaining public order, or managing public safety;
(I) government services, including the determination, allocation, or denial of public benefits and services; and
(J) a reasonable accommodation or other right granted under the civil rights laws of this State.
(5) “Consumer” means an individual who is a resident of the State.
(6) “Deployer” means a person doing business in this State that uses an automated decision system in a consequential decision in the State or provides an automated decision system for use in a consequential decision by the general public in the State. A developer shall also be considered a deployer if its actions satisfy this definition.
(7) “Deployer-employer” means a deployer that is an employer.
(8) “Developer” means a person doing business in this State that designs, codes, or produces an automated decision system for use in a consequential decision or creates a substantial change with respect to an automated decision system for use in a consequential decision, whether for its own use in the State or for use by a third party in the State.
(9) “Developer-employer” means a developer that is an employer.
(10) “Employee” means an individual who performs services for and under the control and direction of an employer for wages or other remuneration, including former employees, or natural persons employed as independent contractors to carry out work in furtherance of an employer’s business enterprise who are not themselves employers.
(11) “Employer” means any person, firm, partnership, institution, corporation, or association that employs one or more employees.
(12) “Software stack” means the group of individual software components that work together to support the execution of an automated decision system.
(13) “Substantial change” means any:
(A) deliberate change to an automated decision system that would result in material inaccuracies in the reports created under section 4193f of this title; or
(B) substantial change in the data that the automated decision system uses as input or training data.

§ 4193b. ALGORITHMIC DISCRIMINATION
It shall be unlawful discrimination for a developer or deployer to use, sell, or share an automated decision system for use in a consequential decision or a product featuring an automated decision system for use in a consequential decision that produces algorithmic discrimination.

§ 4193c. DEPLOYER AND DEVELOPER OBLIGATIONS
(a) Any deployer that employs an automated decision system for a consequential decision shall inform the consumer prior to the use of the system for a consequential decision in clear, conspicuous, and consumer-friendly terms, made available in each of the languages in which the company offers its end services, that automated decision systems will be used to make a consequential decision or to assist in making a consequential decision.
(b) Any notice provided by a deployer to the consumer pursuant to subsection (a) of this section shall include:
(1) a description of the personal characteristics or attributes that the system will measure or assess;
(2) the method by which the system measures or assesses those attributes or characteristics;
(3) how those attributes or characteristics are relevant to the consequential decisions for which the system should be used;
(4) any human components of the system;
(5) how any automated components of the system are used to inform the consequential decision; and
(6) a direct link to a publicly accessible page on the deployer’s website that contains a plain-language description of the:
(A) system’s outputs;
(B) types and sources of data collected from natural persons and processed by the system when it is used to make, or assists in making, a consequential decision; and
(C) results of the most recent impact assessment, or an active link to a web page where a consumer can review those results.
(c) Any deployer that employs an automated decision system for a consequential decision shall provide the consumer with a single notice containing a plain-language explanation of the decision that identifies the principal reason or reasons for the consequential decision, including:
(1) the identity of the developer of the automated decision system used in the consequential decision, if the deployer is not also the developer;
(2) a description of what the output of the automated decision system is, such as a score, recommendation, or other similar description;
(3) the degree and manner to which the automated decision system contributed to the decision;
(4) the types and sources of data processed by the automated decision system in making the consequential decision;
(5) a plain language explanation of how the consumer’s personal data informed the consequential decision; and
(6) what actions, if any, the consumer might have taken to secure a different decision and the actions that the consumer might take to secure a different decision in the future.
(d)(1) A deployer shall provide and explain a process for a consumer to appeal a decision, which shall at minimum allow the consumer to:
(A) formally contest the decision;
(B) provide information to support their position; and
(C) obtain meaningful human review of the decision.
(2) For an appeal made pursuant to subdivision (1) of this subsection:
(A) a deployer shall designate a human reviewer who:
(i) is trained and qualified to understand the consequential decision being appealed, the consequences of the decision for the consumer, and how to evaluate and how to serve impartially, including by avoiding prejudgment of the facts at issue, conflict of interest, and bias;
(ii) does not have a conflict of interest for or against the deployer or the consumer;
(iii) was not involved in the initial decision being appealed;
(iv) shall enjoy protection from dismissal or its equivalent, disciplinary measures, or other adverse treatment for exercising their functions under this section; and
(v) shall be allocated sufficient human resources by the deployer to conduct an effective appeal of the decision; and
(B) the human reviewer shall consider the information provided by the consumer in their appeal and may consider other sources of information relevant to the consequential decision.
(3) A deployer shall respond to a consumer’s appeal not later than 45 days after receipt of the appeal. That period may be extended once by an additional 45 days where reasonably necessary, taking into account the complexity and number of appeals. The deployer shall inform the consumer of any extension not later than 45 days after receipt of the appeal, together with the reasons for the delay.
(e) The deployer or developer of an automated decision system is legally responsible for the quality and accuracy of all consequential decisions made, including any bias or algorithmic discrimination resulting from the operation of the automated decision system.
(f) A developer shall not use, sell, or share an automated decision system for use in a consequential decision or a product featuring an automated decision system for use in a consequential decision that has not passed an independent audit, in accordance with section 4193e of this title. If an independent audit finds that an automated decision system for use in a consequential decision does produce algorithmic discrimination, the developer shall not use, sell, or share the system until the algorithmic discrimination has been proven to be rectified by a post-adjustment audit.
(g) Except as provided in subsection 4193e(a) of this title, the rights and obligations under this section may not be waived by any person, partnership, association, or corporation.

§ 4193d. WHISTLEBLOWER PROTECTIONS
(a) Developer-employers and deployer-employers of automated decision systems used in consequential decisions shall not:
(1) prevent an employee from disclosing information to the Attorney General, including through terms and conditions of employment or seeking to enforce terms and conditions of employment, if the employee has reasonable cause to believe the information indicates a violation of this subchapter; or
(2) retaliate against an employee for disclosing information to the Attorney General pursuant to subdivision (1) of this subsection.
(b) Developer-employers and deployer-employers of automated decision systems used in consequential decisions shall provide a clear notice to all employees working on automated decision systems of their rights and responsibilities under this subchapter, including the right of employees of contractors and subcontractors to use the developer’s internal process for making protected disclosures pursuant to subsection (c) of this section. A developer-employer or deployer-employer is presumed to be in compliance with the requirements of this subsection if the developer-employer or deployer-employer does either of the following:
(1) at all times:
(A) posts and displays within all workplaces maintained by the developer-employer or deployer-employer a notice to all employees of their rights and responsibilities under this subchapter;
(B) ensures that all new employees receive equivalent notice; and
(C) ensures that employees who work remotely periodically receive an equivalent notice; or
(2) not less frequently than once every year, provides written notice to all employees of their rights and responsibilities under this subchapter and ensures that the notice is received and acknowledged by all of those employees.
(c) Each developer-employer shall provide a reasonable internal process through which an employee may anonymously disclose information to the developer if the employee believes in good faith that the information indicates that the developer has violated any provision of this subchapter or any other law, or has made false or materially misleading statements related to its safety and security protocol, or failed to disclose known risks to employees, including, at a minimum, a monthly update to the person who made the disclosure regarding the status of the developer’s investigation of the disclosure and the actions taken by the developer in response to the disclosure.

§ 4193e. AUDITS
(a) Prior to deployment of an automated decision system for use in a consequential decision, six months after deployment, and at least every 18 months thereafter for each calendar year an automated decision system is in use in consequential decisions after the first post-deployment audit, the developer and deployer shall be jointly responsible for ensuring that an independent audit is conducted in compliance with the provisions of this section to ensure that the product does not produce algorithmic discrimination and complies with the provisions of this subchapter. The developer and deployer shall enter into a contract specifying which party is responsible for the costs, oversight, and results of the audit. Absent an agreement of responsibility through contract, the developer and deployer shall be jointly and severally liable for any violations of this section. Regardless of final findings, the deployer or developer shall deliver all audits conducted under this section to the Attorney General.
(b) A deployer or developer may contract with more than one auditor to fulfill the requirements of this section.
(c) The audit shall include the following:
(1) an analysis of data management policies, including whether personal or sensitive data relating to a consumer is subject to data security protection standards that comply with the requirements of applicable State law;
(2) an analysis of the system validity and reliability according to each specified use case listed in the entity’s reporting document filed by the developer or deployer pursuant to section 4193f of this title;
(3) a comparative analysis of the system’s performance when used on consumers of different demographic groups and a determination of whether the system produces algorithmic discrimination in violation of this subchapter by each intended and foreseeable identified use as identified by the deployer and developer pursuant to section 4193f of this title;
(4) an analysis of how the technology complies with existing relevant federal, State, and local labor, civil rights, consumer protection, privacy, and data privacy laws; and
(5) an evaluation of the developer’s or deployer’s documented risk management policy and program as set forth in section 4193g of this title for conformity with subsection 4193g(a) of this title.
(d) The Attorney General may adopt further rules as necessary to ensure that audits under this section assess whether or not automated decision systems used in consequential decisions produce algorithmic discrimination and otherwise comply with the provisions of this subchapter.
(e) The independent auditor shall have complete and unredacted copies of all reports previously filed by the deployer or developer pursuant to section 4193f of this title.
(f) An audit conducted under this section shall be completed in its entirety without the assistance of an automated decision system.
(g)(1) An auditor shall be an independent entity, including an individual, nonprofit, firm, corporation, partnership, cooperative, or association.
(2) For the purposes of this subchapter, no auditor may be commissioned by a developer or deployer of an automated decision system used in consequential decisions if the auditor:
(A) has already been commissioned to provide any auditing or nonauditing service, including financial auditing, cybersecurity auditing, or consulting services of any type, to the commissioning company in the past 12 months;
(B) is or was involved in using, developing, integrating, offering, licensing, or deploying the automated decision system;
(C) has or had an employment relationship with a developer or deployer that uses, offers, or licenses the automated decision system; or
(D) has or had a direct financial interest or a material indirect financial interest in a developer or deployer that uses, offers, or licenses the automated decision system.
(3) Fees paid to auditors may not be contingent on the result of the audit and the commissioning company shall not provide any incentives or bonuses for a positive audit result.
(h) The Attorney General may adopt rules to ensure:
(1) the independence of auditors under this section;
(2) that teams conducting audits incorporate feedback from communities that may foreseeably be the subject of algorithmic discrimination with respect to the automated decision system being audited; and
(3) that the requirements of an audit as set forth in subsection (c) of this section are updated to reflect responsible evaluation practices and include adequate information to enforce this subchapter.

§ 4193f. AUTOMATED DECISION SYSTEM REPORTING REQUIREMENTS
(a) Every developer and deployer of an automated decision system used in a consequential decision shall comply with the reporting requirements of this section. Regardless of final findings, reports shall be filed with the Attorney General prior to deployment of an automated decision system used in a consequential decision and then annually, or after each substantial change to the system, whichever comes first.
(b) Together with each report required to be filed under this section, developers and deployers shall file with the Attorney General a copy of the last completed independent audit required by this subchapter and a legal attestation that the automated decision system used in a consequential decision:
(1) does not violate any provision of this subchapter; or
(2) may violate or does violate one or more provisions of this subchapter, that there is a plan of remediation to bring the automated decision system into compliance with this subchapter, and a summary of the plan of remediation.
(c) Developers of automated decision systems shall file with the Attorney General a report containing the following:
(1) a description of the system, including:
(A) a description of the system’s software stack;
(B) the purpose of the system and its expected benefits; and
(C) the system’s current and intended uses, including what consequential decisions it will support and what stakeholders will be impacted;
(2) the intended outputs of the system and whether the outputs can be or are otherwise appropriate to be used for any purpose not previously articulated;
(3) the methods for training of their models, including:
(A) any pre-processing steps taken to prepare datasets for the training of a model underlying an automated decision system;
(B) descriptions of the datasets upon which models were trained and evaluated, how and why datasets were collected and the sources of those datasets, and how that training data will be used and maintained;
(C) the quality and appropriateness of the data used in the automated decision system’s design, development, testing, and operation;
(D) whether the data contains sufficient breadth to address the range of real-world inputs the automated decision system might encounter and how any data gaps have been addressed; and
(E) steps taken to ensure compliance with privacy, data privacy, data security, and copyright laws;
(4) use and data management policies;
(5) any other information necessary to allow the deployer to understand the outputs and monitor the system for compliance with this subchapter;
(6) any other information necessary to allow the deployer to comply with the requirements of subsection (d) of this section;
(7) a description of the system’s capabilities and any developer-imposed limitations, including capabilities outside of its intended use, when the system should not be used, any safeguards or guardrails in place to protect against unintended, inappropriate, or disallowed uses, and testing of any safeguards or guardrails;
(8) an internal risk assessment including documentation and results of testing conducted to identify all reasonably foreseeable risks related to algorithmic discrimination, validity and reliability, privacy and autonomy, and safety and security, as well as actions taken to address those risks, and subsequent testing to assess the efficacy of actions taken to address risks; and
(9) whether the system should be monitored and, if so, how the system should be monitored.
(d) Deployers of automated decision systems used in consequential decisions shall file with the Attorney General a report containing the following:
(1) a description of the system, including:
(A) a description of the system’s software stack;
(B) the purpose of the system and its expected benefits; and
(C) the system’s current and intended uses, including what consequential decisions it will support and what stakeholders will be impacted;
(2) the intended outputs of the system and whether the outputs can be or are otherwise appropriate to be used for any purpose not previously articulated;
(3) whether the deployer collects revenue or plans to collect revenue from use of the automated decision system in a consequential decision and, if so, how it monetizes or plans to monetize use of the system;
(4) whether the system is designed to make consequential decisions itself or whether and how it supports consequential decisions;
(5) a description of the system’s capabilities and any deployer-imposed limitations, including capabilities outside of its intended use, when the system should not be used, any safeguards or guardrails in place to protect against unintended, inappropriate, or disallowed uses, and testing of any safeguards or guardrails;
(6) an assessment of the relative benefits and costs to the consumer given the system’s purpose, capabilities, and probable use cases;
(7) an internal risk assessment including documentation and results of testing conducted to identify all reasonably foreseeable risks related to algorithmic discrimination, accuracy and reliability, privacy and autonomy, and safety and security, as well as actions taken to address those risks, and subsequent testing to assess the efficacy of actions taken to address risks; and
(8) whether the system should be monitored and, if so, how the system should be monitored.
(e) The Attorney General shall:
(1) adopt rules:
(A) for a process whereby developers and deployers may request redaction of portions of reports required under this section to ensure that they are not required to disclose sensitive and protected information; and
(B) to determine reasonably foreseeable risks related to algorithmic discrimination, validity and reliability, privacy and autonomy, and safety and security, pursuant to subsections (c) and (d) of this section; and
(2) maintain an online database that is accessible to the general public with reports, redacted in accordance with this section, and audits required by this subchapter, which shall be updated biannually.
(f) For automated decision systems already in deployment for use in consequential decisions on or before July 1, 2025, developers and deployers shall not later than 18 months after July 1, 2025 complete and file the reports and complete the independent audit required by this subchapter.

§ 4193g. RISK MANAGEMENT POLICY AND PROGRAM
(a) Each developer or deployer of automated decision systems used in consequential decisions shall plan, document, and implement a risk management policy and program to govern development or deployment, as applicable, of the automated decision system. The risk management policy and program shall specify and incorporate the principles, processes, and personnel that the deployer uses to identify, document, and mitigate known or reasonably foreseeable risks of algorithmic discrimination covered under section 4193b of this title. The risk management policy and program shall be an iterative process planned, implemented, and regularly and systematically reviewed and updated over the life cycle of an automated decision system, requiring regular, systematic review and updates, including updates to documentation. A risk management policy and program implemented and maintained pursuant to this subsection shall be reasonable considering the:
(1) guidance and standards set forth in version 1.0 of the Artificial Intelligence Risk Management Framework published by the National Institute of Standards and Technology in the U.S. Department of Commerce, or the latest version of the Artificial Intelligence Risk Management Framework published by the National Institute of Standards and Technology if, in the Attorney General’s discretion, the latest version of the Artificial Intelligence Risk Management Framework published by the National Institute of Standards and Technology in the U.S. Department of Commerce is at least as stringent as version 1.0;
(2) size and complexity of the developer or deployer;
(3) nature, scope, and intended uses of the automated decision system developed or deployed for use in consequential decisions; and
(4) sensitivity and volume of data processed in connection with the automated decision system.
(b) A risk management policy and program implemented pursuant to subsection (a) of this section may cover multiple automated decision systems developed by the same developer or deployed by the same deployer for use in consequential decisions if sufficient.
(c) The Attorney General may require a developer or a deployer to disclose the risk management policy and program implemented pursuant to subsection (a) of this section in a form and manner prescribed by the Attorney General. The Attorney General may evaluate the risk management policy and program to ensure compliance with this section.

§ 4193h. ENFORCEMENT AND RULEMAKING
(a) A person who violates this subchapter or rules adopted pursuant to this subchapter commits an unfair and deceptive act in commerce in violation of section 2453 of this title (Vermont Consumer Protection Act). A consumer harmed by a violation is eligible for all remedies provided under the Vermont Consumer Protection Act.
(b) The Attorney General has the same authority to adopt rules to implement the provisions of this section and to conduct civil investigations, enter into assurances of discontinuance, bring civil actions, and take other enforcement actions as provided under chapter 63, subchapter 1 of this title.

Sec. 2. EFFECTIVE DATE
This act shall take effect on July 1, 2025.