Vermont 2025-2026 Regular Session

Vermont House Bill H0340 Introduced / Bill

Filed 02/24/2025

BILL AS INTRODUCED	H.340
2025	Page 1 of 23

VT LEG #378965 v.1

H.340
Introduced by Representatives Priestley of Bradford, Arsenault of Williston, Berbeco of Winooski, Cole of Hartford, Logan of Burlington, Masland of Thetford, McGill of Bridport, Sibilia of Dover, and White of Waitsfield
Referred to Committee on
Date:
Subject: Commerce and trade; consumer protection; artificial intelligence
Statement of purpose of bill as introduced:  This bill proposes to regulate developers and deployers of automated decision systems used in consequential decisions in an effort to avoid algorithmic discrimination towards consumers.
An act relating to regulating developers and deployers of certain automated decision systems
It is hereby enacted by the General Assembly of the State of Vermont:
Sec. 1.  9 V.S.A. chapter 118 is added to read:
CHAPTER 118.  ARTIFICIAL INTELLIGENCE
Subchapter 1.  Algorithmic Discrimination and Automated Decision Systems
§ 4193a.  DEFINITIONS
As used in this subchapter:
(1)(A)  “Algorithmic discrimination” means any condition in which the use of an automated decision system results in a differential treatment or impact that disfavors an individual on the basis of the individual’s actual or perceived age, color, disability, ethnicity, genetic information, immigration or citizenship status, limited proficiency in the English language, national origin, race, religion, reproductive health, sex, sexual orientation, gender identity, veteran status, or other classification protected under the laws of this State or federal law.
(B)  “Algorithmic discrimination” does not include:
(i)  a developer’s or deployer’s testing of the developer’s or deployer’s own automated decision system to identify, mitigate, and prevent discrimination;
(ii)  expanding an applicant, customer, or participant pool to increase diversity or redress historical discrimination; or
(iii)  an act or omission by or on behalf of a private club or other establishment that is not in fact open to the public, as set forth in Title II of the federal Civil Rights Act of 1964, 42 U.S.C. § 2000a(e), as amended.
(2)  “Auditor” refers to an independent entity, including an individual, a nonprofit, a firm, a corporation, a partnership, a cooperative, or an association, commissioned to perform an audit.
(3)(A)  “Automated decision system” means a computational process derived from machine learning, statistical modeling, data analytics, or artificial intelligence that issues an output, including a score, classification, or recommendation.
(B)  “Automated decision system” does not include any software used primarily for basic computerized processes, such as antimalware, antivirus, autocorrect functions, calculators, databases, data storage, electronic communications, firewall, internet domain registration, website loading, networking, spam and robocall filtering, spellcheck tools, spreadsheets, web caching, web hosting, or any tool that relates only to nonemployment internal management affairs, such as ordering office supplies or processing payments, and that does not materially affect the rights, liberties, benefits, safety, or welfare of any individual within the State.
(4)  “Consequential decision” means a decision that has a material, legal, or similarly significant effect on the provision or denial to any consumer of, or the cost, terms, or availability of:
(A)  educational and vocational training, including:
(i)  assessment or grading, including detecting student cheating or plagiarism;
(ii)  accreditation;
(iii)  certification;
(iv)  admissions or enrollment; and
(v)  financial aid or scholarships;
(B)  employment or an employment opportunity, including:
(i)  pay or promotion;
(ii)  hiring or termination; and
(iii)  automated task allocation;
(C)  housing or lodging, including long-term or short-term rentals;
(D)  essential utilities, including electricity, heat, water, internet or telecommunications access, or transportation;
(E)  family planning, including adoption services or reproductive services, as well as assessments related to child protection services;
(F)  health care or health insurance, including mental health care, dental, or vision;
(G)  financial services, including a financial service provided by a mortgage company, mortgage broker, or creditor;
(H)  law enforcement activities, including the allocation of law enforcement personnel or assets, the enforcement of laws, maintaining public order, or managing public safety;
(I)  government services, including the determination, allocation, or denial of public benefits and services; and
(J)  a reasonable accommodation or other right granted under the civil rights laws of this State.
(5)  “Consumer” means an individual who is a resident of the State.
(6)  “Deployer” means a person doing business in this State that uses an automated decision system in a consequential decision in the State or provides an automated decision system for use in a consequential decision by the general public in the State.  A developer shall also be considered a deployer if its actions satisfy this definition.
(7)  “Deployer-employer” means a deployer that is an employer.
(8)  “Developer” means a person doing business in this State that designs, codes, or produces an automated decision system for use in a consequential decision or creates a substantial change with respect to an automated decision system for use in a consequential decision, whether for its own use in the State or for use by a third party in the State.
(9)  “Developer-employer” means a developer that is an employer.
(10)  “Employee” means an individual who performs services for and under the control and direction of an employer for wages or other remuneration, including former employees, or natural persons employed as independent contractors to carry out work in furtherance of an employer’s business enterprise who are not themselves employers.
(11)  “Employer” means any person, firm, partnership, institution, corporation, or association that employs one or more employees.
(12)  “Software stack” means the group of individual software components that work together to support the execution of an automated decision system.
(13)  “Substantial change” means any:
(A)  deliberate change to an automated decision system that would result in material inaccuracies in the reports created under section 4193f of this title; or
(B)  substantial change in the data that the automated decision system uses as input or training data.
§ 4193b.  ALGORITHMIC DISCRIMINATION
It shall be unlawful discrimination for a developer or deployer to use, sell, or share an automated decision system for use in a consequential decision, or a product featuring an automated decision system for use in a consequential decision, that produces algorithmic discrimination.
§ 4193c.  DEPLOYER AND DEVELOPER OBLIGATIONS
(a)  Any deployer that employs an automated decision system for a consequential decision shall inform the consumer, prior to the use of the system for a consequential decision, in clear, conspicuous, and consumer-friendly terms, made available in each of the languages in which the company offers its end services, that automated decision systems will be used to make a consequential decision or to assist in making a consequential decision.
(b)  Any notice provided by a deployer to the consumer pursuant to subsection (a) of this section shall include:
(1)  a description of the personal characteristics or attributes that the system will measure or assess;
(2)  the method by which the system measures or assesses those attributes or characteristics;
(3)  how those attributes or characteristics are relevant to the consequential decisions for which the system should be used;
(4)  any human components of the system;
(5)  how any automated components of the system are used to inform the consequential decision; and
(6)  a direct link to a publicly accessible page on the deployer’s website that contains a plain-language description of the:
(A)  system’s outputs;
(B)  types and sources of data collected from natural persons and processed by the system when it is used to make, or assists in making, a consequential decision; and
(C)  results of the most recent impact assessment, or an active link to a web page where a consumer can review those results.
(c)  Any deployer that employs an automated decision system for a consequential decision shall provide the consumer with a single notice containing a plain-language explanation of the decision that identifies the principal reason or reasons for the consequential decision, including:
(1)  the identity of the developer of the automated decision system used in the consequential decision, if the deployer is not also the developer;
(2)  a description of what the output of the automated decision system is, such as a score, recommendation, or other similar description;
(3)  the degree and manner to which the automated decision system contributed to the decision;
(4)  the types and sources of data processed by the automated decision system in making the consequential decision;
(5)  a plain-language explanation of how the consumer’s personal data informed the consequential decision; and
(6)  what actions, if any, the consumer might have taken to secure a different decision and the actions that the consumer might take to secure a different decision in the future.
(d)(1)  A deployer shall provide and explain a process for a consumer to appeal a decision, which shall at minimum allow the consumer to:
(A)  formally contest the decision;
(B)  provide information to support their position; and
(C)  obtain meaningful human review of the decision.
(2)  For an appeal made pursuant to subdivision (1) of this subsection:
(A)  a deployer shall designate a human reviewer who:
(i)  is trained and qualified to understand the consequential decision being appealed and the consequences of the decision for the consumer, and to evaluate the appeal and serve impartially, including by avoiding prejudgment of the facts at issue, conflicts of interest, and bias;
(ii)  does not have a conflict of interest for or against the deployer or the consumer;
(iii)  was not involved in the initial decision being appealed;
(iv)  shall enjoy protection from dismissal or its equivalent, disciplinary measures, or other adverse treatment for exercising their functions under this section; and
(v)  shall be allocated sufficient human resources by the deployer to conduct an effective appeal of the decision; and
(B)  the human reviewer shall consider the information provided by the consumer in their appeal and may consider other sources of information relevant to the consequential decision.
(3)  A deployer shall respond to a consumer’s appeal not later than 45 days after receipt of the appeal.  That period may be extended once by an additional 45 days where reasonably necessary, taking into account the complexity and number of appeals.  The deployer shall inform the consumer of any extension not later than 45 days after receipt of the appeal, together with the reasons for the delay.
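As a minimal sketch of the appeal timetable in subdivision (3) above, assuming an implementer tracks dates with Python's standard library (the function name and return shape are invented for illustration; the statutory text controls):

```python
from datetime import date, timedelta

def appeal_deadlines(received: date, extended: bool = False) -> dict:
    # Day 45 after receipt is the deadline both for responding and for
    # announcing the single permitted 45-day extension.
    base = received + timedelta(days=45)
    return {
        "extension_notice_due": base,
        "response_due": base + timedelta(days=45) if extended else base,
    }

appeal_deadlines(date(2026, 1, 1))                  # response due 2026-02-15
appeal_deadlines(date(2026, 1, 1), extended=True)   # response due 2026-04-01
```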
(e)  The deployer or developer of an automated decision system is legally responsible for the quality and accuracy of all consequential decisions made, including any bias or algorithmic discrimination resulting from the operation of the automated decision system.
(f)  A developer shall not use, sell, or share an automated decision system for use in a consequential decision, or a product featuring an automated decision system for use in a consequential decision, that has not passed an independent audit in accordance with section 4193e of this title.  If an independent audit finds that an automated decision system for use in a consequential decision does produce algorithmic discrimination, the developer shall not use, sell, or share the system until the algorithmic discrimination has been proven to be rectified by a post-adjustment audit.
(g)  Except as provided in subsection 4193e(a) of this title, the rights and obligations under this section may not be waived by any person, partnership, association, or corporation.
§ 4193d.  WHISTLEBLOWER PROTECTIONS
(a)  Developer-employers and deployer-employers of automated decision systems used in consequential decisions shall not:
(1)  prevent an employee from disclosing information to the Attorney General, including through terms and conditions of employment or by seeking to enforce terms and conditions of employment, if the employee has reasonable cause to believe the information indicates a violation of this subchapter; or
(2)  retaliate against an employee for disclosing information to the Attorney General pursuant to subdivision (1) of this subsection.
(b)  Developer-employers and deployer-employers of automated decision systems used in consequential decisions shall provide a clear notice to all employees working on automated decision systems of their rights and responsibilities under this subchapter, including the right of employees of contractors and subcontractors to use the developer’s internal process for making protected disclosures pursuant to subsection (c) of this section.  A developer-employer or deployer-employer is presumed to be in compliance with the requirements of this subsection if the developer-employer or deployer-employer does either of the following:
(1)  at all times:
(A)  posts and displays within all workplaces maintained by the developer-employer or deployer-employer a notice to all employees of their rights and responsibilities under this subchapter;
(B)  ensures that all new employees receive equivalent notice; and
(C)  ensures that employees who work remotely periodically receive an equivalent notice; or
(2)  not less frequently than once every year, provides written notice to all employees of their rights and responsibilities under this subchapter and ensures that the notice is received and acknowledged by all of those employees.
(c)  Each developer-employer shall provide a reasonable internal process through which an employee may anonymously disclose information to the developer if the employee believes in good faith that the information indicates that the developer has violated any provision of this subchapter or any other law, has made false or materially misleading statements related to its safety and security protocol, or has failed to disclose known risks to employees.  The process shall include, at a minimum, a monthly update to the person who made the disclosure regarding the status of the developer’s investigation of the disclosure and the actions taken by the developer in response to the disclosure.
§ 4193e.  AUDITS
(a)  Prior to deployment of an automated decision system for use in a consequential decision, six months after deployment, and, following that first post-deployment audit, at least every 18 months for as long as the system is in use in consequential decisions, the developer and deployer shall be jointly responsible for ensuring that an independent audit is conducted in compliance with the provisions of this section to ensure that the product does not produce algorithmic discrimination and complies with the provisions of this subchapter.  The developer and deployer shall enter into a contract specifying which party is responsible for the costs, oversight, and results of the audit.  Absent an agreement of responsibility through contract, the developer and deployer shall be jointly and severally liable for any violations of this section.  Regardless of final findings, the deployer or developer shall deliver all audits conducted under this section to the Attorney General.
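As a minimal sketch of the audit cadence in subsection (a), assuming an implementer counts in whole months from deployment (the function, the negative marker for the pre-deployment audit, and the horizon parameter are invented for illustration):

```python
def audit_offsets(months_in_use: int) -> list:
    # One audit before deployment (marked -1), one six months after,
    # then at least one every 18 months while the system stays in use.
    offsets = [-1, 6]
    t = 6
    while t + 18 <= months_in_use:
        t += 18
        offsets.append(t)
    return offsets

audit_offsets(48)  # → [-1, 6, 24, 42]
```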
(b)  A deployer or developer may contract with more than one auditor to fulfill the requirements of this section.
(c)  The audit shall include the following:
(1)  an analysis of data management policies, including whether personal or sensitive data relating to a consumer is subject to data security protection standards that comply with the requirements of applicable State law;
(2)  an analysis of the system’s validity and reliability according to each specified use case listed in the entity’s reporting document filed by the developer or deployer pursuant to section 4193f of this title;
(3)  a comparative analysis of the system’s performance when used on consumers of different demographic groups and a determination of whether the system produces algorithmic discrimination in violation of this subchapter for each intended and foreseeable use identified by the deployer and developer pursuant to section 4193f of this title;
(4)  an analysis of how the technology complies with existing relevant federal, State, and local labor, civil rights, consumer protection, privacy, and data privacy laws; and
(5)  an evaluation of the developer’s or deployer’s documented risk management policy and program as set forth in section 4193g of this title for conformity with subsection 4193g(a) of this title.
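The comparative analysis in subdivision (c)(3) is not prescribed in detail.  One minimal sketch of what an auditor might compute, assuming Python and a four-fifths-style selection-rate screen (the 0.8 threshold is a convention drawn from U.S. employment-selection guidance, not a requirement of this chapter; the group labels and data are toy values):

```python
from collections import defaultdict

def disparity_ratios(outcomes):
    """outcomes: iterable of (group, favorable) pairs; returns each group's
    favorable-outcome rate divided by the most-favored group's rate."""
    counts = defaultdict(lambda: [0, 0])  # group -> [favorable, total]
    for group, favorable in outcomes:
        counts[group][0] += int(favorable)
        counts[group][1] += 1
    rates = {g: fav / tot for g, (fav, tot) in counts.items()}
    best = max(rates.values())
    return {g: r / best for g, r in rates.items()}

# Toy data: group A favored in 8 of 10 outcomes, group B in 5 of 10.
data = ([("A", True)] * 8 + [("A", False)] * 2
        + [("B", True)] * 5 + [("B", False)] * 5)
ratios = disparity_ratios(data)
flagged = {g: r < 0.8 for g, r in ratios.items()}  # four-fifths-style screen
```

A ratio well below 1.0 for a group would be a starting point for the auditor's determination, not a conclusion by itself.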
(d)  The Attorney General may adopt further rules as necessary to ensure that audits under this section assess whether or not automated decision systems used in consequential decisions produce algorithmic discrimination and otherwise comply with the provisions of this subchapter.
(e)  The independent auditor shall have complete and unredacted copies of all reports previously filed by the deployer or developer pursuant to section 4193f of this title.
(f)  An audit conducted under this section shall be completed in its entirety without the assistance of an automated decision system.
(g)(1)  An auditor shall be an independent entity, including an individual, nonprofit, firm, corporation, partnership, cooperative, or association.
(2)  For the purposes of this subchapter, no auditor may be commissioned by a developer or deployer of an automated decision system used in consequential decisions if the auditor:
(A)  has already been commissioned to provide any auditing or nonauditing service, including financial auditing, cybersecurity auditing, or consulting services of any type, to the commissioning company in the past 12 months;
(B)  is or was involved in using, developing, integrating, offering, licensing, or deploying the automated decision system;
(C)  has or had an employment relationship with a developer or deployer that uses, offers, or licenses the automated decision system; or
(D)  has or had a direct financial interest or a material indirect financial interest in a developer or deployer that uses, offers, or licenses the automated decision system.
(3)  Fees paid to auditors may not be contingent on the result of the audit, and the commissioning company shall not provide any incentives or bonuses for a positive audit result.
(h)  The Attorney General may adopt rules to ensure:
(1)  the independence of auditors under this section;
(2)  that teams conducting audits incorporate feedback from communities that may foreseeably be the subject of algorithmic discrimination with respect to the automated decision system being audited; and
(3)  that the requirements of an audit as set forth in subsection (c) of this section are updated to reflect responsible evaluation practices and include adequate information to enforce this subchapter.
§ 4193f.  AUTOMATED DECISION SYSTEM REPORTING REQUIREMENTS
(a)  Every developer and deployer of an automated decision system used in a consequential decision shall comply with the reporting requirements of this section.  Regardless of final findings, reports shall be filed with the Attorney General prior to deployment of an automated decision system used in a consequential decision and then annually, or after each substantial change to the system, whichever comes first.
(b)  Together with each report required to be filed under this section, developers and deployers shall file with the Attorney General a copy of the last completed independent audit required by this subchapter and a legal attestation that the automated decision system used in a consequential decision:
(1)  does not violate any provision of this subchapter; or
(2)  may violate or does violate one or more provisions of this subchapter and that there is a plan of remediation to bring the automated decision system into compliance with this subchapter, together with a summary of the plan of remediation.
(c)  Developers of automated decision systems shall file with the Attorney General a report containing the following:
(1)  a description of the system, including:
(A)  a description of the system’s software stack;
(B)  the purpose of the system and its expected benefits; and
(C)  the system’s current and intended uses, including what consequential decisions it will support and what stakeholders will be impacted;
(2)  the intended outputs of the system and whether the outputs can be or are otherwise appropriate to be used for any purpose not previously articulated;
(3)  the methods for training of their models, including:
(A)  any pre-processing steps taken to prepare datasets for the training of a model underlying an automated decision system;
(B)  descriptions of the datasets upon which models were trained and evaluated, how and why datasets were collected and the sources of those datasets, and how that training data will be used and maintained;
(C)  the quality and appropriateness of the data used in the automated decision system’s design, development, testing, and operation;
(D)  whether the data contains sufficient breadth to address the range of real-world inputs the automated decision system might encounter and how any data gaps have been addressed; and
(E)  steps taken to ensure compliance with privacy, data privacy, data security, and copyright laws;
(4)  use and data management policies;
(5)  any other information necessary to allow the deployer to understand the outputs and monitor the system for compliance with this subchapter;
(6)  any other information necessary to allow the deployer to comply with the requirements of subsection (d) of this section;
(7)  a description of the system’s capabilities and any developer-imposed limitations, including capabilities outside of its intended use, when the system should not be used, any safeguards or guardrails in place to protect against unintended, inappropriate, or disallowed uses, and testing of any safeguards or guardrails;
(8)  an internal risk assessment, including documentation and results of testing conducted to identify all reasonably foreseeable risks related to algorithmic discrimination, validity and reliability, privacy and autonomy, and safety and security, as well as actions taken to address those risks, and subsequent testing to assess the efficacy of actions taken to address risks; and
(9)  whether the system should be monitored and, if so, how the system should be monitored.
(d)  Deployers of automated decision systems used in consequential decisions shall file with the Attorney General a report containing the following:
(1)  a description of the system, including:
(A)  a description of the system’s software stack;
(B)  the purpose of the system and its expected benefits; and
(C)  the system’s current and intended uses, including what consequential decisions it will support and what stakeholders will be impacted;
(2)  the intended outputs of the system and whether the outputs can be or are otherwise appropriate to be used for any purpose not previously articulated;
(3)  whether the deployer collects revenue or plans to collect revenue from use of the automated decision system in a consequential decision and, if so, how it monetizes or plans to monetize use of the system;
(4)  whether the system is designed to make consequential decisions itself or whether and how it supports consequential decisions;
(5)  a description of the system’s capabilities and any deployer-imposed limitations, including capabilities outside of its intended use, when the system should not be used, any safeguards or guardrails in place to protect against unintended, inappropriate, or disallowed uses, and testing of any safeguards or guardrails;
(6)  an assessment of the relative benefits and costs to the consumer given the system’s purpose, capabilities, and probable use cases;
(7)  an internal risk assessment, including documentation and results of testing conducted to identify all reasonably foreseeable risks related to algorithmic discrimination, accuracy and reliability, privacy and autonomy, and safety and security, as well as actions taken to address those risks, and subsequent testing to assess the efficacy of actions taken to address risks; and
(8)  whether the system should be monitored and, if so, how the system should be monitored.
(e)  The Attorney General shall:
(1)  adopt rules:
(A)  for a process whereby developers and deployers may request redaction of portions of reports required under this section to ensure that they are not required to disclose sensitive and protected information; and
(B)  to determine reasonably foreseeable risks related to algorithmic discrimination, validity and reliability, privacy and autonomy, and safety and security, pursuant to subsections (c) and (d) of this section; and
(2)  maintain an online database that is accessible to the general public with reports, redacted in accordance with this section, and audits required by this subchapter, which shall be updated biannually.
(f)  For automated decision systems already in deployment for use in consequential decisions on or before July 1, 2025, developers and deployers shall, not later than 18 months after July 1, 2025, complete and file the reports and complete the independent audit required by this subchapter.
§ 4193g.  RISK MANAGEMENT POLICY AND PROGRAM
(a)  Each developer or deployer of automated decision systems used in consequential decisions shall plan, document, and implement a risk management policy and program to govern development or deployment, as applicable, of the automated decision system.  The risk management policy and program shall specify and incorporate the principles, processes, and personnel that the developer or deployer uses to identify, document, and mitigate known or reasonably foreseeable risks of algorithmic discrimination covered under section 4193b of this title.  The risk management policy and program shall be an iterative process that is planned, implemented, and regularly and systematically reviewed and updated over the life cycle of the automated decision system, including updates to documentation.  A risk management policy and program implemented and maintained pursuant to this subsection shall be reasonable considering the:
(1)  guidance and standards set forth in version 1.0 of the Artificial Intelligence Risk Management Framework published by the National Institute of Standards and Technology in the U.S. Department of Commerce or, if in the Attorney General’s discretion it is at least as stringent as version 1.0, the latest version of that Framework;
(2)  size and complexity of the developer or deployer;
(3)  nature, scope, and intended uses of the automated decision system developed or deployed for use in consequential decisions; and
(4)  sensitivity and volume of data processed in connection with the automated decision system.
(b)  A risk management policy and program implemented pursuant to subsection (a) of this section may cover multiple automated decision systems developed by the same developer or deployed by the same deployer for use in consequential decisions if sufficient.
(c)  The Attorney General may require a developer or a deployer to disclose the risk management policy and program implemented pursuant to subsection (a) of this section in a form and manner prescribed by the Attorney General.  The Attorney General may evaluate the risk management policy and program to ensure compliance with this section.
§ 4193h.  ENFORCEMENT AND RULEMAKING
(a)  A person who violates this subchapter or rules adopted pursuant to this subchapter commits an unfair and deceptive act in commerce in violation of section 2453 of this title (Vermont Consumer Protection Act).  A consumer harmed by a violation is entitled to all remedies provided under the Vermont Consumer Protection Act.
(b)  The Attorney General has the same authority to adopt rules to implement the provisions of this section and to conduct civil investigations, enter into assurances of discontinuance, bring civil actions, and take other enforcement actions as provided under chapter 63, subchapter 1 of this title.
Sec. 2.  EFFECTIVE DATE
This act shall take effect on July 1, 2025.