Massachusetts 2023-2024 Regular Session

Massachusetts House Bill H4024
HOUSE . . . . . . . . No. 4024

The Commonwealth of Massachusetts
________________________________________
HOUSE OF REPRESENTATIVES, August 3, 2023.

The committee on Advanced Information Technology, the Internet and Cybersecurity to whom was referred the petition (accompanied by bill, Senate, No. 33) of Jason M. Lewis and Michael O. Moore for legislation to establish a commission on automated decision-making by government in the commonwealth; and the petition (accompanied by bill, House, No. 64) of Sean Garballey, Simon Cataldo and Vanna Howard for legislation to establish a commission (including members of the General Court) relative to state agency automated decision-making, artificial intelligence, transparency, fairness, and individual rights, reports recommending that the accompanying bill (House, No. 4024) ought to pass.

For the committee,
TRICIA FARLEY-BOUVIER.

FILED ON: 7/31/2023

HOUSE . . . . . . . . . . . . . . . No. 4024

The Commonwealth of Massachusetts
_______________
In the One Hundred and Ninety-Third General Court
(2023-2024)
_______________

An Act establishing a commission on automated decision-making by government in the Commonwealth.

Be it enacted by the Senate and House of Representatives in General Court assembled, and by the authority of the same, as follows:

SECTION 1. (a) As used in this section, the following words shall, unless the context clearly requires otherwise, have the following meanings:

“Algorithm”, a specific procedure, set of rules, or order of operations designed to solve a problem or make a calculation, classification, or recommendation.

“Artificial intelligence”, computerized methods and tools, including but not limited to machine learning and natural language processing, that act in a way that resembles human cognitive abilities when it comes to solving problems or performing certain tasks.

“Automated decision system”, any computer program, method, statistical model, or process that aims to aid or replace human decision-making using algorithms or artificial intelligence. These systems can include analyzing complex datasets about human populations and government services or other activities to generate scores, predictions, classifications, or recommendations used by agencies to make decisions that impact human welfare.

“Executive agency”, a state agency within the office of the governor.

“Identified group characteristic”, age, race, creed, color, religion, national origin, gender, disability, sexual orientation, marital status, veteran status, receipt of public assistance, economic status, location of residence, or citizenship status.

“Source code”, the structure of a computer program that can be read and understood by people.

“Training data”, the data used to inform the development of an automated decision system and the decisions or recommendations it generates.

(b) Notwithstanding any special or general law to the contrary, there shall be a special legislative commission established pursuant to section 2A of chapter 4 of the General Laws to conduct a study on the use of automated decision systems by executive agencies.

The commission shall consist of 11 members: 2 of whom shall be the chairs of the joint committee on advanced information technology, the internet and cybersecurity, who shall serve as co-chairs; 1 of whom shall be appointed by the speaker of the house of representatives; 1 of whom shall be appointed by the president of the senate; 1 of whom shall be the secretary of the executive office of technology services and security, or a designee; 1 of whom shall be the attorney general or a designee; 1 of whom shall be the executive director of the American Civil Liberties Union of Massachusetts or a designee; 2 of whom shall be appointed by the Governor and shall work at academic institutions in the Commonwealth in the field of (i) artificial intelligence and machine learning, (ii) data science and information policy, (iii) social implications of artificial intelligence and technology, or (iv) technology and the law; 1 of whom shall be a member of the Massachusetts High Technology Council; and 1 of whom shall be a member of the Massachusetts Technology Collaborative.

(c) The commission shall study the use of automated decision systems by executive agencies and make recommendations to the legislature regarding appropriate regulations, limits, standards, and safeguards. The commission shall:

(i) survey the current use of automated decision systems by executive agencies and the purposes for which such systems are used, including but not limited to:

(A) the principles, policies, and guidelines adopted by executive agencies to inform the procurement, evaluation, and use of automated decision systems, and the procedures by which such principles, policies, and guidelines are adopted;

(B) the training executive agencies provide to individuals using automated decision systems, and the procedures for enforcing the principles, policies, and guidelines regarding their use;

(C) the manner by which executive agencies validate and test the automated decision systems they use, and the manner by which they evaluate those systems on an ongoing basis, specifying the training data, input data, systems analysis, studies, vendor or community engagement, third parties, or other methods used in such validation, testing, and evaluation;

(D) the manner and extent to which executive agencies make the automated decision systems they use available to external review, and any existing policies, laws, procedures, or guidelines that may limit external access to data or technical information that is necessary for audits, evaluation, or validation of such systems; and

(E) procedures and policies in place to protect the due process rights of individuals directly affected by the use of automated decision systems;

(ii) consult with experts in the fields of machine learning, algorithmic bias, algorithmic auditing, and civil and human rights;

(iii) examine research related to the use of automated decision systems that directly or indirectly result in disparate outcomes for individuals or communities based on an identified group characteristic;

(iv) conduct a survey of technical, legal, or policy controls to improve the just and equitable use of automated decision systems and mitigate any disparate impacts deriving from their use, including best practices, policy tools, laws, and regulations developed through research and academia or proposed or implemented in other states and jurisdictions;

(v) examine matters related to data sources, data sharing agreements, data security provisions, compliance with data protection laws and regulations, and all other issues related to how data is protected, used, and shared by executive agencies using automated decision systems;

(vi) examine any other opportunities and risks associated with the use of automated decision systems;

(vii) evaluate evidence-based best practices for the use of automated decision systems;

(viii) make recommendations for regulatory or legislative action, if any;

(ix) make recommendations about if and how existing state laws, regulations, programs, policies, and practices related to the use of automated decision systems should be amended to promote racial and economic justice, equity, fairness, accountability, and transparency;

(x) make recommendations for the development and implementation of policies and procedures that may be used by the state for the following purposes:

(A) to allow a person affected by a rule, policy, or action made by, or with the assistance of, an automated decision system, to request and receive an explanation of such rule, policy, or action and the basis therefor;

(B) to determine whether an automated decision system disproportionately or unfairly impacts a person or group based on an identified group characteristic;

(C) to determine prior to or during the procurement or acquisition process whether a proposed agency automated decision system is likely to disproportionately or unfairly impact a person or group based on an identified group characteristic;

(D) to address instances in which a person or group is harmed by an agency automated decision system if any such system is found to disproportionately impact a person or group on the basis of an identified group characteristic.

(d) The commission shall submit its report and recommendations, including any proposed legislation, to the governor, the clerks of the house of representatives and the senate, and the joint committee on advanced information technology, the internet and cybersecurity on or before December 31, 2023.