HOUSE DOCKET, NO. 2261        FILED ON: 1/19/2023

HOUSE . . . . . . . . . . . . . . . No. 64

The Commonwealth of Massachusetts
_________________

PRESENTED BY:

Sean Garballey and Simon Cataldo
_________________

To the Honorable Senate and House of Representatives of the Commonwealth of Massachusetts in General Court assembled:

The undersigned legislators and/or citizens respectfully petition for the adoption of the accompanying bill:

An Act establishing a commission on automated decision-making by government in the Commonwealth.
_______________

PETITION OF:

NAME:                    DISTRICT/ADDRESS:          DATE ADDED:
Sean Garballey           23rd Middlesex             1/19/2023
Simon Cataldo            14th Middlesex             1/19/2023
Vanna Howard             17th Middlesex             1/31/2023
James B. Eldridge        Middlesex and Worcester    3/13/2023

HOUSE DOCKET, NO. 2261        FILED ON: 1/19/2023

HOUSE . . . . . . . . . . . . . . . No. 64

By Representatives Garballey of Arlington and Cataldo of Concord, a petition (accompanied by bill, House, No. 64) of Sean Garballey, Simon Cataldo and Vanna Howard for legislation to establish a commission (including members of the General Court) relative to state agency automated decision-making, artificial intelligence, transparency, fairness, and individual rights. Advanced Information Technology, the Internet and Cybersecurity.

[SIMILAR MATTER FILED IN PREVIOUS SESSION
SEE HOUSE, NO. 4512 OF 2021-2022.]

The Commonwealth of Massachusetts
_______________

In the One Hundred and Ninety-Third General Court
(2023-2024)
_______________

An Act establishing a commission on automated decision-making by government in the Commonwealth.

Be it enacted by the Senate and House of Representatives in General Court assembled, and by the authority of the same, as follows:

SECTION 1. Chapter 7D of the General Laws, as amended by chapter 64 of the acts of 2017, is hereby further amended by inserting after section 10 the following new section:-

Section 11. (a) As used in this section, the following words shall have the following meanings unless the context clearly requires otherwise:

“Algorithm”, a specific procedure, set of rules, or order of operations designed to solve a problem or make a calculation, classification, or recommendation.

“Artificial intelligence”, computerized methods and tools, including but not limited to machine learning and natural language processing, that act in a way that resembles human cognitive abilities when it comes to solving problems or performing certain tasks.

“Automated decision system”, any computer program, method, statistical model, or process that aims to aid or replace human decision-making using algorithms or artificial intelligence. These systems can include analyzing complex datasets about human populations and government services or other activities to generate scores, predictions, classifications, or recommendations used by agencies to make decisions that impact human welfare.

“Commonwealth of Massachusetts” or “Massachusetts office”, any agency, constitutional office, department, board, commission, bureau, division or authority of the commonwealth, or of any political subdivision thereof, or of any authority established by the general court to serve a public purpose.

“Identified group characteristic”, age, race, creed, color, religion, national origin, gender, disability, sexual orientation, marital status, veteran status, receipt of public assistance, economic status, location of residence, or citizenship status.
“Source code”, the structure of a computer program that can be read and understood by people.

“Training data”, the data used to inform the development of an automated decision system and the decisions or recommendations it generates.

(b) There shall be a commission within the executive office of technology services and security for the purpose of studying and making recommendations relative to the use by the commonwealth of automated decision systems that may affect human welfare, including but not limited to the legal rights and privileges of individuals. The commission shall evaluate government use of automated decision systems in the commonwealth and make recommendations to the legislature regarding appropriate regulations, limits, standards and safeguards. The commission shall:

(i) undertake a complete and specific survey of all uses of automated decision systems by the commonwealth of Massachusetts and the purposes for which such systems are used, including but not limited to:

(a) the principles, policies, and guidelines adopted by specific Massachusetts offices to inform the procurement, evaluation, and use of automated decision systems, and the procedures by which such principles, policies, and guidelines are adopted;

(b) the training specific Massachusetts offices provide to individuals using automated decision systems, and the procedures for enforcing the principles, policies, and guidelines regarding their use;

(c) the manner by which Massachusetts offices validate and test the automated decision systems they use, and the manner by which they evaluate those systems on an ongoing basis, specifying the training data, input data, systems analysis, studies, vendor or community engagement, third parties, or other methods used in such validation, testing, and evaluation;

(d) matters related to the transparency, explicability, auditability, and accountability of automated decision systems in use in Massachusetts offices, including information about their structure; the processes guiding their procurement, implementation and review; whether they can be audited externally and independently; and the people who operate such systems and the training they receive;

(e) the manner and extent to which Massachusetts offices make the automated decision systems they use available to external review, and any existing policies, laws, procedures, or guidelines that may limit external access to data or technical information that is necessary for audits, evaluation, or validation of such systems; and

(f) procedures and policies in place to protect the due process rights of individuals directly affected by Massachusetts offices’ use of automated decision systems, including but not limited to public disclosure and transparency procedures;

(ii) consult with experts in the fields of machine learning, algorithmic bias, algorithmic auditing, and civil and human rights;

(iii) examine research related to the use of automated decision systems that directly or indirectly result in disparate outcomes for individuals or communities based on an identified group characteristic;

(iv) conduct a survey of technical, legal, or policy controls to improve the just and equitable use of automated decision systems and mitigate any disparate impacts deriving from their use, including best practices, policy tools, laws, and regulations developed through research and academia or proposed or implemented in other states and jurisdictions;
(v) examine matters related to data sources, data sharing agreements, data security provisions, compliance with data protection laws and regulations, and all other issues related to how data is protected, used, and shared by agencies using automated decision systems, in Massachusetts and in other jurisdictions;

(vi) examine matters related to automated decision systems and intellectual property, such as the existence of non-disclosure agreements, trade secrets claims, and other proprietary interests, and the impacts of intellectual property considerations on transparency, explicability, auditability, accountability, and due process; and

(vii) examine any other opportunities and risks associated with the use of automated decision systems by Massachusetts offices.

(c) The commission shall consist of the secretary of technology services and security or the secretary’s designee, who shall serve as chair; 1 member of the senate, designated by the senate president; 1 member of the house of representatives, designated by the speaker of the house of representatives; the house and senate chairs of the joint committee on state administration and regulatory oversight; the chief justice of the supreme judicial court or a designee; the attorney general or a designee; the state auditor or a designee; the inspector general or a designee; the secretaries of the Executive Office of Public Safety and Security and the Executive Office of Health and Human Services, or their designees; the commissioner of the Department of Children and Families, or their designee; the chief counsel of the committee for public counsel services or a designee; the chief legal counsel of the Massachusetts Bar Association or a designee; the executive director of the American Civil Liberties Union of Massachusetts or a designee; 6 representatives from academic institutions in the Commonwealth who shall be experts in (i) artificial intelligence and machine learning, (ii) data science and information policy, (iii) social implications of artificial intelligence and technology, or (iv) technology and the law, 3 to be appointed by the house chair and 3 to be appointed by the senate chair of the joint committee on advanced information technology and cybersecurity; the executive director of the Massachusetts Law Reform Institute or a designee; 1 representative from the National Association of Social Workers; 1 representative from the NAACP; 5 representatives from the Massachusetts Technology Collaborative; and 1 representative from the Massachusetts High Technology Council.

(d) Members of the commission shall be appointed within 45 days of the effective date of this act. The commission shall meet at the call of the chair based on the commission’s workload but not fewer than 10 times per calendar year. The commission shall hold at least one public hearing to solicit feedback from Massachusetts residents and other interested parties. The commission’s meetings shall be broadcast over the internet.

(e) The commission shall submit an annual report by December 31 to the governor, the clerks of the house of representatives and the senate, and the joint committee on advanced information technology and cybersecurity.
The report shall be a public record and shall include, but not be limited to:

(i) a description of the commission’s activities and any community engagement undertaken by the commission;

(ii) the commission’s findings, including but not limited to the publication of a list of all automated decision systems in use in Massachusetts offices, the policies, procedures, and training guidelines in place to govern their use, and any contracts with third parties pertaining to the acquisition or deployment of such systems; and

(iii) any recommendations for regulatory or legislative action, including but not limited to the following:

(a) recommendations about areas where Massachusetts offices ought not to use automated decision systems;

(b) recommendations about whether and how existing state laws, regulations, programs, policies, and practices related to the use of automated decision systems should be amended to promote racial and economic justice, equity, fairness, accountability, and transparency;

(c) recommendations for the development and implementation of policies and procedures that may be used by the state for the following purposes:

(i) to allow a person affected by a rule, policy, or action made by, or with the assistance of, an automated decision system, to request and receive an explanation of such rule, policy, or action and the basis therefor;

(ii) to determine whether an automated decision system disproportionately or unfairly impacts a person or group based on an identified group characteristic;

(iii) to determine prior to or during the procurement or acquisition process whether a proposed agency automated decision system is likely to disproportionately or unfairly impact a person or group based on an identified group characteristic;

(iv) to address instances in which a person or group is harmed by an agency automated decision system if any such system is found to disproportionately impact a person or group on the basis of an identified group characteristic; and

(v) to make information publicly available that, for each automated decision system, will allow the public to meaningfully assess how such system functions and is used by the state, including making technical information about such system publicly available.