New York 2023-2024 Regular Session

New York Assembly Bill A09430 Latest Draft

Bill / Amended Version, filed 03/14/2024

   
STATE OF NEW YORK

9430--A

IN ASSEMBLY

March 14, 2024

Introduced by M. of A. OTIS, SANTABARBARA -- read once and referred to the Committee on Science and Technology -- committee discharged, bill amended, ordered reprinted as amended and recommitted to said committee

AN ACT to amend the state technology law, in relation to automated decision-making by state agencies

The People of the State of New York, represented in Senate and Assembly, do enact as follows:

Section 1. Short title. This act shall be known and may be cited as the "legislative oversight of automated decision-making in government act (LOADinG Act)".

§ 2. The state technology law is amended by adding a new article 4 to read as follows:

ARTICLE IV
AUTOMATED DECISION-MAKING IN STATE GOVERNMENT

Section 401. Definitions.
        402. Use of automated decision-making systems by agencies.
        403. Impact assessments.
        404. Submission to the governor and legislature.

§ 401. Definitions. For the purpose of this article:

1. "Automated decision-making system" shall mean any software that uses algorithms, computational models, or artificial intelligence techniques, or a combination thereof, to automate, support, or replace human decision-making and shall include, without limitation, systems that process data, and apply predefined rules or machine learning algorithms to analyze such data, and generate conclusions, recommendations, outcomes, assumptions, projections, or predictions without meaningful human review and discretion. "Automated decision-making system" shall not include any software used primarily for basic computerized processes, such as calculators, spellcheck tools, autocorrect functions, spreadsheets, electronic communications, or any tool that relates only to internal management affairs such as ordering office supplies or processing payments, and that do not materially affect the rights, liberties, benefits, safety or welfare of any individual within the state.

2. "Meaningful human review" means review or oversight by one or more individuals who understand the risks and shortcomings of, and are trained to use the automated decision-making system and who have the authority to alter the decision under review.

3. "State agency" shall mean any department, public authority, board, bureau, commission, division, office, council, committee or officer of the state. Such terms shall not include the legislature or judiciary.

4. "Public assistance benefit" shall mean any service or program within the control of the state, or benefit provided by the state to individuals or households, including but not limited to public assistance, cash assistance, grants, child care assistance, housing assistance, unemployment benefits, transportation benefits, education assistance, domestic violence services, and any other assistance or benefit within the authority of the state to grant to individuals within the state. This shall not include any federal program that is administered by the federal government or the state.

§ 402. Use of automated decision-making systems by agencies. 1. No state agency, or any entity acting on behalf of such agency, which utilizes or applies any automated decision-making system, directly or indirectly, in performing any function that: (a) is related to the delivery of any public assistance benefit; (b) will have a material impact on the rights, civil liberties, safety or welfare of any individual within the state; or (c) affects any statutorily or constitutionally provided right of an individual, shall utilize such automated decision-making system, unless such automated decision-making system is subject to meaningful human review.

2. No state agency shall authorize any procurement, purchase or acquisition of any service or system utilizing, or relying on, automated decision-making systems in performing any function that is: (a) related to the delivery of any public assistance benefit; (b) will have a material impact on the rights, civil liberties, safety or welfare of any individual within the state; or (c) affects any statutorily or constitutionally provided right of an individual unless such automated decision-making system is subject to meaningful human review.

§ 403. Impact assessments. 1. State agencies seeking to utilize or apply an automated decision-making system shall conduct or have conducted an impact assessment for the application and use of such automated decision-making system. Following the first impact assessment, an impact assessment shall be conducted at least once every two years. An impact assessment shall be conducted prior to any material change to the automated decision-making system that may change the outcome or effect of such system. Such impact assessments shall include:

(a) a description of the objectives of the automated decision-making system;

(b) an evaluation of the ability of the automated decision-making system to achieve its stated objectives;

(c) a description and evaluation of the objectives and development of the automated decision-making including:
(i) a summary of the underlying algorithms, computational modes, and artificial intelligence tools that are used within the automated decision-making system; and
(ii) the design and training data used to develop the automated decision-making system process;

(d) testing for:
(i) accuracy, fairness, bias and discrimination, and an assessment of whether the use of the automated decision-making system produces discriminatory results on the basis of a consumer's or a class of consumers' actual or perceived race, color, ethnicity, religion, national origin, sex, gender, gender identity, sexual orientation, familial status, biometric information, lawful source of income, or disability and outlines mitigations for any identified performance differences in outcomes across relevant groups impacted by such use;
(ii) any cybersecurity vulnerabilities and privacy risks resulting from the deployment and use of the automated decision-making system, and the development or existence of safeguards to mitigate the risks;
(iii) any public health or safety risks resulting from the deployment and use of the automated decision-making system;
(iv) any reasonably foreseeable misuse of the automated decision-making system and the development or existence of safeguards against such misuse;

(e) the extent to which the deployment and use of the automated decision-making system requires input of sensitive and personal data, how that data is used and stored, and any control users may have over their data; and

(f) the notification mechanism or procedure, if any, by which individuals impacted by the utilization of the automated decision-making system may be notified of the use of such automated decision-making system and of the individual's personal data, and informed of their rights and options relating to such use.

2. Notwithstanding the provisions of this article or any other law, if an impact assessment finds that the automated decision-making system produces discriminatory or biased outcomes, the state agency shall cease any utilization, application, or function of such automated decision-making system, and of any information produced using such system.

§ 404. Submission to the governor and legislature. 1. Each impact assessment conducted pursuant to this article shall be submitted to the governor, the temporary president of the senate, and the speaker of the assembly at least thirty days prior to the implementation of the automated decision-making system that is the subject of such assessment.

2. The impact assessment of an automated decision-making system shall be published on the website of the relevant agency. If the state agency makes a determination that the disclosure of any information required in the impact assessment would result in a substantial negative impact on health or safety of the public, infringe upon the privacy rights of individuals, or significantly impair the state agency's ability to protect its information technology or operational assets, it may redact such information, provided that an explanatory statement on the process by which the state agency made such determination is published along with the redacted impact assessment.

§ 3. Disclosure of existing automated decision-making systems. Any state agency, that directly or indirectly, utilizes an automated decision-making system, as defined in section 401 of the state technology law, shall submit to the legislature a disclosure on the use of such system, no later than one year after the effective date of this section. Such disclosure shall include:

(a) a description of the automated decision-making system utilized by such agency;
(b) a list of any software vendors related to such automated decision-making system;
(c) the date that the use of such system began;
(d) a summary of the purpose and use of such system, including a description of human decision-making and discretion supported or replaced by the automated decision-making system;
(e) whether any impact assessments for the automated decision-making system were conducted and the dates and summaries of the results of such assessments where applicable; and
(f) any other information deemed relevant by the agency.

§ 4. This act shall take effect immediately, provided that section two of this act shall take effect one year after it shall have become a law.