Alaska 2023-2024 Regular Session

Alaska House Bill HB306 Compare Versions

Only one version of the bill is available at this time.
New Text Underlined [DELETED TEXT BRACKETED]
33-LS1250\A

HOUSE BILL NO. 306

IN THE LEGISLATURE OF THE STATE OF ALASKA

THIRTY-THIRD LEGISLATURE - SECOND SESSION

BY THE HOUSE STATE AFFAIRS COMMITTEE BY REQUEST

Introduced: 2/2/24
Referred: State Affairs, Judiciary

A BILL

FOR AN ACT ENTITLED
2727 "An Act relating to artificial intelligence; requiring disclosure of deepfakes in campaign 1
2828 communications; relating to cybersecurity; and relating to data privacy." 2
2929 BE IT ENACTED BY THE LEGISLATURE OF THE STATE OF ALASKA: 3
3030 * Section 1. AS 15.13 is amended by adding a new section to read: 4
Sec. 15.13.093. Deepfake disclosure statement. (a) If a person knows or reasonably should know that a communication includes a deepfake depicting a candidate or political party in a manner intended to injure the reputation of the candidate or party or otherwise deceive a voter, the person shall include the following statement with the communication: "This communication has been manipulated or generated by artificial intelligence." In a communication that includes an audio component, the statement must be read in a manner that is easily heard. If the communication includes a print or video component, the statement must be placed in the communication so the statement is easily discernible, and, for a broadcast, cable, satellite, Internet, or other digital communication, the statement must remain onscreen throughout the entirety of the communication.

(b) In this section, "deepfake" means an image, audio recording, or video recording of an individual's appearance, conduct, or spoken words that has been created or manipulated with machine learning, natural language processing, or another computational processing technique of similar or greater complexity in a manner to create a realistic but false image, audio, or video that

(1) appears to a reasonable person to depict a real individual saying or doing something that did not actually occur; or

(2) provides a fundamentally different understanding or impression of an individual's appearance, conduct, or spoken words than the understanding a reasonable person would have from an unaltered, original version of the media.
* Sec. 2. AS 44.99 is amended by adding new sections to read:

Article 7. Use by State Agencies of Artificial Intelligence and Data about Individuals.

Sec. 44.99.700. Inventory. (a) Every two years, the department shall conduct an inventory of all systems used by state agencies that employ artificial intelligence for consequential decisions. Each state agency shall assist the department as necessary. An inventory must include, at a minimum, the following information for each system:

(1) the name of the system;

(2) the vendor that provides the system, if any;

(3) a description of the general capabilities and uses of the system; and

(4) whether the state agency completed an impact assessment of the system under AS 44.99.710 before the system's implementation.

(b) Upon completion, the department shall publish each inventory on the department's Internet website.
Sec. 44.99.710. Impact assessments. (a) At least once every two years, the head of a state agency that uses a system that employs artificial intelligence for consequential decisions shall conduct an impact assessment of the system. An impact assessment must include, at a minimum, an analysis of

(1) the efficacy of the system;

(2) the human oversight involved in the system;

(3) the accountability mechanisms in place for the system;

(4) the process by which an individual may appeal a decision made or facilitated by the system;

(5) the current and potential benefits, liability, and risks to the state from the system, including risks related to cybersecurity and intellectual property and any measures used to mitigate liability and risks;

(6) the current and potential effects of the system on the liberty, finances, livelihood, and privacy interests of individuals in the state, including effects from any use of geolocation data by the system;

(7) any unlawful discrimination against or unlawful disparate impact on an individual or a group of individuals that has resulted or may result from the system; and

(8) the policies and procedures that govern the process of using the system for consequential decisions.

(b) Upon completion, the state agency that conducts the impact assessment shall provide the assessment to the department. Upon receiving an assessment, the department shall publish the assessment on the department's Internet website.
Sec. 44.99.720. Requirements for use of artificial intelligence by state agencies. (a) A state agency that uses a system that employs artificial intelligence for consequential decisions shall

(1) notify each individual who may be legally or significantly affected by the use of the system;

(2) obtain an individual's consent before soliciting or acquiring sensitive personal data about the individual that will be used by the system;

(3) provide an appeals process that includes manual human review for an individual who is legally or significantly affected by the use of the system; and

(4) inform a prospective employee of the state agency about any video interview that involves the use of artificial intelligence and obtain the prospective employee's consent before employing artificial intelligence.

(b) A state agency may not use a system that employs artificial intelligence for consequential decisions if the system involves

(1) biometric identification, including facial recognition;

(2) emotion recognition;

(3) cognitive behavioral manipulation of individuals or groups; or

(4) social scoring.

(c) A state agency may not use a system that employs artificial intelligence for consequential decisions if the system uses data hosted in

(1) the People's Republic of China, including the Hong Kong Special Administrative Region and Macao Special Administrative Region;

(2) the Republic of Cuba;

(3) the Islamic Republic of Iran;

(4) the Democratic People's Republic of Korea;

(5) the Russian Federation; or

(6) the Bolivarian Republic of Venezuela under the regime of Nicolás Maduro Moros.

(d) A state agency may contract with a person for a system that employs artificial intelligence for consequential decisions only if the person has implemented multi-factor authentication to secure the system and data stored by the system.
Sec. 44.99.730. Transfer of data between state agencies. Unless required by law, a state agency may not transfer data about an individual to another state agency without the individual's consent.
Sec. 44.99.740. Regulations. (a) The department shall adopt regulations under AS 44.62 (Administrative Procedure Act) concerning the development, procurement, implementation, use, and ongoing assessment of systems that employ artificial intelligence by state agencies for consequential decisions. The regulations must include, at a minimum, provisions that

(1) govern the procurement, implementation, and ongoing assessment of each system;

(2) require a state agency to conduct an impact assessment of each system under AS 44.99.710 before its implementation;

(3) ensure that a system does not result in unlawful discrimination or an unlawful disparate impact on an individual or a group of individuals; and

(4) provide for the ongoing assessment of each system.

(b) The department may adopt additional regulations under AS 44.62 (Administrative Procedure Act) necessary to implement AS 44.99.700 - 44.99.730.
Sec. 44.99.750. Civil liability for harm. (a) An individual who suffers harm as a result of a violation of AS 44.99.700 - 44.99.730, a violation of a regulation adopted under AS 44.99.740, or gross negligence or reckless or intentional misconduct relating to the use of artificial intelligence by a state agency or state employee may bring a civil action in the superior court against the state or state employee.

(b) An individual who suffers harm under (a) of this section may recover damages for the harm to the individual, punitive damages under AS 09.17.020, and full reasonable attorney fees and costs in a civil action brought under this section.
Sec. 44.99.760. Definitions. In AS 44.99.700 - 44.99.760,

(1) "artificial intelligence" means an automated system that uses data input, human-defined objectives, and machine learning, natural language processing, or other computational processing techniques of similar or greater complexity to make a decision or facilitate human decision making;

(2) "biometric identification" means the analysis of an individual's physical or behavioral characteristics to uniquely identify the individual;

(3) "cognitive behavioral manipulation" means the use of a subliminal technique for the purpose of influencing an individual's behavior to achieve a desired outcome;

(4) "consequential decision" means a conclusion, decision, or judgment by a state agency that can have a legal or significant effect on an individual;

(5) "department" means the Department of Administration;

(6) "emotion recognition" means the analysis of an individual's bodily expressions, including facial and verbal expressions, to identify or predict the individual's emotions;

(7) "individual" means a natural person;

(8) "sensitive personal data" means

(A) data that reveals an individual's racial or ethnic origin, political opinions, or religious or philosophical beliefs;

(B) an individual's genetic data;

(C) an individual's biometric data when used for biometric identification; or

(D) an individual's geolocation data;

(9) "social scoring" means evaluating, classifying, rating, or scoring the trustworthiness or social standing of an individual based on behavior or socioeconomic, political, or religious status;

(10) "state agency" means the University of Alaska, a public corporation of the state, or a department, institution, board, commission, division, authority, committee, or other administrative unit of the executive branch of state government.