Georgia 2025-2026 Regular Session

Georgia Senate Bill SB167 Latest Draft

Bill / Introduced Version Filed 02/13/2025

25 LC 56 0255
Senate Bill 167
By: Senators Merritt of the 9th, Jones II of the 22nd, Kemp of the 38th, Islam Parkes of the
7th, Rhett of the 33rd and others 
A BILL TO BE ENTITLED
AN ACT
To amend Title 10 of the Official Code of Georgia Annotated, relating to commerce and
trade, so as to provide broadly for private entities that employ certain AI systems to guard
against discrimination caused by such systems; to provide for a description of consequential
decisions for which use of automated decision systems shall be regulated; to provide for
developers and deployers to perform certain evaluations of the automated decision systems
they employ; to provide for notice to consumers when certain decisions are made using an
automated decision system; to provide for certain disclosures by developers of AI systems;
to provide for certain disclosures by deployers of AI systems; to provide for annual updates
to certain disclosures; to provide for notices to a consumer each time a decision is made
using an automated decision system; to provide requirements for such disclosures by
developers and deployers; to provide for exemptions; to provide for trade secret protections;
to provide for rule making; to provide for certain disclosed records by developers and
deployers to be exempt from open records requirements; to provide for enforcement by the
Attorney General; to provide for definitions; to provide for related matters; to repeal
conflicting laws; and for other purposes.

BE IT ENACTED BY THE GENERAL ASSEMBLY OF GEORGIA:

SECTION 1.
Title 10 of the Official Code of Georgia Annotated, relating to commerce and trade, is
amended by adding a new chapter to read as follows:
"CHAPTER 16
20

10-16-1.
As used in this chapter, the term:
(1)(A)  'Algorithmic discrimination' means the use of an artificial intelligence system
in a manner that discriminates, causes a disparate impact, or otherwise makes
unavailable the equal enjoyment of goods, services, or other activities or opportunities
as related to a consequential decision on the basis of actual or perceived age, color,
disability, ethnicity, genetic information, limited proficiency in the English language,
national origin, race, religion, pursuit or receipt of reproductive healthcare, sex, sexual
orientation, gender identity, veteran status, or other classification protected under the
laws of this state or federal law.
(B)  Such term shall not include:
(i)  The offer, license, or use of an automated decision system by a developer or
deployer for the sole purpose of:
(I)  The developer's or deployer's self-testing to identify, mitigate, or prevent
discrimination or otherwise ensure compliance with state and federal law; or
(II)  Expanding an applicant, customer, or participant pool to increase diversity or
redress historical discrimination; or
(ii)  An act or omission by or on behalf of a private club or other establishment that
is not in fact open to the public, as set forth in Title II of the federal Civil Rights Act
of 1964, 42 U.S.C. Section 2000a(e), as amended.
(2)  'Artificial intelligence system' or 'AI system' means an engineered or machine-based
system that emulates the capability of a person to receive audio, visual, text, or any other
form of information and use the information received to emulate a human cognitive
process, including, but not limited to, learning, generalizing, reasoning, planning,
predicting, acting, or communicating; provided, however, that artificial intelligence
systems may vary in the forms of information they can receive and in the human
cognitive processes they can emulate.
(3)(A)  'Automated decision system' means a computational process derived from
machine learning, statistical modeling, data analytics, or an artificial intelligence system
that, when deployed, issues a simplified output, including, but not limited to, a score,
classification, or recommendation, that is used to assist or replace human discretionary
decision making and materially impacts natural persons.
(B)  Such term shall not include a tool that does not assist or replace processes for
making consequential decisions and that does not materially impact natural persons,
including, but not limited to, a junk email filter, firewall, antivirus software, calculator,
spreadsheet, or other tool that does no more than organize data already in possession
of the deployer of the automated decision system.
(4)  'Consequential decision' means a decision that has a material effect on the provision
or denial to any consumer of, or on the cost or terms of:
(A)  Education enrollment or education opportunities;
(B)  Employment or employment opportunities;
(C)  Essential government services;
(D)  Financial or lending services;
(E)  Healthcare services;
(F)  Housing;
(G)  Insurance; or
(H)  Legal services.
(5)  'Consumer' means an individual who is a Georgia resident.
(6)  'Deploy' means to use an automated decision system.
(7)  'Deployer' means a person doing business in this state that deploys an automated
decision system.
(8)  'Developer' means a person doing business in this state that develops or intentionally
and substantially modifies an artificial intelligence system.
(9)  'Healthcare services' shall have the same meaning as set forth in 42 U.S.C. Section
234(d)(2).
(10)(A)  'Intentional and substantial modification' or 'intentionally and substantially
modify' means a deliberate change made to an AI system that results in any increase in
or new reasonably foreseeable risk of algorithmic discrimination by such AI system.
(B)  Such term shall not include a change made to an automated decision system if:
(i)  The automated decision system continues to learn after the automated decision
system is:
(I)  Offered, sold, leased, licensed, given, or otherwise made available to a deployer; or
(II)  Deployed;
(ii)  The change is made to the automated decision system as a result of any learning
described in division (i) of this subparagraph;
(iii)  The change was predetermined by the deployer, or a third party contracted by the
deployer, when the deployer or third party completed an initial impact assessment of
such automated decision system pursuant to subsection (e) of Code Section 10-16-3; and
(iv)  The change is included in technical documentation for the automated decision
system.
(11)  'Personal data' means any information, including derived data and unique identifiers,
that is linked or reasonably linkable, alone or in combination with other information, to
an identified or identifiable individual or a device that identifies or is linked or reasonably
linkable to an individual.
(12)  'Trade secret' shall have the same meaning as set forth in Code Section 10-1-761.

10-16-2.
(a)  No developer shall sell, distribute, or otherwise make available to deployers an
automated decision system that results in algorithmic discrimination.
(b)  Except as provided in subsection (f) of this Code section, a developer of an automated
decision system shall provide certain information regarding such automated decision
system to the Attorney General, in a form and manner prescribed by the Attorney General.
Such information shall include, at a minimum:
(1)  A general statement describing the reasonably foreseeable uses and known harmful
or inappropriate uses of the automated decision system;
(2)  Documentation disclosing:
(A)  The purpose of the automated decision system;
(B)  The intended benefits and uses of the automated decision system;
(C)  High-level summaries of the types of data used to train the automated decision
system;
(D)  Known or reasonably foreseeable limitations of the automated decision system,
including known or reasonably foreseeable risks of algorithmic discrimination arising
from the intended uses of the automated decision system;
(E)  The measures the developer has taken to mitigate known or reasonably foreseeable
risks of algorithmic discrimination;
(F)  How the automated decision system was evaluated for performance and mitigation
of algorithmic discrimination before the automated decision system was offered, sold,
leased, licensed, given, or otherwise made available to the deployer;
(G)  The data governance measures used to cover the training data sets and the
measures used to examine the suitability of data sources, possible biases, and
appropriate mitigation;
(H)  How the automated decision system should be used, not be used, and be monitored
by an individual when the automated decision system is used to make, or assist in
making, a consequential decision; and
(I)  All other information necessary to allow the deployer to comply with the
requirements of Code Section 10-16-3; and
(3)  Any additional documentation that is reasonably necessary to assist the deployer in
understanding the outputs and monitoring the performance of the automated decision
system for risks of algorithmic discrimination.
(c)(1)  Except as provided in subsection (f) of this Code section, a developer that offers,
sells, leases, licenses, gives, or otherwise makes available to a deployer or other
developer an automated decision system shall make available to the deployer or other
developer, to the extent feasible, all of the information required to be provided to the
Attorney General by subsection (b) of this Code section, as well as the documentation
and information, through artifacts such as model cards, data set cards, or other impact
assessments, necessary for a deployer or third party contracted by a deployer to complete
an impact assessment pursuant to subsection (e) of Code Section 10-16-3.
(2)  A developer that also serves as a deployer for an automated decision system is not
required to generate the documentation required by this subsection unless the automated
decision system is provided to an unaffiliated entity acting as a deployer.
(d)(1)  A developer shall make available to the public, in a manner that is clear and
readily available on the developer's public website or in a public use case inventory, a
statement summarizing:
(A)  The types of automated decision systems that the developer has developed or
intentionally and substantially modified and currently makes available to a deployer or
other developer; and
(B)  How the developer manages known or reasonably foreseeable risks of algorithmic
discrimination.
(2)  A developer shall update the statement described in paragraph (1) of this subsection:
(A)  As necessary to ensure that the statement remains accurate; and
(B)  No later than 90 days after the developer intentionally and substantially modifies
any automated decision system described in such statement.
(e)(1)  A developer of an automated decision system shall take steps to address risks of
algorithmic discrimination, invalidity, and errors, including, but not limited to, ensuring
suitability and representativeness of data sources, implementing data governance
measures, testing the automated decision system for disparate impact, and searching for
less discriminatory alternative decision methods.  Developers shall continue assessing
and mitigating the risk of algorithmic discrimination in their automated decision systems
so long as such automated decision systems are in use by any deployer.
(2)  A developer of an automated decision system shall disclose to the Attorney General,
in a form and manner prescribed by the Attorney General, and to all known deployers or
other developers of the automated decision system, any known or reasonably foreseeable
risks of algorithmic discrimination arising from the intended uses of the automated
decision system without unreasonable delay but no later than 90 days after the date on
which:
(A)  The developer discovers through the developer's ongoing testing and analysis that
the developer's automated decision system has been deployed and has caused or is
reasonably likely to have caused algorithmic discrimination; or
(B)  The developer receives from a deployer a credible report that the automated
decision system has been deployed and has caused algorithmic discrimination.
(f)(1)  A developer who discloses information to a deployer, to a consumer, or to the
general public pursuant to subsections (b) through (e) of this Code section may make
reasonable redactions for the purpose of protecting trade secrets.
(2)  No developer shall redact information from its required disclosures to a deployer
under this Code section if the information is necessary for the deployer to comply with
its disclosure, explanation, impact assessment, or audit obligations under this chapter.
(3)  To the extent that a developer redacts information pursuant to paragraph (1) of this
subsection, the developer shall notify the subjects of the disclosure and provide a basis
for the redaction.
(g)  The Attorney General may require that a developer disclose to the Attorney General,
within seven days and in a form and manner prescribed by the Attorney General, any
documentation or records required by this Code section, including, but not limited to, the
statement or documentation described in subsection (b) of this Code section.  The Attorney
General may evaluate such statement or documentation to ensure compliance with this
chapter, and, notwithstanding the provisions of Article 4 of Chapter 18 of Title 50, relating
to open records, such records shall not be open to inspection by or made available to the
public.  In a disclosure pursuant to this subsection, a developer may designate the statement
or documentation as including proprietary information or a trade secret.  To the extent that
any information contained in the statement or documentation includes information subject
to attorney-client privilege or work-product protection, the disclosure does not constitute
a waiver of the privilege or protection.
(h)  A developer's compliance with this Code section shall not constitute a defense in a civil
or administrative action regarding claims that the developer violated any other provision
of this chapter or any other law.
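
Illustration (editor's addition, not part of the bill): Code Section 10-16-2(e) requires developers to test automated decision systems for disparate impact, but the chapter prescribes no particular method. The sketch below shows one common screen, the "four-fifths" adverse-impact ratio drawn from federal employment guidance; the function names, the 0.8 threshold, and the input format are assumptions made for the example, not requirements of SB 167.

```python
# Illustrative disparate-impact screen (the "four-fifths rule").
# SB 167 does not mandate this or any specific test; the threshold and the
# data layout here are assumptions made for the sketch.
from collections import defaultdict

def selection_rates(outcomes):
    """outcomes: iterable of (group_label, favorable_outcome) pairs."""
    favorable = defaultdict(int)
    total = defaultdict(int)
    for group, was_favorable in outcomes:
        total[group] += 1
        favorable[group] += int(was_favorable)
    return {g: favorable[g] / total[g] for g in total}

def adverse_impact_ratios(outcomes, threshold=0.8):
    """Compare each group's favorable-outcome rate to the highest-rate
    group's and flag ratios below the four-fifths threshold."""
    rates = selection_rates(outcomes)
    best = max(rates.values()) or 1.0  # guard against all-zero rates
    return {g: {"rate": r, "ratio": r / best, "flagged": r / best < threshold}
            for g, r in rates.items()}

# Example: logged decisions as (protected-class group, favorable outcome?).
log = ([("A", True)] * 80 + [("A", False)] * 20
       + [("B", True)] * 50 + [("B", False)] * 50)
print(adverse_impact_ratios(log))
# Group B: rate 0.50, ratio 0.50 / 0.80 = 0.625 < 0.8, so it is flagged.
```

A flagged ratio is a signal to investigate, not a legal conclusion; the same subsection also directs developers to search for less discriminatory alternative decision methods.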

10-16-3.
(a)  No deployer of an automated decision system shall use an automated decision system
in a manner that results in algorithmic discrimination.
(b)  Except as provided in Code Section 10-16-6, a deployer of an automated decision
system shall implement a risk management policy and program to govern the deployer's
deployment of the automated decision system.  The risk management policy and program
shall specify and incorporate the principles, processes, and personnel that the deployer uses
to identify, document, and mitigate known or reasonably foreseeable risks of algorithmic
discrimination.  The risk management policy and program shall be an iterative process
planned, implemented, and regularly and systematically reviewed and updated over the life
cycle of an automated decision system, requiring regular, systematic review and updates.
A risk management policy and program implemented and maintained pursuant to this
subsection shall take into consideration:
(1)  Either:
(A)  The guidance and standards set forth in the latest version of the Artificial
Intelligence Risk Management Framework published by the National Institute of
Standards and Technology of the United States Department of Commerce, standard
ISO/IEC 42001 of the International Organization for Standardization, or another
nationally or internationally recognized risk management framework for artificial
intelligence systems, if the standards are substantially equivalent to or more stringent
than the requirements of this chapter; or
(B)  Any risk management framework for artificial intelligence systems that the
Attorney General, in the Attorney General's discretion, may designate;
(2)  The size and complexity of the deployer;
(3)  The nature and scope of the automated decision systems deployed by the deployer,
including the intended uses of the automated decision systems; and
(4)  The sensitivity and volume of data processed in connection with the automated
decision systems deployed by the deployer.
(c)  A risk management policy and program implemented pursuant to this Code section
may cover multiple automated decision systems deployed by the deployer.
(d)  Each deployer shall establish and adhere to:
(1)  Written standards, policies, procedures, and protocols for the acquisition, use of, or
reliance on automated decision systems developed by third-party developers, including
reasonable contractual controls ensuring that the developer statements and summaries
described in subsection (b) of Code Section 10-16-2 include all information necessary for
the deployer to fulfill its obligations under this Code section;
(2)  Procedures for reporting any incorrect information or evidence of algorithmic
discrimination to a developer for further investigation and mitigation, as necessary; and
(3)  Procedures to remediate and eliminate incorrect information that the deployer has
identified in its automated decision systems or that has been reported to a developer.
(e)  Except as otherwise provided for in this chapter:
(1)  A deployer, or a third party contracted by the deployer, that deploys an automated
decision system shall complete an impact assessment for the automated decision system;
and
(2)  A deployer, or a third party contracted by the deployer, shall complete an impact
assessment for a deployed automated decision system at least annually and within 90
days after any intentional and substantial modification to the automated decision system
is made available.
(f)  An impact assessment completed pursuant to subsection (e) of this Code section shall
include, at a minimum, and to the extent reasonably known by or available to the deployer:
(1)  A statement by the deployer disclosing the purpose, intended use cases, and
deployment context of, and benefits afforded by, the automated decision system;
(2)  An analysis of whether the deployment of the automated decision system poses any
known or reasonably foreseeable risks of:
(A)  Algorithmic discrimination and, if so, the nature of the algorithmic discrimination
and the steps that have been taken to mitigate the risks;
(B)  Limits on accessibility for individuals who are pregnant, breastfeeding, or disabled,
and, if so, what reasonable accommodations the deployer may provide that would
mitigate any such limitations on accessibility;
(C)  Any violation of state or federal labor laws, including laws pertaining to wages,
occupational health and safety, and the right to organize; or
(D)  Any physical or other intrusion upon the solitude or seclusion, or the private affairs
or concerns, of consumers if such intrusion:
(i)  Would be offensive to a reasonable person; and
(ii)  May be redressed under the laws of this state;
(3)  A description of the categories of data the automated decision system processes as
inputs and the outputs the automated decision system produces;
(4)  If the deployer used data to customize the automated decision system, an overview
of the categories of data the deployer used to customize the automated decision system;
(5)  An analysis of the automated decision system's validity and reliability in accordance
with contemporary social science standards, and a description of any metrics used to
evaluate the performance and known limitations of the automated decision system;
(6)  A description of any transparency measures taken concerning the automated decision
system, including any measures taken to disclose to a consumer that the automated
decision system is in use when the automated decision system is in use;
(7)  A description of the post-deployment monitoring and user safeguards provided
concerning the automated decision system, including the oversight, use, and learning
process established by the deployer to address issues arising from the deployment of the
automated decision system; and
(8)  When such impact assessment is completed following an intentional and substantial
modification to an automated decision system, a statement disclosing the extent to which
the automated decision system was used in a manner that was consistent with, or varied
from, the developer's intended uses of the automated decision system.
(g)  If the analysis required by paragraph (2) of subsection (f) of this Code section reveals
a risk of algorithmic discrimination, the deployer shall not deploy the automated decision
system until the developer or deployer takes reasonable steps to search for and implement
less discriminatory alternative decision methods.
(h)  A single impact assessment may address a comparable set of automated decision
systems deployed by a deployer.
(i)  If a deployer, or a third party contracted by the deployer, completes an impact
assessment for the purpose of complying with another applicable law or regulation, the
impact assessment shall satisfy the requirements established in this Code section if the
impact assessment is reasonably similar in scope and effect to the impact assessment that
would otherwise be completed pursuant to this Code section.
(j)  A deployer shall maintain the most recently completed impact assessment for an
automated decision system, all records concerning each impact assessment, and all prior
impact assessments, if any, throughout the period of time that the automated decision
system is deployed and for at least three years following the final deployment of the
automated decision system.
(k)  At least annually a deployer, or a third party contracted by the deployer, shall review
the deployment of each automated decision system deployed by the deployer to ensure that
the automated decision system is not causing algorithmic discrimination.
(l)  Deployers shall publish on their public websites all impact assessments completed
within the preceding three years in a form and manner prescribed by the Attorney General.

10-16-4.
(a)  No later than the time that a deployer deploys an automated decision system to make,
or assist in making, a consequential decision concerning a consumer, the deployer shall:
(1)  Notify the consumer that the deployer has deployed an automated decision system
to make, or assist in making, a consequential decision; and
(2)  Provide to the consumer:
(A)  A statement disclosing the purpose of the automated decision system and the
nature of the consequential decision;
(B)  The contact information for the deployer;
(C)  A description, in plain language, of the automated decision system, which
description shall, at a minimum, include:
(i)  A description of the personal characteristics or attributes that the system will
measure or assess;
(ii)  The method by which the system measures or assesses those attributes or
characteristics;
(iii)  How those attributes or characteristics are relevant to the consequential decisions
for which the system should be used;
(iv)  Any human components of such system;
(v)  How any automated components of such system are used to inform such
consequential decision; and
(vi)  A direct link to a publicly accessible page on the deployer's public website that
contains a plain-language description of the logic used in the system, including the
key parameters that affect the output of the system; the system's outputs; the types and
sources of data collected from natural persons and processed by the system when it
is used to make, or assists in making, a consequential decision; and the results of the
most recent impact assessment, or an active link to a web page where a consumer can
review those results; and
(D)  Instructions on how to access the statement required by Code Section 10-16-5.
(b)  A deployer that has used an automated decision system to make, or assist in making,
a consequential decision concerning a consumer shall transmit to such consumer within one
business day after such decision a notice that includes:
(1)  A specific and accurate explanation that identifies the principal factors and variables
that led to the consequential decision, including:
(A)  The degree to which, and manner in which, the automated decision system
contributed to the consequential decision;
(B)  The source or sources of the data processed by the automated decision system; and
(C)  A plain-language explanation of how the consumer's personal data informed these
principal factors and variables when the automated decision system made, or assisted
in making, the consequential decision;
(2)  Information about consumers' right to correct, and how the consumer can submit
corrections and provide supplementary information relevant to, the consequential
decision;
(3)  What actions, if any, the consumer might have taken to secure a different decision
and the actions that the consumer might take to secure a different decision in the future;
(4)  Information on opportunities to correct any incorrect personal data that the automated
decision system processed in making, or assisting in making, the consequential decision;
and
(5)  Information on opportunities to appeal an adverse consequential decision concerning
the consumer arising from the deployment of an automated decision system, which
appeal shall, if technically feasible, allow for human review.
(c)(1)  A deployer shall provide the notice, statement, contact information, and
description required by subsections (a) and (b) of this Code section:
(A)  Directly to the consumer;
(B)  In plain language;
(C)  In all languages in which the deployer, in the ordinary course of the deployer's
business, provides contracts, disclaimers, sale announcements, and other information
to consumers; and
(D)  In a format that is accessible to consumers with disabilities.
(2)  If the deployer is unable to provide the notice, statement, contact information, and
description directly to the consumer, the deployer shall make such information available
in a manner that is reasonably calculated to ensure that the consumer receives it.
(d)  No deployer shall use an automated decision system to make, or assist in making, a
consequential decision if it cannot provide notices and explanations that satisfy the
requirements of this Code section.
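
Illustration (editor's addition, not part of the bill): a deployer automating the post-decision notice of Code Section 10-16-4(b) must carry a fixed set of contents through its pipeline. The dataclass below is one hypothetical shape for that notice; every field name is the editor's mapping of the statute's enumerated contents onto a structure, not a format the bill defines.

```python
# Hypothetical container for the 10-16-4(b) post-decision notice, due within
# one business day of the consequential decision. Field names are assumptions;
# SB 167 prescribes the contents, not any particular format.
from dataclasses import dataclass

@dataclass
class ConsequentialDecisionNotice:
    principal_factors: list[str]        # (b)(1): factors and variables behind the decision
    system_contribution: str            # (b)(1)(A): degree/manner the system contributed
    data_sources: list[str]             # (b)(1)(B): sources of the data processed
    personal_data_explanation: str      # (b)(1)(C): plain-language role of personal data
    correction_rights: str              # (b)(2): right to correct; how to submit corrections
    actions_for_different_outcome: str  # (b)(3): what could secure a different decision
    data_correction_opportunities: str  # (b)(4): fixing incorrect personal data
    appeal_instructions: str            # (b)(5): appeal route, human review if feasible
```

Subsection (c) then governs delivery: directly to the consumer, in plain language, in every language the deployer ordinarily uses with consumers, and in a format accessible to consumers with disabilities.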

10-16-5.
(a)  Except as provided in Code Section 10-16-6, a deployer shall make available, in a
manner that is clear and readily available on the deployer's public website, a statement
summarizing:
(1)  The types of automated decision systems that are currently deployed by the deployer;
(2)  How the deployer manages known or reasonably foreseeable risks of algorithmic
discrimination that may arise from the deployment of each such automated decision
system; and
(3)  In detail, the nature, source, and extent of the information collected and used by the
deployer.
(b)  A deployer shall periodically update the statement described in subsection (a) of this
Code section.

10-16-6.
The provisions of subsections (b) and (e) of Code Section 10-16-3 shall not apply to a
deployer when:
(1)  The automated decision system is used to make, or is a contributing factor in making,
consequential decisions about fewer than 1,000 consumers in the preceding calendar year;
and
(2)  At the time the deployer deploys the automated decision system and at all times while
the automated decision system is deployed:
(A)  The deployer employs fewer than 15 full-time equivalent employees;
(B)  The deployer does not use the deployer's own data to train the automated decision
system;
(C)  The automated decision system is used for the intended uses that are disclosed to
the deployer as required by subsection (b) of Code Section 10-16-2;
(D)  The automated decision system continues learning based on data derived from
sources other than the deployer's own data;
(E)  The deployer makes available to consumers any impact assessment that the
developer of the automated decision system has completed and provided to the
deployer; and
(F)  The deployer makes available to consumers any impact assessment that includes
information that is substantially similar to the information in the impact assessment
required under subsection (f) of Code Section 10-16-3.
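
Illustration (editor's addition, not part of the bill): the exemption above is conjunctive; the consumer-count ceiling in paragraph (1) and every condition in paragraph (2) must hold at deployment and at all times thereafter. A minimal sketch of that logic follows; the field names are the editor's assumptions, and whether a given fact actually satisfies a condition is a legal question no code can answer.

```python
# Sketch of the conjunctive 10-16-6 test: paragraph (1) and every condition
# in paragraph (2) must ALL be satisfied. Inputs are assumed facts.
from dataclasses import dataclass

@dataclass
class DeployerFacts:
    consumers_affected_last_year: int         # (1): fewer than 1,000
    full_time_equivalents: int                # (2)(A): fewer than 15
    trains_on_own_data: bool                  # (2)(B): must be False
    used_only_for_disclosed_uses: bool        # (2)(C)
    learns_only_from_third_party_data: bool   # (2)(D)
    shares_developer_impact_assessment: bool  # (2)(E)
    assessment_substantially_similar: bool    # (2)(F)

def exempt_from_10_16_3_b_and_e(f: DeployerFacts) -> bool:
    return (f.consumers_affected_last_year < 1000
            and f.full_time_equivalents < 15
            and not f.trains_on_own_data
            and f.used_only_for_disclosed_uses
            and f.learns_only_from_third_party_data
            and f.shares_developer_impact_assessment
            and f.assessment_substantially_similar)
```

Note that the exemption reaches only subsections (b) and (e) of Code Section 10-16-3; the prohibition on algorithmic discrimination in subsection (a) and the consumer notices of Code Section 10-16-4 still apply.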

10-16-7.
If a deployer deploys an automated decision system and subsequently discovers that the
automated decision system has caused algorithmic discrimination, the deployer, without
unreasonable delay, but no later than 90 days after the date of the discovery, shall send to
the Attorney General, in a form and manner prescribed by the Attorney General, a notice
disclosing the discovery.

10-16-8.
A deployer who discloses information to the Attorney General, to a consumer, or to the
general public pursuant to this chapter may make reasonable redactions for the purpose of
protecting trade secrets.  To the extent that a deployer redacts or withholds information
pursuant to this Code section, the deployer shall notify the consumer and provide a basis
for the redaction or withholding.  Such notification shall comply with the requirements of
subsection (c) of Code Section 10-16-4.

10-16-9.
The Attorney General may require that a deployer, or a third party contracted by the
deployer, disclose to the Attorney General, no later than seven days and in a form and
manner prescribed by the Attorney General, any documentation or records required by this
chapter.  The Attorney General may evaluate the risk management policy, impact
assessment, or records to ensure compliance with this chapter, and the risk management
policy, impact assessment, and such records, notwithstanding the provisions of Article 4
of Chapter 18 of Title 50, relating to open records, shall not be open to inspection by or
made available to the public.  In a disclosure pursuant to this Code section, a deployer may
designate the statement or documentation as including proprietary information or a trade
secret.  To the extent that any information contained in the risk management policy, impact
assessment, or records is subject to attorney-client privilege or work-product protection,
the disclosure does not constitute a waiver of the privilege or protection.

10-16-10.
A deployer's compliance with the provisions of this chapter shall not constitute a defense
in a civil or administrative action regarding claims that the deployer violated any other
provision of this chapter or any other law.

10-16-11.
(a)  Except as provided in subsection (b) of this Code section, a deployer or other developer
that deploys, offers, sells, leases, licenses, gives, or otherwise makes available an artificial
intelligence system that is intended to interact with consumers shall ensure the disclosure
to each consumer who interacts with the artificial intelligence system that the consumer is
interacting with an artificial intelligence system.
(b)  Disclosure is not required under subsection (a) of this Code section under
circumstances in which it would be obvious to a reasonable person that the person is
interacting with an artificial intelligence system.

10-16-12.
(a)  Nothing in this chapter shall be construed to restrict a developer's, a deployer's, or other
person's ability to:
(1)  Comply with federal, state, or municipal laws, ordinances, or regulations;
(2)  Comply with a civil, criminal, or regulatory inquiry, investigation, subpoena, or
summons by a federal, state, municipal, or other governmental authority;
(3)  Cooperate with a law enforcement agency concerning conduct or activity that the
developer, deployer, or other person reasonably and in good faith believes may violate
federal, state, or municipal laws, ordinances, or regulations;
(4)  Comply with the rules of evidence in an ongoing court proceeding;
(5)  Take immediate steps to protect an interest that is essential for the life or physical
safety of a consumer or another individual;
(6)  Conduct research, testing, and development activities regarding an artificial
intelligence system or model, other than testing conducted under real-world conditions,
before the artificial intelligence system or model is used to make, or assist in making, a
consequential decision, or is otherwise placed on the market, deployed, or put into
service, as applicable;
(7)  Effectuate a product recall; or
(8)  Assist another developer, deployer, or person with any of the obligations imposed
under this chapter.
(b)  Nothing in this chapter applies to any artificial intelligence system that is acquired by
or for the federal government or any federal agency or department, including the United
States Department of Commerce, the United States Department of Defense, or the National
Aeronautics and Space Administration, unless the artificial intelligence system is an
automated decision system that is used to make, or assist in making, a decision concerning
employment or housing.
(c)  If a developer, a deployer, or other person engages in an action pursuant to an
exemption set forth in this Code section, such developer, deployer, or other person bears
the burden of demonstrating that the action qualifies for the exemption.
(d)  If a developer or deployer withholds information pursuant to an exemption set forth
in this Code section for which disclosure would otherwise be required by this chapter, such
developer or deployer shall notify the subject of disclosure and provide a basis for
withholding the information.  Such notification shall comply with the requirements of
subsection (c) of Code Section 10-16-4.

10-16-13.
(a)  A violation of the requirements established in this chapter shall be enforceable through
the provisions of Part 2 of Article 15 of Chapter 1 of this title, the 'Fair Business Practices
Act of 1975.'
(b)  In any action commenced by the Attorney General to enforce this chapter, it is an
affirmative defense that the developer, deployer, or other person:
(1)  Discovers a violation of this chapter as a result of:
(A)  Adversarial testing or red teaming, as those terms are defined or used by the
National Institute of Standards and Technology; or
(B)  An internal review process;
(2)  Cures the violation within seven days and reports the violation to the Attorney
General and any affected consumers;
(3)  Is otherwise in compliance with the provisions of this chapter and:
(A)  The latest version of the Artificial Intelligence Risk Management Framework
published by the National Institute of Standards and Technology of the United States
Department of Commerce and standard ISO/IEC 42001 of the International
Organization for Standardization;
(B)  Another nationally or internationally recognized risk management framework for
artificial intelligence systems, if the standards are substantially equivalent to or more
stringent than the requirements of this chapter; or
(C)  Any risk management framework for artificial intelligence systems that the
Attorney General, in the Attorney General's discretion, may designate and, if
designated, shall publicly disseminate; and
(4)  Demonstrates that the violation was inadvertent, affected fewer than 100 consumers,
and could not have been discovered through reasonable diligence.
(c)  A developer, deployer, or other person bears the burden of demonstrating to the
Attorney General that the requirements of subsection (b) of this Code section have been
satisfied.
(d)  Nothing in this chapter, including the enforcement authority granted to the Attorney
General under this Code section, preempts or otherwise affects any right, claim, remedy,
presumption, or defense available at law or in equity.  A rebuttable presumption or
affirmative defense established under this chapter applies only to an enforcement action
brought by the Attorney General pursuant to this Code section and does not apply to any
right, claim, remedy, presumption, or defense available at law or in equity.

10-16-14.
The Attorney General may promulgate rules as necessary for the purpose of implementing
and enforcing this chapter.

10-16-15.
This chapter is declared to be remedial, with the purposes of protecting consumers and
ensuring consumers receive information about consequential decisions affecting them.  The
provisions of this chapter granting rights or protections to consumers shall be construed
broadly and exemptions construed narrowly."

SECTION 2.
All laws and parts of laws in conflict with this Act are repealed.