GENERAL ASSEMBLY OF NORTH CAROLINA
SESSION 2025

SENATE BILL 735

Short Title: AI Innovation Trust Fund. (Public)
Sponsors: Senators Salvador, Garrett, and Murdock (Primary Sponsors).
Referred to: Rules and Operations of the Senate
March 26, 2025

A BILL TO BE ENTITLED
AN ACT TO ENACT THE ARTIFICIAL INTELLIGENCE INNOVATION TRUST FUND.

Whereas, recognizing the rapidly evolving nature of artificial intelligence and the importance of responsible innovation, the General Assembly intends this Act to establish an exploratory, iterative approach to AI governance, inviting stakeholder input and encouraging collaborative development of appropriate and proportionate AI regulations; Now, therefore,

The General Assembly of North Carolina enacts:

SECTION 1. Article 10 of Chapter 143B of the General Statutes is amended by adding a new Part to read:

"Part 18A. Artificial Intelligence Innovation.

"§ 143B-472.83A. Artificial Intelligence Innovation Trust Fund.

(a) Fund. – There is established a special, nonreverting fund to be known as the North Carolina Artificial Intelligence Innovation Trust Fund. The Secretary of Commerce shall be the trustee of the fund and shall expend money from the fund to (i) provide grants or other financial assistance to companies developing or deploying artificial intelligence models in key industry sectors or (ii) establish or promote artificial intelligence entrepreneurship programs, which may include partnerships with research institutions in the State or other entrepreneur support organizations. The fund shall consist of appropriations to the Department of Commerce to be allocated to the fund, interest earned on money in the fund, and any other grants, premiums, gifts, reimbursements, or other contributions received by the State from any source for or in support of the purposes described in this subsection.
Funds in the fund are hereby appropriated to the Department for the purposes set forth in this section, and, except as otherwise expressly provided, the provisions of this section apply to persons receiving a grant or assistance from the fund. Funds provided under this Part shall not support projects involving artificial intelligence intended for mass surveillance infringing constitutional rights, unlawful social scoring, discriminatory profiling based on protected characteristics, or generating deceptive digital content intended for fraudulent or electoral interference purposes.

(b) Definitions. – The following definitions apply in this section:

(1) Advanced persistent threat. – An adversary with sophisticated levels of expertise and significant resources that allow it, through the use of multiple different attack vectors including, but not limited to, cyber, physical, or deception, to generate opportunities to achieve objectives including, but not limited to, (i) establishing or extending its presence within the information technology infrastructure of an organization for the purpose of exfiltrating information; (ii) undermining or impeding critical aspects of a mission, program, or organization; or (iii) placing itself in a position to do so in the future.

(2) Artificial intelligence. – An engineered or machine-based system that varies in its level of autonomy and which may, for explicit or implicit objectives, infer from the input it receives how to generate outputs that may influence physical or virtual environments.

(3) Artificial intelligence safety incident. – An incident that demonstrably increases the risk of a critical harm occurring by means of any of the following:

a. A covered model or covered model derivative autonomously engaging in behavior other than at the request of a user.

b. Theft, misappropriation, malicious use, inadvertent release, unauthorized access, or escape of the model weights of a covered model or covered model derivative.

c. The critical failure of technical or administrative controls, including controls limiting the ability to modify a covered model or covered model derivative.

d. Unauthorized use of a covered model or covered model derivative to cause or materially enable critical harm.

(4) Computing cluster. – A set of machines transitively connected by data center networking of over 100 gigabits per second that has a theoretical maximum computing capacity of at least 10 to the power of 20 integer or floating-point operations per second and can be used for training artificial intelligence.

(4a) Covered entity. – The legally responsible organization, corporation, or entity that directly oversees and controls the development, deployment, and ongoing operations of a covered model or covered model derivative, including responsibility for compliance with obligations under this Part.

(5) Covered model. – An artificial intelligence model that, due to its scale, application domain, or potential impact, is identified by the Secretary as warranting proportionate regulatory oversight. Factors considered may include, but are not limited to, computing power utilized, model training cost, anticipated scope of application, and foreseeable risks to public safety or individual rights. The Secretary may establish multiple tiers of covered models with corresponding compliance frameworks scaled proportionately to identified risk levels.

(6) Covered model derivative. – A copy of a covered model that: (i) is unmodified; (ii) has been subjected to post-training modifications related to fine-tuning; (iii) has been fine-tuned using a quantity of computing power not exceeding 3 times 10 to the power of 25 integer or floating-point operations, the cost of which, as reasonably assessed by the developer, exceeds $10,000,000 if calculated using the average market price of cloud compute at the start of fine-tuning; or (iv) has been combined with other software.

(7) Critical harm. – A harm caused or materially enabled by a covered model or covered model derivative including: (i) the creation or use, in a manner that results in mass casualties, of a chemical, biological, radiological, or nuclear weapon; (ii) mass casualties or at least $500,000,000 of damage resulting from cyberattacks on critical infrastructure by a model conducting, or providing precise instructions for conducting, a cyberattack or series of cyberattacks on critical infrastructure; (iii) mass casualties or at least $500,000,000 of damage resulting from an artificial intelligence model engaging in conduct that acts with limited human oversight, intervention, or supervision and results in death, great bodily injury, property damage, or property loss, and would, if committed by a human, constitute a crime specified in any general or special law that requires intent, recklessness, or gross negligence, or the solicitation or aiding and abetting of such a crime; or (iv) other grave harms to public safety that are of comparable severity to the harms described herein as determined by the attorney general.

The term does not include: (i) harms caused or materially enabled by information that a covered model or covered model derivative outputs if the information is otherwise reasonably publicly accessible by an ordinary person from sources other than a covered model or covered model derivative; (ii) harms caused or materially enabled by a covered model combined with other software, including other models, if the covered model did not materially contribute to the other software's ability to cause or materially enable the harm; or (iii) harms that are not caused or materially enabled by the developer's creation, storage, use, or release of a covered model or covered model derivative; provided further, that monetary harm thresholds established pursuant to this section shall be adjusted for inflation annually, not later than January 31, by the growth rate of the inflation index over the preceding 12 months; and provided further, that the inflation index shall consist of the percent change in inflation as measured by the percent change in the consumer price index for all urban consumers for the Raleigh metropolitan area as determined by the Bureau of Labor Statistics of the United States Department of Labor.

(8) Critical infrastructure. – Assets, systems, and networks, whether physical or virtual, the incapacitation or destruction of which would have a debilitating effect on physical security, economic security, public health, or safety in the State.

(8a) Department. – The Department of Commerce.

(9) Developer. – A person that performs the initial training of a covered model by: (i) training a model using a sufficient quantity of computing power and cost; or (ii) fine-tuning an existing covered model or covered model derivative using a quantity of computing power and cost sufficient to qualify as a covered model.

(10) Fine-tuning. – Adjusting the model weights of a trained covered model or covered model derivative by exposing such model to additional data.

(11) Full shutdown. – The cessation of operation of: (i) the training of a covered model; (ii) a covered model controlled by a developer; and (iii) all covered model derivatives controlled by a developer.

(11a) Fund. – The Artificial Intelligence Innovation Trust Fund, as established in this section.

(12) Model weight. – A numerical parameter in an artificial intelligence model that is adjusted through training and that helps determine how inputs are transformed into outputs.

(13) Person. – An individual, proprietorship, firm, partnership, joint venture, syndicate, business trust, company, corporation, limited liability company, association, committee, or any other nongovernmental organization or group of persons acting in concert.

(14) Post-training modification. – Modifying the capabilities of a covered model or covered model derivative by any means including, but not limited to, fine-tuning, providing such model with access to tools or data, removing safeguards against hazardous misuse or misbehavior of such model, or combining such model with, or integrating such model into, other software.

(15) Safety and security protocol. – Documented technical and organizational protocols that: (i) are used to manage the risks of developing and operating covered models or covered model derivatives across their life cycle, including risks posed by causing or enabling or potentially causing or enabling the creation of covered model derivatives; and (ii) specify that compliance with such protocols is required in order to train, operate, possess, or provide external access to the developer's covered model or covered model derivatives.

(16) Secretary. – The Secretary of Commerce.

(c) Oversight. – The Secretary may convene an AI Innovation and Safety Advisory Panel composed of representatives from industry, academia, civil liberties and consumer advocacy groups, and relevant state agencies. This Panel may provide recommendations, best practices, and advice regarding AI technologies, compliance proportionality, and ethical AI-human collaboration. Recommendations of this Panel shall be publicly accessible and may inform future regulatory proposals.

(d) Standards. – The Secretary may consider relevant provisions, guidelines, frameworks, and standards established by the U.S. National Institute of Standards and Technology (NIST), and comparable frameworks, such as the EU AI Act, when developing proposals and recommendations pursuant to this Part.

"§ 143B-472.83B. Requirements for developers of covered models.

(a) Reserved.

(b) Reserved.

(c) Before beginning to train a covered model, a developer shall do all of the following:

(1) Implement reasonable administrative, technical, and physical cybersecurity protections to prevent unauthorized access to, misuse of, or unsafe post-training modifications of the covered model and all covered model derivatives controlled by the developer that are appropriate in light of the risks associated with the covered model, including from advanced persistent threats or other sophisticated actors.

(2) Implement the capability to promptly enact a full shutdown.
(3) Implement a written and separate safety and security protocol that: (i) specifies protections and procedures that, if successfully implemented, would comply with the developer's duty to take reasonable care to avoid producing a covered model or covered model derivative that poses an unreasonable risk of causing or materially enabling a critical harm; (ii) states compliance requirements in an objective manner and with sufficient detail and specificity to allow the developer or a third party to readily ascertain whether the requirements of the safety and security protocol have been followed; (iii) identifies a testing procedure which takes safeguards into account as appropriate to reasonably evaluate if a covered model poses a substantial risk of causing or enabling a critical harm and if any covered model derivatives pose a substantial risk of causing or enabling a critical harm; (iv) describes in detail how the testing procedure assesses the risks associated with post-training modifications; (v) describes in detail how the testing procedure addresses the possibility that a covered model or covered model derivative may be used to make post-training modifications or create another covered model in a manner that may cause or materially enable a critical harm; (vi) describes in detail how the developer will fulfill their obligations under this chapter; (vii) describes in detail how the developer intends to implement any safeguards and requirements referenced in this section; (viii) describes in detail the conditions under which a developer would enact a full shutdown, accounting for, as appropriate, the risk that a shutdown of the covered model, or particular covered model derivatives, may cause disruptions to critical infrastructure; and (ix) describes in detail the procedure by which the safety and security protocol may be modified.

(4) Ensure that the safety and security protocol is implemented as written, including by designating senior personnel to be responsible for ensuring compliance by employees and contractors working on a covered model or any covered model derivatives controlled by the developer, and for monitoring and reporting on implementation.

(5) Retain an unredacted copy of the safety and security protocol for not less than five years after the covered model is no longer made available for commercial, public, or foreseeably public use, including records and dates of any updates or revisions.

(6) Conduct an annual review of the safety and security protocol to account for any changes to the capabilities of the covered model and industry best practices and, if necessary, make modifications to such protocol.

(7) Conspicuously publish a redacted copy of the safety and security protocol and transmit a copy of said redacted safety and security protocol to the attorney general; provided, however, that (i) a redaction in the safety and security protocol may be made only if the redaction is reasonably necessary to protect public safety, trade secrets, or confidential information pursuant to any general, special, or federal law; (ii) the developer shall grant to the attorney general access to the unredacted safety and security protocol upon request; (iii) a safety and security protocol disclosed to the attorney general shall not be a public record; and (iv) if the safety and security protocol is materially modified, the developer shall conspicuously publish and transmit to the attorney general an updated redacted copy of such protocol within 30 days of the modification.

(8) Take reasonable care to implement other appropriate measures to prevent covered models and covered model derivatives from posing unreasonable risks of causing or materially enabling critical harms.
(d) Before using a covered model or covered model derivative for a purpose not exclusively related to the training or reasonable evaluation of the covered model for compliance with State or federal law, or before making a covered model or covered model derivative available for commercial, public, or foreseeably public use, the developer of a covered model shall do all of the following:

(1) Assess whether the covered model is reasonably capable of causing or materially enabling a critical harm.

(2) Record, as and when reasonably possible, and retain for not less than five years after the covered model is no longer made available for commercial, public, or foreseeably public use, information on any specific tests and test results used in said assessment which provides sufficient detail for third parties to replicate the testing procedure.

(3) Take reasonable care to implement appropriate safeguards to prevent the covered model and covered model derivatives from causing or materially enabling a critical harm.

(4) Take reasonable care to ensure, to the extent reasonably possible, that the covered model's actions and the actions of covered model derivatives, as well as critical harms resulting from their actions, may be accurately and reliably attributed to such model or model derivative.

(e) A developer shall not use a covered model or covered model derivative for a purpose not exclusively related to the training or reasonable evaluation of the covered model for compliance with State or federal law, or make a covered model or a covered model derivative available for commercial, public, or foreseeably public use, if there is an unreasonable risk that the covered model or covered model derivative will cause or materially enable a critical harm.
(f) A developer of a covered model shall annually reevaluate the procedures, policies, protections, capabilities, and safeguards implemented pursuant to this section.

(g) A developer of a covered model shall annually retain a third party that conducts investigations consistent with best practices for investigators to perform an independent investigation of compliance with the requirements of this section.

(1) The investigator shall conduct investigations consistent with regulations issued by the Secretary. The investigator shall be granted access to unredacted materials as necessary to comply with the investigator's obligations contained herein. The investigator shall produce an investigation report including, but not limited to: (i) a detailed assessment of the developer's steps to comply with the requirements of this section; (ii) if applicable, any identified instances of noncompliance with the requirements of this section and any recommendations for how the developer can improve its policies and processes for ensuring compliance with the requirements of this section; (iii) a detailed assessment of the developer's internal controls, including designation and empowerment of senior personnel responsible for ensuring compliance by the developer and any employees or contractors thereof; and (iv) the signature of the lead investigator certifying the results contained within the investigation report; provided further, that the investigator shall not knowingly make a material misrepresentation in said report.

(2) Covered entities shall transmit to the attorney general a confidential copy of any independent investigator's report conducted under this section. An executive summary outlining compliance status and risk mitigation actions shall be made publicly available, with proprietary, sensitive, or security-related information redacted as necessary.
(h) A developer of a covered model shall annually, until such time that the covered model and any covered model derivatives controlled by the developer cease to be in or available for commercial or public use, submit to the attorney general a statement of compliance signed by the developer's chief technology officer, or a more senior corporate officer, that shall specify or provide, at a minimum: (i) an assessment of the nature and magnitude of critical harms that the covered model or covered model derivatives may reasonably cause or materially enable and the outcome of the assessment required by this section; (ii) an assessment of the risk that compliance with the safety and security protocol may be insufficient to prevent the covered model or covered model derivatives from causing or materially enabling critical harms; and (iii) a description of the process used by the signing officer to verify compliance with the requirements of this section, including a description of the materials reviewed by the signing officer, a description of testing or other evaluation performed to support the statement, and the contact information of any third parties relied upon to validate compliance.

A developer shall submit such statement to the attorney general not later than 30 days after using a covered model or covered model derivative for a purpose not exclusively related to the training or reasonable evaluation of the covered model for compliance with State or federal law or making a covered model or covered model derivative available for commercial, public, or foreseeably public use; provided, however, that no such initial statement shall be required for a covered model derivative if the developer submitted a compliant initial statement and any applicable annual statements for the covered model from which the covered model derivative is derived.
(i) A developer of a covered model shall report each artificial intelligence safety incident affecting the covered model or any covered model derivatives controlled by the developer to the attorney general within 72 hours of the developer learning of the artificial intelligence safety incident or of facts sufficient to establish a reasonable belief that an artificial intelligence safety incident has occurred.

(j) This section shall apply to the development, use, or commercial or public release of a covered model or covered model derivative for any use that is not the subject of a contract with a federal government entity, even if that covered model or covered model derivative was developed, trained, or used by a federal government entity; provided, however, that this section shall not apply to a product or service to the extent that compliance would strictly conflict with the terms of a contract between a federal government entity and the developer of a covered model.

(k) The Secretary may develop and propose a tiered compliance framework differentiating obligations based on computing scale, intended applications, societal impact, and organizational size. This framework shall be developed through stakeholder consultations and presented to the General Assembly with recommendations for potential adoption.

(l) A developer or covered entity may remain responsible for foreseeable critical harms arising from misuse or unintended use of a covered model or derivative, irrespective of whether such misuse involved fine-tuning. Covered entities may conduct and document pre-deployment risk assessments to identify and reasonably mitigate foreseeable misuse risks.
(m) Covered entities funded under this Act developing AI systems that significantly impact individuals' rights or access to critical services, such as employment, housing, education, or financial products, may conduct exploratory algorithmic fairness assessments to detect and mitigate potential bias. These assessments may be shared with stakeholders and the Department to inform future policy development.

(n) Covered entities may voluntarily explore methods for disclosing to end users when they are interacting with an artificial intelligence system, particularly where the nature of the interaction is not immediately obvious. Such entities may also explore labeling content generated by funded AI systems where there is potential for it to be mistaken for human-generated content. Findings from these explorations may be reported to the Department to inform future transparency guidelines.

"§ 143B-472.83C. Requirements for computer resource operators training covered models.

(a) A person that operates a computing cluster shall implement written policies and procedures to do all of the following when a customer utilizes computer resources which would be sufficient to train a covered model:

(1) Obtain the prospective customer's basic identifying information and business purpose for utilizing the computing cluster including, but not limited to: (i) the identity of the prospective customer; (ii) the means and source of payment, including any associated financial institution, credit card number, account number, customer identifier, transaction identifiers, or virtual currency wallet or wallet address identifier; and (iii) the email address and telephone number used to verify the prospective customer's identity.

(2) Assess whether the prospective customer intends to utilize the computing cluster to train a covered model.
(3) Maintain logs of significant access and administrative actions consistent with commercially reasonable cybersecurity practices.

(4) Maintain for not less than seven years, and provide to the attorney general upon request, appropriate records of actions taken under this section, including policies and procedures put into effect.

(5) Implement the capability to promptly enact a full shutdown of any resources being used to train or operate a covered model under the customer's control.

If a customer repeatedly utilizes computer resources that would be sufficient to train a covered model, the operator of the computing cluster shall validate said basic identifying information and assess whether such customer intends to utilize the computing cluster to train a covered model prior to each utilization.

(b) A person that operates a computing cluster shall consider industry best practices and applicable guidance from the National Institute of Standards and Technology, including the United States Artificial Intelligence Safety Institute, and other reputable standard-setting organizations.

(c) In complying with the requirements of this section, a person that operates a computing cluster may impose reasonable requirements on customers to prevent the collection or retention of personal information that the person operating such computing cluster would not otherwise collect or retain, including a requirement that a corporate customer submit corporate contact information rather than information that would identify a specific individual.

"§ 143B-472.83D. Enforcement.

(a) The attorney general shall have the authority to enforce the provisions of this Part.
Except as specifically provided in this Part, nothing in this Part shall be construed as creating a new private right of action or serving as the basis for a private right of action that would not otherwise have had a basis under any other law but for the enactment of this Part. This Part neither relieves any party from any duties or obligations imposed nor alters any independent rights that individuals have under State or federal laws, the North Carolina Constitution, or the United States Constitution.

The attorney general may initiate a civil action in the superior court against an entity in the name of the State or on behalf of individuals for a violation of this Part. The attorney general may seek:

(1) Against a developer of a covered model or covered model derivative, for a violation that causes death or bodily harm to another human, harm to property, theft or misappropriation of property, or that constitutes an imminent risk or threat to public safety, and that occurs on or after January 1, 2026, a civil penalty in an amount not exceeding (i) for a first violation, five percent (5%) of the cost of the quantity of computing power used to train the covered model, to be calculated using the average market prices of cloud compute at the time of training, or (ii) for any subsequent violation, fifteen percent (15%) of the cost of the quantity of computing power used to train the covered model as calculated herein.
(2) Against an investigator for a violation of this Part, including an investigator who intentionally or with reckless disregard violates any of such investigator's responsibilities, or against a person that operates a computing cluster in violation of this Part, a civil penalty in an amount not exceeding (i) twenty-five thousand dollars ($25,000) for a first offense; (ii) fifty thousand dollars ($50,000) for any subsequent violation; and (iii) five million dollars ($5,000,000) in the aggregate for related violations.

(3) Injunctive or declaratory relief.

(4) Such monetary or punitive damages as the court may allow.

(5) Attorney's fees and costs.

(6) Any other relief that the court deems appropriate.

(b) In determining whether a developer exercised reasonable care in the creation, use, or deployment of a covered model or covered model derivative, the attorney general shall consider all of the following:

(1) The quality of such developer's safety and security protocol.

(2) The extent to which the developer faithfully implemented and followed its safety and security protocol.

(3) Whether, in quality and implementation, the developer's safety and security protocol was comparable to those of developers of models trained using a comparable amount of compute resources.

(4) The quality and rigor of the developer's investigation, documentation, evaluation, and management of risks of critical harm posed by its model.

(c) A provision within a contract or agreement that seeks to waive, preclude, or burden the enforcement of liability arising from a violation of this Part, or to shift such liability to any person or entity in exchange for their use or access of, or right to use or access, a developer's product or services, including by means of a contract of adhesion, shall be deemed to be against public policy and void.
Notwithstanding any corporate formalities, the court shall impose joint and several liability on affiliated entities for purposes of effectuating the intent of this section to the maximum extent permitted by law if the court concludes all of the following:

(1) The affiliated entities, in the development of the corporate structure among such affiliated entities, took steps to purposely and unreasonably limit or avoid liability.

(2) As a result of any such steps, the corporate structure of the developer or affiliated entities would frustrate recovery of penalties, damages, or injunctive relief under this section.

(d) Penalties collected pursuant to this section by the attorney general shall be deposited into the General Fund and subject to appropriation.

"§ 143B-472.83E. Cooperation with Attorney General.

(a) For purposes of this section, the following definitions apply:

(1) Contractor or subcontractor. – A firm, corporation, partnership, or association and its responsible managing officer, as well as any supervisors, managers, or officers found by the attorney general or director to be personally and substantially responsible for the rights and responsibilities of employees under this section.

(2) Employee. – Any person who performs services for wages or salary under a contract of employment, express or implied, for an employer, including:

a. Contractors or subcontractors and unpaid advisors involved with assessing, managing, or addressing the risk of critical harm from covered models or covered model derivatives.

b. Corporate officers.
(b) A developer of a covered model or a contractor or subcontractor of the developer shall not:
(1) Prevent an employee from disclosing information to the attorney general or any other public body, including through terms and conditions of employment or seeking to enforce terms and conditions of employment, if the employee has reasonable cause to believe the information indicates that (i) the developer is out of compliance with the requirements of this section or (ii) an artificial intelligence model, including a model that is not a covered model or a covered model derivative, poses an unreasonable risk of causing or materially enabling critical harm, even if the employer is not out of compliance with any State or federal law.
(2) Retaliate against an employee for disclosing such information to the attorney general or any other public body.
(3) Make false or materially misleading statements related to its safety and security protocol in any manner that would constitute an unfair or deceptive trade practice.
(c) An employee harmed by a violation of this section may petition the court for appropriate relief.
(d) The attorney general may publicly release any complaint, or a summary of such complaint, filed pursuant to this section if the attorney general concludes that doing so will serve the public interest; provided, however, that any information that is confidential, qualifies as a trade secret, or is determined by the attorney general to likely pose an unreasonable risk to public safety if disclosed shall be redacted from the complaint prior to disclosure.
(e) A developer shall provide a clear notice to all employees working on covered models and covered model derivatives of their rights and responsibilities under this section, including the rights of employees of contractors and subcontractors to utilize the developer's internal process for making protected disclosures pursuant to subsection (f). A developer is presumed to be in compliance with the requirements of this subsection if the developer:
(1) At all times posts and displays within all workplaces maintained by the developer a notice to all employees of their rights and responsibilities under this section, ensures that all new employees receive equivalent notice, and ensures that employees who work remotely periodically receive an equivalent notice; or
(2) At least annually, provides written notice to all employees of their rights and responsibilities under this section and ensures that such notice is received and acknowledged by all of those employees.
(f) A developer shall provide a reasonable internal process through which an employee, contractor, subcontractor, or employee of a contractor or subcontractor working on a covered model or covered model derivative may anonymously disclose information to the developer if the employee believes, in good faith, that the developer has violated any provision of this Chapter or any other general or special law, has made false or materially misleading statements related to its safety and security protocol, or has failed to disclose known risks to employees. The developer shall conduct an investigation related to any information disclosed through such process and provide, at a minimum, a monthly update to the person who made the disclosure regarding the status of the developer's investigation of the disclosure and the actions taken by the developer in response to the disclosure.
Any disclosure and response created pursuant to this subsection shall be maintained for not less than seven years from the date when the disclosure or response is created. Each disclosure and response shall be shared with officers and directors of the developer whose acts or omissions are not implicated by the disclosure or response not less than once per quarter. In the case of a report or disclosure regarding alleged misconduct by a contractor or subcontractor, the developer shall notify the officers and directors of the contractor or subcontractor whose acts or omissions are not implicated by the disclosure or response about the status of their investigation not less than once per quarter.
"§ 143B-472.83F. Reporting and regulation.
The Secretary shall file an annual report not later than January 31 with the General Assembly containing: (i) statistical information on the current workforce population in the business of the development of artificial intelligence and in adjacent technology sectors; (ii) any known workforce shortages in the development or deployment of artificial intelligence; (iii) summary information related to the efficacy of existing workforce development programs in artificial intelligence and related sectors, if any; (iv) summary information related to the availability of relevant training programs available in the State, including any known gaps in such programs generally available to members of the public; and (v) any plans, including recommendations for legislation, if any, to remedy any such known workforce shortages.
The Secretary shall promulgate regulations for the implementation, administration and enforcement of this Part; provided, however, that the Secretary may convene an advisory board for the purposes of: (i) studying the impact of artificial intelligence on the State, including with respect to its employees, constituents, private businesses and higher education institutions; (ii) conducting outreach and collecting input from stakeholders and experts; (iii) studying current and emerging capability for critical harms made possible by artificial intelligence developed or deployed in the State; or (iv) advising the Governor and General Assembly on recommended legislation or regulations related to the growth of the artificial intelligence industry and prevention of critical harms.
Not less than annually, the Secretary shall do all of the following:
(1) Update, by regulation, the initial compute threshold and the fine-tuning compute threshold that an artificial intelligence model shall meet to be considered a covered model, taking into account: (i) the quantity of computing power used to train models that have been identified as being reasonably likely to cause or materially enable a critical harm; (ii) similar thresholds used in federal law, guidance or regulations for the management of artificial intelligence models with reasonable risks of causing or enabling critical harms; and (iii) input from stakeholders, including academics, industry, the open-source community and government entities.
(2) Update, by regulation, binding investigation requirements applicable to investigations conducted pursuant to this Part to ensure the integrity, independence, efficiency and effectiveness of the investigation process, taking into account: (i) relevant standards or requirements imposed under federal or State law or through self-regulatory or standards-setting bodies; (ii) input from stakeholders, including academic, industry and government entities, including from the open-source community; and (iii) consistency with guidance issued by the National Institute of Standards and Technology, including the United States Artificial Intelligence Safety Institute.
(3) Issue guidance for preventing unreasonable risks of covered models and covered model derivatives causing or materially enabling critical harms, including, but not limited to, more specific components of, or requirements under, the duties required under this Part. Such guidance shall be consistent with guidance issued by the National Institute of Standards and Technology, including the United States Artificial Intelligence Safety Institute."
SECTION 2. There is appropriated from the General Fund to the Department of Commerce the nonrecurring sum of seven hundred fifty thousand dollars ($750,000) for the 2025-2026 fiscal year to accomplish the purposes of this act.
SECTION 3. This act becomes effective July 1, 2025.