CALIFORNIA LEGISLATURE 2025-2026 REGULAR SESSION

Assembly Bill No. 1064

Introduced by Assembly Member Bauer-Kahan
February 20, 2025

An act to add Chapter 25.1 (commencing with Section 22757.20) to Division 8 of the Business and Professions Code, relating to artificial intelligence.

LEGISLATIVE COUNSEL'S DIGEST

AB 1064, as introduced, Bauer-Kahan. Leading Ethical AI Development (LEAD) for Kids Act.

The California AI Transparency Act requires a person that creates, codes, or otherwise produces a generative artificial intelligence system that has over 1,000,000 monthly visitors or users and is publicly accessible within the geographic boundaries of the state to make available an AI detection tool at no cost to the user that, among other things, allows a user to assess whether image, video, or audio content, or content that is any combination thereof, was created or altered by the covered provider's generative artificial intelligence system.

This bill, the Leading Ethical AI Development (LEAD) for Kids Act, would establish, and provide for the membership of, the LEAD for Kids Standards Board in the Government Operations Agency and require the Governor to appoint an executive officer of the board, subject to Senate confirmation, who would hold the office at the pleasure of the Governor. The act would require, on or before January 1, 2027, the board to adopt regulations governing, among other things, criteria for determining the level of estimated risk of a covered product based on an analysis that weighs the likelihood and severity of reasonably foreseeable adverse impacts against the anticipated benefits of the covered product and denominating the risk levels, as prescribed.
The act would define "covered product" to mean an artificial intelligence system that is intended to, or highly likely to, be used by children, pursuant to regulations adopted by the board. The act would, among other things, require, on or before July 1, 2027, a developer to do certain things with respect to a covered product, including preparing and submitting to the board a risk level assessment in order to determine the appropriate risk classification of the covered product. The act would authorize the board to refer violations of the act to the Attorney General and would authorize the Attorney General to recover a certain civil penalty, as prescribed. The act would authorize a child who suffers actual harm as a result of a violation of the act, or a parent or guardian acting on behalf of that child, to bring a civil action to recover, among other relief, actual damages. The act would create in the State Treasury the LEAD for Kids AI Fund into which any civil penalty recovered by the Attorney General pursuant to the act is deposited and would make the moneys in the fund available, only upon appropriation by the Legislature, for the purpose of administering the act.

Digest Key

Vote: MAJORITY   Appropriation: NO   Fiscal Committee: YES   Local Program: NO

Bill Text

The people of the State of California do enact as follows:

SECTION 1. Chapter 25.1 (commencing with Section 22757.20) is added to Division 8 of the Business and Professions Code, to read:

CHAPTER 25.1. Leading Ethical AI Development (LEAD) for Kids

22757.20. This chapter shall be known as the Leading Ethical AI Development (LEAD) for Kids Act.

22757.21. 
For purposes of this chapter:

(a) "Adverse impacts" are significant negative impacts to a child's health, safety, privacy, educational opportunities or outcomes, or access to essential services or benefits.

(b) "Artificial intelligence" means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.

(c) "Board" means the LEAD for Kids Standards Board created pursuant to this chapter.

(d) "Child" means a natural person under 18 years of age who resides in this state.

(e) "Covered product" means an artificial intelligence system that is intended to, or highly likely to, be used by children, pursuant to regulations adopted by the board.

(f) "Deployer" means a person, partnership, state or local governmental agency, corporation, or developer, or any contractor or agent of those entities, that uses a covered product for a commercial or public purpose.

(g) "Developer" means a person, partnership, state or local governmental agency, corporation, or deployer that designs, codes, substantially modifies, or otherwise produces a covered product.

(h) "Incident" means a discrete occurrence of an adverse impact to a child caused by a covered product.

(i) "Personal information" has the meaning defined in Section 1798.140 of the Civil Code.

(j) "Prohibited covered product" means a product that poses a prohibited risk pursuant to regulations adopted by the board.

(k) "Risk" means the composite measure of an event's likelihood of occurring and the magnitude or degree of the consequences of the corresponding event.

(l) "Risk level assessment" means a structured evaluation of an artificial intelligence's known or reasonably foreseeable risks to children.

(m) "Substantially modifies" means to create a new version, release, update, or other modification to a covered product that materially changes its uses or outputs.

(n) "System information label" means a 
consumer-facing label that includes information about a covered product's purpose, functioning, data sources, and risk level.

(o) "Trade secrets" has the meaning defined in Section 3426.1 of the Civil Code.

22757.22. (a) (1) There is hereby established the LEAD for Kids Standards Board in the Government Operations Agency. The Governor shall appoint an executive officer of the board, subject to Senate confirmation, who shall hold the office at the pleasure of the Governor. The executive officer shall be the administrative head of the board and shall exercise all duties and functions necessary to ensure that the responsibilities of the board are successfully discharged.

(2) The board shall be composed of the following nine members:

(A) A member of academia appointed by the Governor and subject to Senate confirmation.

(B) A technologist appointed by the Governor and subject to Senate confirmation.

(C) A member of civil society appointed by the Governor and subject to Senate confirmation.

(D) An expert in technology ethics appointed by the Governor and subject to Senate confirmation.

(E) An expert in education appointed by the Governor and subject to Senate confirmation.

(F) A member of academia with expertise in artificial intelligence appointed by the Speaker of the Assembly.

(G) A member of academia with expertise in social science appointed by the Speaker of the Assembly.

(H) Two members appointed by the Senate Committee on Rules.

(3) A member of the board shall meet all of the following criteria:

(A) A member shall be free of direct and indirect external influence and shall not seek or take instructions from another.

(B) A member shall not take an action or engage in an occupation, whether gainful or not, that is incompatible with the member's duties.

(C) A member shall not, either at the time of the member's appointment or during the member's term, have a financial interest in an entity that is subject to regulation by the board.

(4) A member of the board shall serve at the pleasure 
of the appointing authority but shall serve for no longer than eight consecutive years.

(b) (1) The board shall ensure that regulations adopted pursuant to this chapter are consistent with widely accepted standards for governance of artificial intelligence, taking into account technological standards, technological advances, scientific literature and advances, and societal changes as they pertain to risks posed to children by covered products.

(2) The board shall consult with individuals from the public who possess expertise directly related to the board's functions, including technical, ethical, regulatory, and other relevant areas.

(c) On or before January 1, 2027, the board shall adopt regulations governing all of the following:

(1) Criteria for developers to determine if an artificial intelligence system is subject to this chapter.

(2) Criteria for determining the level of estimated risk of a covered product based on an analysis that weighs the likelihood and severity of reasonably foreseeable adverse impacts against the anticipated benefits of the covered product and denominating the risk levels pursuant to all of the following:

(A) Prohibited risk, which shall be applied to a covered product for which the costs of foreseeable adverse impacts likely outweigh the benefits and includes, but is not limited to, all of the following:

(i) Anthropomorphic chatbots that offer companionship and are likely to cause the child to develop an ongoing emotional attachment or to manipulate the child's behavior in harmful ways.

(ii) Artificial intelligence used in educational settings that collects or processes biometric data of children.

(iii) Social scoring systems based on a child's behavior or personal characteristics.

(iv) Artificial intelligence that detects the emotions of children.

(v) Artificial intelligence used to develop facial recognition databases through untargeted scraping of children's facial images from the internet or surveillance footage.

(B) High risk, which shall be 
applied to a covered product for which the benefits may outweigh the costs of foreseeable adverse impacts and includes, but is not limited to, using artificial intelligence to do any of the following:

(i) Perform a function related to pupil assessment or discipline.

(ii) Target advertisements to children.

(iii) For a specific purpose that would otherwise qualify as a prohibited risk, as set forth in regulations adopted by the board, provided that the use is strictly necessary to prevent threats to health or safety.

(C) Moderate risk, which shall be applied to a covered product for which the benefits convincingly outweigh the costs of foreseeable adverse impacts.

(D) Low risk, which shall be applied to a covered product for which there are few, if any, foreseeable adverse impacts.

(3) Guidance for developers to classify covered products according to risk level, as described in paragraph (2).

(4) Reasonable steps a developer of a prohibited risk covered product is required to take to ensure that children are not able to access the product.

(5) Requirements for predeployment and postdeployment assessments, including, but not limited to, the purpose for which the covered product is intended, technical capabilities, limitations and functionality, specific adverse impacts, internal governance, and the timing for the development and submission to the board of those evaluations and assessments. 
The board shall also provide guidance to avoid duplication of efforts with respect to any other state or federal laws that require similar documentation.

(6) Requirements for artificial intelligence information labels to ensure that, for each covered product, the public is able to access baseline information on the covered product, including the covered product's purpose, a description of how it works, its risk level, potential adverse impacts, and any other information necessary to assess the impact of the system on children.

(7) Standards for audits of covered products, including the timing of audits, qualifications and training of auditors, rules governing auditor independence and oversight, and audit reports that auditors are required to provide to the board. The board shall also establish rules for the protection of trade secrets in connection with the performance of audits.

(8) The creation of an incident reporting mechanism that enables third parties to report incidents of adverse impacts resulting from the use of a covered product directly to a developer or the board.

(9) The creation of a publicly accessible registry for covered products that contains high-level summaries of audit reports, incident reports, system information labels, and any additional information specified by the board.

(10) (A) Registration fees for developers that do not exceed the reasonable regulatory costs incident to administering this chapter.

(B) A registration fee described by this paragraph shall be deposited into the LEAD for Kids AI Fund established pursuant to Section 22757.27.

22757.23. 
(a) On or before July 1, 2027, a developer shall do all of the following with respect to a covered product:

(1) Register the covered product using the registry developed by the board.

(2) Prepare and submit to the board a risk level assessment in order to determine the appropriate risk classification of the covered product.

(3) Develop and implement an artificial intelligence information label for the covered product.

(b) In addition to the duties required under subdivision (a), all of the following apply:

(1) With respect to a covered product that poses a prohibited risk, the developer shall take reasonable steps to ensure that children are not able to access the product.

(2) With respect to a high-risk covered product, the developer shall conduct predeployment and postdeployment assessments pursuant to the requirements established by the board.

(c) With respect to incident reports, a developer shall do all of the following:

(1) Within 30 days of learning of an incident, file a report with the board with any required information.

(2) Within 30 days of the substantiation of the incident by the board, file a description of the incident on the developer's internet website.

(d) With respect to licensing the covered product to deployers, a developer shall do both of the following:

(1) Ensure that the terms of the license require it to be used in a manner that would not change the covered product's risk level to a higher level of risk.

(2) Revoke the license if the developer knows, or should know, that the deployer is using the covered product in a manner that is inconsistent with the terms required under paragraph (1).

(e) A developer shall not train a covered product with the personal information of a child unless the child's parent or guardian has affirmatively provided written consent to the developer to use the child's personal information for that specific purpose.

(f) (1) On or after July 1, 2027, a developer shall submit a covered product it develops to an independent third-party 
audit on a schedule determined by the board according to the risk level posed by the covered product.

(2) A developer whose covered product is subject to an audit shall provide the auditor with all necessary documentation and information for the auditor to perform the audit.

(3) If an auditor discovers substantial noncompliance with this chapter, the auditor shall promptly notify the board.

22757.24. (a) A deployer of a prohibited risk covered product shall implement any applicable procedures adopted by the developer to ensure that a child is not able to access the product.

(b) A deployer of a covered product shall publicly display developer license usage requirements. A deployer's usage requirements shall not change the covered product's risk level to a higher level of risk.

(c) With respect to incident reports, a deployer shall do both of the following:

(1) Within 30 days of learning of the incident, file a report with the board with any required information.

(2) Within 30 days of the substantiation of the incident by the board, file a description of the incident on the deployer's internet website.

(d) A deployer shall not opt in to a data sharing agreement that allows the developer to train a covered product with the personal information of a child unless the child's parent or guardian has affirmatively provided written consent to the deployer to use the child's personal information for that specific purpose.

22757.25. A developer or deployer, or any contractor or subcontractor of a developer or deployer, shall not do any of the following:

(a) Prevent an employee from disclosing information to the Attorney General pertaining to a reasonable belief supporting the existence of a potential violation of this chapter.

(b) Retaliate against an employee for disclosing information under subdivision (a).

(c) Make false or materially misleading statements related to its compliance with obligations imposed under this chapter.

22757.26. 
(a) The board may refer violations of this chapter to the Attorney General.

(b) With respect to violations related to the risk level classification of a covered product, the board may allow the developer to take corrective action if the board determines that the circumstances indicate that the erroneous classification was neither unreasonable nor in bad faith. If the developer fails to take corrective action within 30 days, the board may refer the matter to the Attorney General.

(c) Upon receiving a referral from the board, the Attorney General may bring an action for all of the following:

(1) A civil penalty of twenty-five thousand dollars ($25,000) for each violation.

(2) Injunctive or declaratory relief.

(3) Reasonable attorney's fees.

(d) A child who suffers actual harm as a result of a violation of this chapter, or a parent or guardian acting on behalf of that child, may bring a civil action to recover all of the following:

(1) Actual damages.

(2) Punitive damages.

(3) Reasonable attorney's fees and costs.

(4) Injunctive or declaratory relief.

(5) Any other relief the court deems proper.

22757.27. (a) There is hereby created in the State Treasury the LEAD for Kids AI Fund into which any civil penalty recovered by the Attorney General pursuant to Section 22757.26 shall be deposited.

(b) Moneys in the fund shall be available, only upon appropriation by the Legislature, for the purpose of administering this chapter.
(a) On or before July 1, 2027, a developer shall do all of the following with respect to a covered product:(1) Register the covered product using the registry developed by the board.(2) Prepare and submit to the board a risk level assessment in order to determine the appropriate risk classification of the covered product.(3) Develop and implement an artificial intelligence information label for the covered product.(b) In addition to the duties required under subdivision (a), all of the following apply:(1) With respect to a covered product that poses a prohibited risk, the developer shall take reasonable steps to ensure that children are not able to access the product.(2) With respect to a high-risk covered product, the developer shall conduct predeployment and postdeployment assessments pursuant to the requirements established by the board.(c) With respect to incident reports, a developer shall do all of the following:(1) Within 30 days of learning of an incident, file a report with the board with any required information.(2) Within 30 days of the substantiation of the incident by the board, file a description of the incident on the developers internet website.(d) With respect to licensing the covered product to deployers, a developer shall do both of the following:(1) Ensure that the terms of the license require it to be used in a manner that would not change the covered products risk level to a higher level of risk.(2) Revoke the license if the developer knows, or should know, that the deployer is using the covered product in a manner that is inconsistent with the terms required under paragraph (1).(e) A developer shall not train a covered product with the personal information of a child unless the childs parent or guardian has affirmatively provided written consent to the developer to use the childs personal information for that specific purpose.(f) (1) On or after July 1, 2027, a developer shall submit a covered product it develops to an independent third party 
audit on a schedule determined by the board according to the risk level posed by the covered product.(2) A developer whose covered product is subject to an audit shall provide the auditor with all necessary documentation and information for the auditor to perform the audit.(3) If an auditor discovers substantial noncompliance with this chapter, the auditor shall promptly notify the board.22757.24. (a) A deployer of a prohibited risk covered product shall implement any applicable procedures adopted by the developer to ensure that a child is not able to access the product.(b) A deployer of a covered product shall publicly display developer license usage requirements. A deployers usage requirements shall not change the covered products risk level to a higher level of risk.(c) With respect to incident reports, a deployer shall do both of the following:(1) Within 30 days of learning of the incident, file a report with the board with any required information.(2) Within 30 days of the substantiation of the incident by the board, file a description of the incident on the deployers internet website.(d) A deployer shall not opt in to a data sharing agreement that allows the developer to train a covered product with the personal information of a child unless the childs parent or guardian has affirmatively provided written consent to the deployer to use the childs personal information for that specific purpose.22757.25. A developer or deployer, or any contractor or subcontractor of a developer or deployer, shall not do any of the following:(a) Prevent an employee from disclosing information to the Attorney General pertaining to a reasonable belief supporting the existence of a potential violation of this chapter.(b) Retaliate against an employee for disclosing information under subdivision (a).(c) Make false or materially misleading statements related to its compliance with obligations imposed under this chapter.22757.26. 
(a) The board may refer violations of this chapter to the Attorney General.(b) With respect to violations related to the risk level classification of a covered product, the board may allow the developer to take corrective action if the board determines that the circumstances indicate that the erroneous classification was neither unreasonable nor in bad faith. If the developer fails to do so within 30 days, the board may refer the matter to the Attorney General.(c) Upon receiving a referral from the board, the Attorney General may bring an action for all of the following: (1) A civil penalty of twenty-five thousand dollars ($25,000) for each violation.(2) Injunctive or declaratory relief.(3) Reasonable attorneys fees.(d) A child who suffers actual harm as a result of a violation of this chapter, or a parent or guardian acting on behalf of that child, may bring a civil action to recover all of the following:(1) Actual damages.(2) Punitive damages.(3) Reasonable attorneys fees and costs.(4) Injunctive or declaratory relief.(5) Any other relief the court deems proper.22757.27. (a) There is hereby created in the State Treasury the LEAD for Kids AI Fund into which any civil penalty recovered by the Attorney General pursuant to Section 22757.26 shall be deposited.(b) Moneys in the fund shall be available, only upon appropriation by the Legislature, for the purpose of administering this chapter. The people of the State of California do enact as follows: ## The people of the State of California do enact as follows: SECTION 1. Chapter 25.1 (commencing with Section 22757.20) is added to Division 8 of the Business and Professions Code, to read: CHAPTER 25.1. Leading Ethical AI Development (LEAD) for Kids22757.20. This chapter shall be known as the Leading Ethical AI Development (LEAD) for Kids Act.22757.21. 
For purposes of this chapter:
(a) "Adverse impacts" are significant negative impacts to a child's health, safety, privacy, educational opportunities or outcomes, or access to essential services or benefits.
(b) "Artificial intelligence" means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.
(c) "Board" means the LEAD for Kids Standards Board created pursuant to this chapter.
(d) "Child" means a natural person under 18 years of age who resides in this state.
(e) "Covered product" means an artificial intelligence system that is intended to, or highly likely to, be used by children, pursuant to regulations adopted by the board.
(f) "Deployer" means a person, partnership, state or local governmental agency, corporation, or developer, or any contractor or agent of those entities, that uses a covered product for a commercial or public purpose.
(g) "Developer" means a person, partnership, state or local governmental agency, corporation, or deployer that designs, codes, substantially modifies, or otherwise produces a covered product.
(h) "Incident" means a discrete occurrence of an adverse impact to a child caused by a covered product.
(i) "Personal information" has the meaning defined in Section 1798.140 of the Civil Code.
(j) "Prohibited covered product" means a product that poses a prohibited risk pursuant to regulations adopted by the board.
(k) "Risk" means the composite measure of an event's likelihood of occurring and the magnitude or degree of the consequences of the corresponding event.
(l) "Risk level assessment" means a structured evaluation of an artificial intelligence's known or reasonably foreseeable risks to children.
(m) "Substantially modifies" means to create a new version, release, update, or other modification to a covered product that materially changes its uses or outputs.
(n) "System information label" means a
consumer-facing label that includes information about a covered product's purpose, functioning, data sources, and risk level.
(o) "Trade secrets" has the meaning defined in Section 3426.1 of the Civil Code.
22757.22. (a) (1) There is hereby established the LEAD for Kids Standards Board in the Government Operations Agency. The Governor shall appoint an executive officer of the board, subject to Senate confirmation, who shall hold the office at the pleasure of the Governor. The executive officer shall be the administrative head of the board and shall exercise all duties and functions necessary to ensure that the responsibilities of the board are successfully discharged.
(2) The board shall be composed of the following nine members:
(A) A member of academia appointed by the Governor and subject to Senate confirmation.
(B) A technologist appointed by the Governor and subject to Senate confirmation.
(C) A member of civil society appointed by the Governor and subject to Senate confirmation.
(D) An expert in technology ethics appointed by the Governor and subject to Senate confirmation.
(E) An expert in education appointed by the Governor and subject to Senate confirmation.
(F) A member of academia with expertise in artificial intelligence appointed by the Speaker of the Assembly.
(G) A member of academia with expertise in social science appointed by the Speaker of the Assembly.
(H) Two members appointed by the Senate Committee on Rules.
(3) A member of the board shall meet all of the following criteria:
(A) A member shall be free of direct and indirect external influence and shall not seek or take instructions from another.
(B) A member shall not take an action or engage in an occupation, whether gainful or not, that is incompatible with the member's duties.
(C) A member shall not, either at the time of the member's appointment or during the member's term, have a financial interest in an entity that is subject to regulation by the board.
(4) A member of the board shall serve at the pleasure
of the appointing authority but shall serve for no longer than eight consecutive years.
(b) (1) The board shall ensure that regulations adopted pursuant to this chapter are consistent with widely accepted standards for governance of artificial intelligence, taking into account technological standards, technological advances, scientific literature and advances, and societal changes as they pertain to risks posed to children by covered products.
(2) The board shall consult with individuals from the public who possess expertise directly related to the board's functions, including technical, ethical, regulatory, and other relevant areas.
(c) On or before January 1, 2027, the board shall adopt regulations governing all of the following:
(1) Criteria for developers to determine if an artificial intelligence system is subject to this chapter.
(2) Criteria for determining the level of estimated risk of a covered product based on an analysis that weighs the likelihood and severity of reasonably foreseeable adverse impacts against the anticipated benefits of the covered product and denominating the risk levels pursuant to all of the following:
(A) Prohibited risk, which shall be applied to a covered product for which the costs of foreseeable adverse impacts likely outweigh the benefits and includes, but is not limited to, all of the following:
(i) Anthropomorphic chatbots that offer companionship and are likely to cause the child to develop an ongoing emotional attachment or to manipulate the child's behavior in harmful ways.
(ii) Artificial intelligence used in educational settings that collects or processes biometric data of children.
(iii) Social scoring systems based on a child's behavior or personal characteristics.
(iv) Artificial intelligence that detects the emotions of children.
(v) Artificial intelligence used to develop facial recognition databases through untargeted scraping of children's facial images from the internet or surveillance footage.
(B) High risk, which shall be
applied to a covered product for which the benefits may outweigh the costs of foreseeable adverse impacts and includes, but is not limited to, using artificial intelligence to do any of the following:
(i) Perform a function related to pupil assessment or discipline.
(ii) Target advertisements to children.
(iii) For a specific purpose that would otherwise qualify as a prohibited risk, as set forth in regulations adopted by the board, provided that the use is strictly necessary to prevent threats to health or safety.
(C) Moderate risk, which shall be applied to a covered product for which the benefits convincingly outweigh the costs of foreseeable adverse impacts.
(D) Low risk, which shall be applied to a covered product for which there are few, if any, foreseeable adverse impacts.
(3) Guidance for developers to classify covered products according to risk level, as described in paragraph (2).
(4) Reasonable steps a developer of a prohibited risk covered product is required to take to ensure that children are not able to access the product.
(5) Requirements for predeployment and postdeployment assessments, including, but not limited to, the purpose for which the covered product is intended, technical capabilities, limitations and functionality, specific adverse impacts, internal governance, and the timing for the development and submission to the board of those evaluations and assessments.
The board shall also provide guidance to avoid duplication of efforts with respect to any other state or federal laws that require similar documentation.
(6) Requirements for artificial intelligence information labels to ensure that, for each covered product, the public is able to access baseline information on the covered product, including the covered product's purpose, a description of how it works, its risk level, potential adverse impacts, and any other information necessary to assess the impact of the system on children.
(7) Standards for audits of covered products, including the timing of audits, qualifications and training of auditors, rules governing auditor independence and oversight, and audit reports that auditors are required to provide to the board. The board shall also establish rules for the protection of trade secrets in connection with the performance of audits.
(8) The creation of an incident reporting mechanism that enables third parties to report incidents of adverse impacts resulting from the use of a covered product directly to a developer or the board.
(9) The creation of a publicly accessible registry for covered products that contains high-level summaries of audit reports, incident reports, system information labels, and any additional information specified by the board.
(10) (A) Registration fees for developers that do not exceed the reasonable regulatory costs incident to administering this chapter.
(B) A registration fee described by this paragraph shall be deposited into the LEAD for Kids AI Fund established pursuant to Section 22757.27.
22757.23.
(a) On or before July 1, 2027, a developer shall do all of the following with respect to a covered product:
(1) Register the covered product using the registry developed by the board.
(2) Prepare and submit to the board a risk level assessment in order to determine the appropriate risk classification of the covered product.
(3) Develop and implement an artificial intelligence information label for the covered product.
(b) In addition to the duties required under subdivision (a), all of the following apply:
(1) With respect to a covered product that poses a prohibited risk, the developer shall take reasonable steps to ensure that children are not able to access the product.
(2) With respect to a high-risk covered product, the developer shall conduct predeployment and postdeployment assessments pursuant to the requirements established by the board.
(c) With respect to incident reports, a developer shall do all of the following:
(1) Within 30 days of learning of an incident, file a report with the board with any required information.
(2) Within 30 days of the substantiation of the incident by the board, file a description of the incident on the developer's internet website.
(d) With respect to licensing the covered product to deployers, a developer shall do both of the following:
(1) Ensure that the terms of the license require it to be used in a manner that would not change the covered product's risk level to a higher level of risk.
(2) Revoke the license if the developer knows, or should know, that the deployer is using the covered product in a manner that is inconsistent with the terms required under paragraph (1).
(e) A developer shall not train a covered product with the personal information of a child unless the child's parent or guardian has affirmatively provided written consent to the developer to use the child's personal information for that specific purpose.
(f) (1) On or after July 1, 2027, a developer shall submit a covered product it develops to an independent third-party
audit on a schedule determined by the board according to the risk level posed by the covered product.
(2) A developer whose covered product is subject to an audit shall provide the auditor with all necessary documentation and information for the auditor to perform the audit.
(3) If an auditor discovers substantial noncompliance with this chapter, the auditor shall promptly notify the board.
22757.24. (a) A deployer of a prohibited risk covered product shall implement any applicable procedures adopted by the developer to ensure that a child is not able to access the product.
(b) A deployer of a covered product shall publicly display developer license usage requirements. A deployer's usage requirements shall not change the covered product's risk level to a higher level of risk.
(c) With respect to incident reports, a deployer shall do both of the following:
(1) Within 30 days of learning of the incident, file a report with the board with any required information.
(2) Within 30 days of the substantiation of the incident by the board, file a description of the incident on the deployer's internet website.
(d) A deployer shall not opt in to a data sharing agreement that allows the developer to train a covered product with the personal information of a child unless the child's parent or guardian has affirmatively provided written consent to the deployer to use the child's personal information for that specific purpose.
22757.25. A developer or deployer, or any contractor or subcontractor of a developer or deployer, shall not do any of the following:
(a) Prevent an employee from disclosing information to the Attorney General pertaining to a reasonable belief supporting the existence of a potential violation of this chapter.
(b) Retaliate against an employee for disclosing information under subdivision (a).
(c) Make false or materially misleading statements related to its compliance with obligations imposed under this chapter.
22757.26.
(a) The board may refer violations of this chapter to the Attorney General.
(b) With respect to violations related to the risk level classification of a covered product, the board may allow the developer to take corrective action if the board determines that the circumstances indicate that the erroneous classification was neither unreasonable nor in bad faith. If the developer fails to do so within 30 days, the board may refer the matter to the Attorney General.
(c) Upon receiving a referral from the board, the Attorney General may bring an action for all of the following:
(1) A civil penalty of twenty-five thousand dollars ($25,000) for each violation.
(2) Injunctive or declaratory relief.
(3) Reasonable attorney's fees.
(d) A child who suffers actual harm as a result of a violation of this chapter, or a parent or guardian acting on behalf of that child, may bring a civil action to recover all of the following:
(1) Actual damages.
(2) Punitive damages.
(3) Reasonable attorney's fees and costs.
(4) Injunctive or declaratory relief.
(5) Any other relief the court deems proper.
22757.27. (a) There is hereby created in the State Treasury the LEAD for Kids AI Fund into which any civil penalty recovered by the Attorney General pursuant to Section 22757.26 shall be deposited.
(b) Moneys in the fund shall be available, only upon appropriation by the Legislature, for the purpose of administering this chapter.
For purposes of this chapter:(a) Adverse impacts are significant negative impacts to a childs health, safety, privacy, educational opportunities or outcomes, or access to essential services or benefits.(b) Artificial intelligence means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.(c) Board means the LEAD for Kids Standards Board created pursuant to this chapter.(d) Child means a natural person under 18 years of age who resides in this state.(e) Covered product means an artificial intelligence system that is intended to, or highly likely to, be used by children, pursuant to regulations adopted by the board.(f) Deployer means a person, partnership, state or local governmental agency, corporation, or developer, or any contract or agent of those entities, that uses a covered product for a commercial or public purpose.(g) Developer means a person, partnership, state or local governmental agency, corporation, or deployer that designs, codes, substantially modifies, or otherwise produces a covered product.(h) Incident means a discreet occurrence of an adverse impact to a child caused by a covered product.(i) Personal information has the meaning defined in Section 1798.140 of the Civil Code.(j) Prohibited covered product means a product that poses a prohibited risk pursuant to regulations adopted by the board.(k) Risk means the composite measure of an events likelihood of occurring and the magnitude or degree of the consequences of the corresponding event.(l) Risk level assessment means a structured evaluation of an artificial intelligences known or reasonably foreseeable risks to children.(m) Substantially modifies means to create a new version, release, update, or other modification to a covered product that materially changes its uses or outputs.(n) System information label means a 
consumer-facing label that includes information about a covered products purpose, functioning, data sources, and risk level.(o) Trade secrets has the meaning defined in Section 3426.1 of the Civil Code.22757.22. (a) (1) There is hereby established the LEAD for Kids Standards Board in the Government Operations Agency. The Governor shall appoint an executive officer of the board, subject to Senate confirmation, who shall hold the office at the pleasure of the Governor. The executive officer shall be the administrative head of the board and shall exercise all duties and functions necessary to ensure that the responsibilities of the board are successfully discharged.(2) The board shall be composed of the following nine members:(A) A member of academia appointed by the Governor and subject to Senate confirmation.(B) A technologist appointed by the Governor and subject to Senate confirmation.(C) A member of civil society appointed by the Governor and subject to Senate confirmation.(D) An expert in technology ethics appointed by the Governor and subject to Senate confirmation.(E) An expert in education appointed by the Governor and subject to Senate confirmation.(F) A member of academia with expertise in artificial intelligence appointed by the Speaker of the Assembly.(G) A member of academia with expertise in social science appointed by the Speaker of the Assembly.(H) Two members appointed by the Senate Committee on Rules.(3) A member of the board shall meet all of the following criteria:(A) A member shall be free of direct and indirect external influence and shall not seek or take instructions from another.(B) A member shall not take an action or engage in an occupation, whether gainful or not, that is incompatible with the members duties.(C) A member shall not, either at the time of the members appointment or during the members term, have a financial interest in an entity that is subject to regulation by the board.(4) A member of the board shall serve at the pleasure 
of the appointing authority but shall serve for no longer than eight consecutive years.(b) (1) The board shall ensure that regulations adopted pursuant to this chapter are consistent with widely accepted standards for governance of artificial intelligence, taking into account technological standards, technological advances, scientific literature and advances, and societal changes as they pertain to risks posed to children by covered products.(2) The board shall consult with individuals from the public who possess expertise directly related to the boards functions, including technical, ethical, regulatory, and other relevant areas.(c) On or before January 1, 2027, the board shall adopt regulations governing all of the following:(1) Criteria for developers to determine if an artificial intelligence system is subject to this chapter.(2) Criteria for determining the level of estimated risk of a covered product based on an analysis that weighs the likelihood and severity of reasonably foreseeable adverse impacts against the anticipated benefits of the covered product and denominating the risk levels pursuant to all of the following:(A) Prohibited risk, which shall be applied to a covered product for which the costs of foreseeable adverse impacts likely outweigh the benefits and includes, but is not limited to, all of the following:(i) Anthropomorphic chatbots that offer companionship and are likely to cause the child to develop an ongoing emotional attachment or to manipulate the childs behavior in harmful ways.(ii) Artificial intelligence used in educational settings that collects or processes biometric data of children.(iii) Social scoring systems based on a childs behavior or personal characteristics.(iv) Artificial intelligence that detects the emotions of children.(v) Artificial intelligence used to develop facial recognition databases through untargeted scraping of childrens facial images from the internet or surveillance footage.(B) High risk, which shall be 
applied to a covered product for which the benefits may outweigh the costs of foreseeable adverse impacts and includes, but is not limited to, using artificial intelligence to do any of the following:(i) Perform a function related to pupil assessment or discipline.(ii) Target advertisements to children.(iii) For a specific purpose that would otherwise qualify as a prohibited risk, as set forth in regulations adopted by the board, provided that the use is strictly necessary to prevent threats to health or safety.(C) Moderate risk, which shall be applied to a covered product for which the benefits convincingly outweigh the costs of foreseeable adverse impacts.(D) Low risk, which shall be applied to a covered product for which there are few, if any, foreseeable adverse impacts.(3) Guidance for developers to classify covered products according to risk level, as described in paragraph (2).(4) Reasonable steps a developer of a prohibited risk covered product is required to take to ensure that children are not able to access the product.(5) Requirements for predeployment and postdeployment assessments, including, but not limited to, the purpose for which the covered product is intended, technical capabilities, limitations and functionality, specific adverse impacts, internal governance, and the timing for the development and submission to the board of those evaluations and assessments. 
The board shall also provide guidance to avoid duplication of efforts with respect to any other state or federal laws that require similar documentation.(6) Requirements for artificial intelligence information labels to ensure that, for each covered product, the public is able to access baseline information on the covered product, including the covered products purpose, a description of how it works, its risk level, potential adverse impacts, and any other information necessary to assess the impact of the system on children.(7) Standards for audits of covered products, including the timing of audits, qualifications and training of auditors, rules governing auditor independence and oversight, and audit reports that auditors are required to provide to the board. The board shall also establish rules for the protection of trade secrets in connection with the performance of audits.(8) The creation of an incident reporting mechanism that enables third parties to report incidents of adverse impacts resulting from the use of a covered product directly to a developer or the board.(9) The creation of a publicly accessible registry for covered products that contains high-level summaries of audit reports, incident reports, system information labels, and any additional information specified by the board.(10) (A) Registration fees for developers that do not exceed the reasonable regulatory costs incident to administering this chapter.(B) A registration fee described by this paragraph shall be deposited into the LEAD for Kids AI Fund established pursuant to Section 22757.27.22757.23. 
(a) On or before July 1, 2027, a developer shall do all of the following with respect to a covered product:
(1) Register the covered product using the registry developed by the board.
(2) Prepare and submit to the board a risk level assessment in order to determine the appropriate risk classification of the covered product.
(3) Develop and implement an artificial intelligence information label for the covered product.
(b) In addition to the duties required under subdivision (a), all of the following apply:
(1) With respect to a covered product that poses a prohibited risk, the developer shall take reasonable steps to ensure that children are not able to access the product.
(2) With respect to a high-risk covered product, the developer shall conduct predeployment and postdeployment assessments pursuant to the requirements established by the board.
(c) With respect to incident reports, a developer shall do all of the following:
(1) Within 30 days of learning of an incident, file a report with the board with any required information.
(2) Within 30 days of the substantiation of the incident by the board, file a description of the incident on the developer's internet website.
(d) With respect to licensing the covered product to deployers, a developer shall do both of the following:
(1) Ensure that the terms of the license require it to be used in a manner that would not change the covered product's risk level to a higher level of risk.
(2) Revoke the license if the developer knows, or should know, that the deployer is using the covered product in a manner that is inconsistent with the terms required under paragraph (1).
(e) A developer shall not train a covered product with the personal information of a child unless the child's parent or guardian has affirmatively provided written consent to the developer to use the child's personal information for that specific purpose.
(f) (1) On or after July 1, 2027, a developer shall submit a covered product it develops to an independent third-party audit on a schedule determined by the board according to the risk level posed by the covered product.
(2) A developer whose covered product is subject to an audit shall provide the auditor with all necessary documentation and information for the auditor to perform the audit.
(3) If an auditor discovers substantial noncompliance with this chapter, the auditor shall promptly notify the board.
22757.24. (a) A deployer of a prohibited risk covered product shall implement any applicable procedures adopted by the developer to ensure that a child is not able to access the product.
(b) A deployer of a covered product shall publicly display developer license usage requirements. A deployer's usage requirements shall not change the covered product's risk level to a higher level of risk.
(c) With respect to incident reports, a deployer shall do both of the following:
(1) Within 30 days of learning of the incident, file a report with the board with any required information.
(2) Within 30 days of the substantiation of the incident by the board, file a description of the incident on the deployer's internet website.
(d) A deployer shall not opt in to a data sharing agreement that allows the developer to train a covered product with the personal information of a child unless the child's parent or guardian has affirmatively provided written consent to the deployer to use the child's personal information for that specific purpose.
22757.25. A developer or deployer, or any contractor or subcontractor of a developer or deployer, shall not do any of the following:
(a) Prevent an employee from disclosing information to the Attorney General pertaining to a reasonable belief supporting the existence of a potential violation of this chapter.
(b) Retaliate against an employee for disclosing information under subdivision (a).
(c) Make false or materially misleading statements related to its compliance with obligations imposed under this chapter.
22757.26.
(a) The board may refer violations of this chapter to the Attorney General.(b) With respect to violations related to the risk level classification of a covered product, the board may allow the developer to take corrective action if the board determines that the circumstances indicate that the erroneous classification was neither unreasonable nor in bad faith. If the developer fails to do so within 30 days, the board may refer the matter to the Attorney General.(c) Upon receiving a referral from the board, the Attorney General may bring an action for all of the following: (1) A civil penalty of twenty-five thousand dollars ($25,000) for each violation.(2) Injunctive or declaratory relief.(3) Reasonable attorneys fees.(d) A child who suffers actual harm as a result of a violation of this chapter, or a parent or guardian acting on behalf of that child, may bring a civil action to recover all of the following:(1) Actual damages.(2) Punitive damages.(3) Reasonable attorneys fees and costs.(4) Injunctive or declaratory relief.(5) Any other relief the court deems proper.22757.27. (a) There is hereby created in the State Treasury the LEAD for Kids AI Fund into which any civil penalty recovered by the Attorney General pursuant to Section 22757.26 shall be deposited.(b) Moneys in the fund shall be available, only upon appropriation by the Legislature, for the purpose of administering this chapter. CHAPTER 25.1. Leading Ethical AI Development (LEAD) for Kids22757.20. This chapter shall be known as the Leading Ethical AI Development (LEAD) for Kids Act.22757.21. 
For purposes of this chapter:(a) Adverse impacts are significant negative impacts to a childs health, safety, privacy, educational opportunities or outcomes, or access to essential services or benefits.(b) Artificial intelligence means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.(c) Board means the LEAD for Kids Standards Board created pursuant to this chapter.(d) Child means a natural person under 18 years of age who resides in this state.(e) Covered product means an artificial intelligence system that is intended to, or highly likely to, be used by children, pursuant to regulations adopted by the board.(f) Deployer means a person, partnership, state or local governmental agency, corporation, or developer, or any contract or agent of those entities, that uses a covered product for a commercial or public purpose.(g) Developer means a person, partnership, state or local governmental agency, corporation, or deployer that designs, codes, substantially modifies, or otherwise produces a covered product.(h) Incident means a discreet occurrence of an adverse impact to a child caused by a covered product.(i) Personal information has the meaning defined in Section 1798.140 of the Civil Code.(j) Prohibited covered product means a product that poses a prohibited risk pursuant to regulations adopted by the board.(k) Risk means the composite measure of an events likelihood of occurring and the magnitude or degree of the consequences of the corresponding event.(l) Risk level assessment means a structured evaluation of an artificial intelligences known or reasonably foreseeable risks to children.(m) Substantially modifies means to create a new version, release, update, or other modification to a covered product that materially changes its uses or outputs.(n) System information label means a 
consumer-facing label that includes information about a covered products purpose, functioning, data sources, and risk level.(o) Trade secrets has the meaning defined in Section 3426.1 of the Civil Code.22757.22. (a) (1) There is hereby established the LEAD for Kids Standards Board in the Government Operations Agency. The Governor shall appoint an executive officer of the board, subject to Senate confirmation, who shall hold the office at the pleasure of the Governor. The executive officer shall be the administrative head of the board and shall exercise all duties and functions necessary to ensure that the responsibilities of the board are successfully discharged.(2) The board shall be composed of the following nine members:(A) A member of academia appointed by the Governor and subject to Senate confirmation.(B) A technologist appointed by the Governor and subject to Senate confirmation.(C) A member of civil society appointed by the Governor and subject to Senate confirmation.(D) An expert in technology ethics appointed by the Governor and subject to Senate confirmation.(E) An expert in education appointed by the Governor and subject to Senate confirmation.(F) A member of academia with expertise in artificial intelligence appointed by the Speaker of the Assembly.(G) A member of academia with expertise in social science appointed by the Speaker of the Assembly.(H) Two members appointed by the Senate Committee on Rules.(3) A member of the board shall meet all of the following criteria:(A) A member shall be free of direct and indirect external influence and shall not seek or take instructions from another.(B) A member shall not take an action or engage in an occupation, whether gainful or not, that is incompatible with the members duties.(C) A member shall not, either at the time of the members appointment or during the members term, have a financial interest in an entity that is subject to regulation by the board.(4) A member of the board shall serve at the pleasure 
of the appointing authority but shall serve for no longer than eight consecutive years.(b) (1) The board shall ensure that regulations adopted pursuant to this chapter are consistent with widely accepted standards for governance of artificial intelligence, taking into account technological standards, technological advances, scientific literature and advances, and societal changes as they pertain to risks posed to children by covered products.(2) The board shall consult with individuals from the public who possess expertise directly related to the boards functions, including technical, ethical, regulatory, and other relevant areas.(c) On or before January 1, 2027, the board shall adopt regulations governing all of the following:(1) Criteria for developers to determine if an artificial intelligence system is subject to this chapter.(2) Criteria for determining the level of estimated risk of a covered product based on an analysis that weighs the likelihood and severity of reasonably foreseeable adverse impacts against the anticipated benefits of the covered product and denominating the risk levels pursuant to all of the following:(A) Prohibited risk, which shall be applied to a covered product for which the costs of foreseeable adverse impacts likely outweigh the benefits and includes, but is not limited to, all of the following:(i) Anthropomorphic chatbots that offer companionship and are likely to cause the child to develop an ongoing emotional attachment or to manipulate the childs behavior in harmful ways.(ii) Artificial intelligence used in educational settings that collects or processes biometric data of children.(iii) Social scoring systems based on a childs behavior or personal characteristics.(iv) Artificial intelligence that detects the emotions of children.(v) Artificial intelligence used to develop facial recognition databases through untargeted scraping of childrens facial images from the internet or surveillance footage.(B) High risk, which shall be 
applied to a covered product for which the benefits may outweigh the costs of foreseeable adverse impacts and includes, but is not limited to, using artificial intelligence to do any of the following:(i) Perform a function related to pupil assessment or discipline.(ii) Target advertisements to children.(iii) For a specific purpose that would otherwise qualify as a prohibited risk, as set forth in regulations adopted by the board, provided that the use is strictly necessary to prevent threats to health or safety.(C) Moderate risk, which shall be applied to a covered product for which the benefits convincingly outweigh the costs of foreseeable adverse impacts.(D) Low risk, which shall be applied to a covered product for which there are few, if any, foreseeable adverse impacts.(3) Guidance for developers to classify covered products according to risk level, as described in paragraph (2).(4) Reasonable steps a developer of a prohibited risk covered product is required to take to ensure that children are not able to access the product.(5) Requirements for predeployment and postdeployment assessments, including, but not limited to, the purpose for which the covered product is intended, technical capabilities, limitations and functionality, specific adverse impacts, internal governance, and the timing for the development and submission to the board of those evaluations and assessments. 
The board shall also provide guidance to avoid duplication of efforts with respect to any other state or federal laws that require similar documentation.(6) Requirements for artificial intelligence information labels to ensure that, for each covered product, the public is able to access baseline information on the covered product, including the covered products purpose, a description of how it works, its risk level, potential adverse impacts, and any other information necessary to assess the impact of the system on children.(7) Standards for audits of covered products, including the timing of audits, qualifications and training of auditors, rules governing auditor independence and oversight, and audit reports that auditors are required to provide to the board. The board shall also establish rules for the protection of trade secrets in connection with the performance of audits.(8) The creation of an incident reporting mechanism that enables third parties to report incidents of adverse impacts resulting from the use of a covered product directly to a developer or the board.(9) The creation of a publicly accessible registry for covered products that contains high-level summaries of audit reports, incident reports, system information labels, and any additional information specified by the board.(10) (A) Registration fees for developers that do not exceed the reasonable regulatory costs incident to administering this chapter.(B) A registration fee described by this paragraph shall be deposited into the LEAD for Kids AI Fund established pursuant to Section 22757.27.22757.23. 
(a) On or before July 1, 2027, a developer shall do all of the following with respect to a covered product:(1) Register the covered product using the registry developed by the board.(2) Prepare and submit to the board a risk level assessment in order to determine the appropriate risk classification of the covered product.(3) Develop and implement an artificial intelligence information label for the covered product.(b) In addition to the duties required under subdivision (a), all of the following apply:(1) With respect to a covered product that poses a prohibited risk, the developer shall take reasonable steps to ensure that children are not able to access the product.(2) With respect to a high-risk covered product, the developer shall conduct predeployment and postdeployment assessments pursuant to the requirements established by the board.(c) With respect to incident reports, a developer shall do all of the following:(1) Within 30 days of learning of an incident, file a report with the board with any required information.(2) Within 30 days of the substantiation of the incident by the board, file a description of the incident on the developers internet website.(d) With respect to licensing the covered product to deployers, a developer shall do both of the following:(1) Ensure that the terms of the license require it to be used in a manner that would not change the covered products risk level to a higher level of risk.(2) Revoke the license if the developer knows, or should know, that the deployer is using the covered product in a manner that is inconsistent with the terms required under paragraph (1).(e) A developer shall not train a covered product with the personal information of a child unless the childs parent or guardian has affirmatively provided written consent to the developer to use the childs personal information for that specific purpose.(f) (1) On or after July 1, 2027, a developer shall submit a covered product it develops to an independent third party 
audit on a schedule determined by the board according to the risk level posed by the covered product.(2) A developer whose covered product is subject to an audit shall provide the auditor with all necessary documentation and information for the auditor to perform the audit.(3) If an auditor discovers substantial noncompliance with this chapter, the auditor shall promptly notify the board.22757.24. (a) A deployer of a prohibited risk covered product shall implement any applicable procedures adopted by the developer to ensure that a child is not able to access the product.(b) A deployer of a covered product shall publicly display developer license usage requirements. A deployers usage requirements shall not change the covered products risk level to a higher level of risk.(c) With respect to incident reports, a deployer shall do both of the following:(1) Within 30 days of learning of the incident, file a report with the board with any required information.(2) Within 30 days of the substantiation of the incident by the board, file a description of the incident on the deployers internet website.(d) A deployer shall not opt in to a data sharing agreement that allows the developer to train a covered product with the personal information of a child unless the childs parent or guardian has affirmatively provided written consent to the deployer to use the childs personal information for that specific purpose.22757.25. A developer or deployer, or any contractor or subcontractor of a developer or deployer, shall not do any of the following:(a) Prevent an employee from disclosing information to the Attorney General pertaining to a reasonable belief supporting the existence of a potential violation of this chapter.(b) Retaliate against an employee for disclosing information under subdivision (a).(c) Make false or materially misleading statements related to its compliance with obligations imposed under this chapter.22757.26. 
(a) The board may refer violations of this chapter to the Attorney General.(b) With respect to violations related to the risk level classification of a covered product, the board may allow the developer to take corrective action if the board determines that the circumstances indicate that the erroneous classification was neither unreasonable nor in bad faith. If the developer fails to do so within 30 days, the board may refer the matter to the Attorney General.(c) Upon receiving a referral from the board, the Attorney General may bring an action for all of the following: (1) A civil penalty of twenty-five thousand dollars ($25,000) for each violation.(2) Injunctive or declaratory relief.(3) Reasonable attorneys fees.(d) A child who suffers actual harm as a result of a violation of this chapter, or a parent or guardian acting on behalf of that child, may bring a civil action to recover all of the following:(1) Actual damages.(2) Punitive damages.(3) Reasonable attorneys fees and costs.(4) Injunctive or declaratory relief.(5) Any other relief the court deems proper.22757.27. (a) There is hereby created in the State Treasury the LEAD for Kids AI Fund into which any civil penalty recovered by the Attorney General pursuant to Section 22757.26 shall be deposited.(b) Moneys in the fund shall be available, only upon appropriation by the Legislature, for the purpose of administering this chapter. CHAPTER 25.1. Leading Ethical AI Development (LEAD) for Kids CHAPTER 25.1. Leading Ethical AI Development (LEAD) for Kids 22757.20. This chapter shall be known as the Leading Ethical AI Development (LEAD) for Kids Act. 22757.20. This chapter shall be known as the Leading Ethical AI Development (LEAD) for Kids Act. 22757.21. 
For purposes of this chapter:(a) Adverse impacts are significant negative impacts to a childs health, safety, privacy, educational opportunities or outcomes, or access to essential services or benefits.(b) Artificial intelligence means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.(c) Board means the LEAD for Kids Standards Board created pursuant to this chapter.(d) Child means a natural person under 18 years of age who resides in this state.(e) Covered product means an artificial intelligence system that is intended to, or highly likely to, be used by children, pursuant to regulations adopted by the board.(f) Deployer means a person, partnership, state or local governmental agency, corporation, or developer, or any contract or agent of those entities, that uses a covered product for a commercial or public purpose.(g) Developer means a person, partnership, state or local governmental agency, corporation, or deployer that designs, codes, substantially modifies, or otherwise produces a covered product.(h) Incident means a discreet occurrence of an adverse impact to a child caused by a covered product.(i) Personal information has the meaning defined in Section 1798.140 of the Civil Code.(j) Prohibited covered product means a product that poses a prohibited risk pursuant to regulations adopted by the board.(k) Risk means the composite measure of an events likelihood of occurring and the magnitude or degree of the consequences of the corresponding event.(l) Risk level assessment means a structured evaluation of an artificial intelligences known or reasonably foreseeable risks to children.(m) Substantially modifies means to create a new version, release, update, or other modification to a covered product that materially changes its uses or outputs.(n) System information label means a 
consumer-facing label that includes information about a covered products purpose, functioning, data sources, and risk level.(o) Trade secrets has the meaning defined in Section 3426.1 of the Civil Code. 22757.21. For purposes of this chapter: (a) Adverse impacts are significant negative impacts to a childs health, safety, privacy, educational opportunities or outcomes, or access to essential services or benefits. (b) Artificial intelligence means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments. (c) Board means the LEAD for Kids Standards Board created pursuant to this chapter. (d) Child means a natural person under 18 years of age who resides in this state. (e) Covered product means an artificial intelligence system that is intended to, or highly likely to, be used by children, pursuant to regulations adopted by the board. (f) Deployer means a person, partnership, state or local governmental agency, corporation, or developer, or any contract or agent of those entities, that uses a covered product for a commercial or public purpose. (g) Developer means a person, partnership, state or local governmental agency, corporation, or deployer that designs, codes, substantially modifies, or otherwise produces a covered product. (h) Incident means a discreet occurrence of an adverse impact to a child caused by a covered product. (i) Personal information has the meaning defined in Section 1798.140 of the Civil Code. (j) Prohibited covered product means a product that poses a prohibited risk pursuant to regulations adopted by the board. (k) Risk means the composite measure of an events likelihood of occurring and the magnitude or degree of the consequences of the corresponding event. 
(l) Risk level assessment means a structured evaluation of an artificial intelligences known or reasonably foreseeable risks to children. (m) Substantially modifies means to create a new version, release, update, or other modification to a covered product that materially changes its uses or outputs. (n) System information label means a consumer-facing label that includes information about a covered products purpose, functioning, data sources, and risk level. (o) Trade secrets has the meaning defined in Section 3426.1 of the Civil Code. 22757.22. (a) (1) There is hereby established the LEAD for Kids Standards Board in the Government Operations Agency. The Governor shall appoint an executive officer of the board, subject to Senate confirmation, who shall hold the office at the pleasure of the Governor. The executive officer shall be the administrative head of the board and shall exercise all duties and functions necessary to ensure that the responsibilities of the board are successfully discharged.(2) The board shall be composed of the following nine members:(A) A member of academia appointed by the Governor and subject to Senate confirmation.(B) A technologist appointed by the Governor and subject to Senate confirmation.(C) A member of civil society appointed by the Governor and subject to Senate confirmation.(D) An expert in technology ethics appointed by the Governor and subject to Senate confirmation.(E) An expert in education appointed by the Governor and subject to Senate confirmation.(F) A member of academia with expertise in artificial intelligence appointed by the Speaker of the Assembly.(G) A member of academia with expertise in social science appointed by the Speaker of the Assembly.(H) Two members appointed by the Senate Committee on Rules.(3) A member of the board shall meet all of the following criteria:(A) A member shall be free of direct and indirect external influence and shall not seek or take instructions from another.(B) A member shall not take an 
action or engage in an occupation, whether gainful or not, that is incompatible with the members duties.(C) A member shall not, either at the time of the members appointment or during the members term, have a financial interest in an entity that is subject to regulation by the board.(4) A member of the board shall serve at the pleasure of the appointing authority but shall serve for no longer than eight consecutive years.(b) (1) The board shall ensure that regulations adopted pursuant to this chapter are consistent with widely accepted standards for governance of artificial intelligence, taking into account technological standards, technological advances, scientific literature and advances, and societal changes as they pertain to risks posed to children by covered products.(2) The board shall consult with individuals from the public who possess expertise directly related to the boards functions, including technical, ethical, regulatory, and other relevant areas.(c) On or before January 1, 2027, the board shall adopt regulations governing all of the following:(1) Criteria for developers to determine if an artificial intelligence system is subject to this chapter.(2) Criteria for determining the level of estimated risk of a covered product based on an analysis that weighs the likelihood and severity of reasonably foreseeable adverse impacts against the anticipated benefits of the covered product and denominating the risk levels pursuant to all of the following:(A) Prohibited risk, which shall be applied to a covered product for which the costs of foreseeable adverse impacts likely outweigh the benefits and includes, but is not limited to, all of the following:(i) Anthropomorphic chatbots that offer companionship and are likely to cause the child to develop an ongoing emotional attachment or to manipulate the childs behavior in harmful ways.(ii) Artificial intelligence used in educational settings that collects or processes biometric data of children.(iii) Social 
scoring systems based on a childs behavior or personal characteristics.(iv) Artificial intelligence that detects the emotions of children.(v) Artificial intelligence used to develop facial recognition databases through untargeted scraping of childrens facial images from the internet or surveillance footage.(B) High risk, which shall be applied to a covered product for which the benefits may outweigh the costs of foreseeable adverse impacts and includes, but is not limited to, using artificial intelligence to do any of the following:(i) Perform a function related to pupil assessment or discipline.(ii) Target advertisements to children.(iii) For a specific purpose that would otherwise qualify as a prohibited risk, as set forth in regulations adopted by the board, provided that the use is strictly necessary to prevent threats to health or safety.(C) Moderate risk, which shall be applied to a covered product for which the benefits convincingly outweigh the costs of foreseeable adverse impacts.(D) Low risk, which shall be applied to a covered product for which there are few, if any, foreseeable adverse impacts.(3) Guidance for developers to classify covered products according to risk level, as described in paragraph (2).(4) Reasonable steps a developer of a prohibited risk covered product is required to take to ensure that children are not able to access the product.(5) Requirements for predeployment and postdeployment assessments, including, but not limited to, the purpose for which the covered product is intended, technical capabilities, limitations and functionality, specific adverse impacts, internal governance, and the timing for the development and submission to the board of those evaluations and assessments. 
The board shall also provide guidance to avoid duplication of efforts with respect to any other state or federal laws that require similar documentation.(6) Requirements for artificial intelligence information labels to ensure that, for each covered product, the public is able to access baseline information on the covered product, including the covered products purpose, a description of how it works, its risk level, potential adverse impacts, and any other information necessary to assess the impact of the system on children.(7) Standards for audits of covered products, including the timing of audits, qualifications and training of auditors, rules governing auditor independence and oversight, and audit reports that auditors are required to provide to the board. The board shall also establish rules for the protection of trade secrets in connection with the performance of audits.(8) The creation of an incident reporting mechanism that enables third parties to report incidents of adverse impacts resulting from the use of a covered product directly to a developer or the board.(9) The creation of a publicly accessible registry for covered products that contains high-level summaries of audit reports, incident reports, system information labels, and any additional information specified by the board.(10) (A) Registration fees for developers that do not exceed the reasonable regulatory costs incident to administering this chapter.(B) A registration fee described by this paragraph shall be deposited into the LEAD for Kids AI Fund established pursuant to Section 22757.27. 22757.22. (a) (1) There is hereby established the LEAD for Kids Standards Board in the Government Operations Agency. The Governor shall appoint an executive officer of the board, subject to Senate confirmation, who shall hold the office at the pleasure of the Governor. 
The executive officer shall be the administrative head of the board and shall exercise all duties and functions necessary to ensure that the responsibilities of the board are successfully discharged. (2) The board shall be composed of the following nine members: (A) A member of academia appointed by the Governor and subject to Senate confirmation. (B) A technologist appointed by the Governor and subject to Senate confirmation. (C) A member of civil society appointed by the Governor and subject to Senate confirmation. (D) An expert in technology ethics appointed by the Governor and subject to Senate confirmation. (E) An expert in education appointed by the Governor and subject to Senate confirmation. (F) A member of academia with expertise in artificial intelligence appointed by the Speaker of the Assembly. (G) A member of academia with expertise in social science appointed by the Speaker of the Assembly. (H) Two members appointed by the Senate Committee on Rules. (3) A member of the board shall meet all of the following criteria: (A) A member shall be free of direct and indirect external influence and shall not seek or take instructions from another. (B) A member shall not take an action or engage in an occupation, whether gainful or not, that is incompatible with the members duties. (C) A member shall not, either at the time of the members appointment or during the members term, have a financial interest in an entity that is subject to regulation by the board. (4) A member of the board shall serve at the pleasure of the appointing authority but shall serve for no longer than eight consecutive years. (b) (1) The board shall ensure that regulations adopted pursuant to this chapter are consistent with widely accepted standards for governance of artificial intelligence, taking into account technological standards, technological advances, scientific literature and advances, and societal changes as they pertain to risks posed to children by covered products. 
(2) The board shall consult with individuals from the public who possess expertise directly related to the board's functions, including technical, ethical, regulatory, and other relevant areas.
(c) On or before January 1, 2027, the board shall adopt regulations governing all of the following:
(1) Criteria for developers to determine if an artificial intelligence system is subject to this chapter.
(2) Criteria for determining the level of estimated risk of a covered product based on an analysis that weighs the likelihood and severity of reasonably foreseeable adverse impacts against the anticipated benefits of the covered product and denominating the risk levels pursuant to all of the following:
(A) Prohibited risk, which shall be applied to a covered product for which the costs of foreseeable adverse impacts likely outweigh the benefits and includes, but is not limited to, all of the following:
(i) Anthropomorphic chatbots that offer companionship and are likely to cause the child to develop an ongoing emotional attachment or to manipulate the child's behavior in harmful ways.
(ii) Artificial intelligence used in educational settings that collects or processes biometric data of children.
(iii) Social scoring systems based on a child's behavior or personal characteristics.
(iv) Artificial intelligence that detects the emotions of children.
(v) Artificial intelligence used to develop facial recognition databases through untargeted scraping of children's facial images from the internet or surveillance footage.
(B) High risk, which shall be applied to a covered product for which the benefits may outweigh the costs of foreseeable adverse impacts and includes, but is not limited to, using artificial intelligence to do any of the following:
(i) Perform a function related to pupil assessment or discipline.
(ii) Target advertisements to children.
(iii) Serve a specific purpose that would otherwise qualify as a prohibited risk, as set forth in regulations adopted by the board, provided that the use is strictly necessary to prevent threats to health or safety.
(C) Moderate risk, which shall be applied to a covered product for which the benefits convincingly outweigh the costs of foreseeable adverse impacts.
(D) Low risk, which shall be applied to a covered product for which there are few, if any, foreseeable adverse impacts.
(3) Guidance for developers to classify covered products according to risk level, as described in paragraph (2).
(4) Reasonable steps a developer of a prohibited risk covered product is required to take to ensure that children are not able to access the product.
(5) Requirements for predeployment and postdeployment assessments, including, but not limited to, the purpose for which the covered product is intended, technical capabilities, limitations and functionality, specific adverse impacts, internal governance, and the timing for the development and submission to the board of those evaluations and assessments. The board shall also provide guidance to avoid duplication of efforts with respect to any other state or federal laws that require similar documentation.
(6) Requirements for artificial intelligence information labels to ensure that, for each covered product, the public is able to access baseline information on the covered product, including the covered product's purpose, a description of how it works, its risk level, potential adverse impacts, and any other information necessary to assess the impact of the system on children.
(7) Standards for audits of covered products, including the timing of audits, qualifications and training of auditors, rules governing auditor independence and oversight, and audit reports that auditors are required to provide to the board. The board shall also establish rules for the protection of trade secrets in connection with the performance of audits.
(8) The creation of an incident reporting mechanism that enables third parties to report incidents of adverse impacts resulting from the use of a covered product directly to a developer or the board.
(9) The creation of a publicly accessible registry for covered products that contains high-level summaries of audit reports, incident reports, system information labels, and any additional information specified by the board.
(10) (A) Registration fees for developers that do not exceed the reasonable regulatory costs incident to administering this chapter.
(B) A registration fee described by this paragraph shall be deposited into the LEAD for Kids AI Fund established pursuant to Section 22757.27.
22757.23. (a) On or before July 1, 2027, a developer shall do all of the following with respect to a covered product:
(1) Register the covered product using the registry developed by the board.
(2) Prepare and submit to the board a risk level assessment in order to determine the appropriate risk classification of the covered product.
(3) Develop and implement an artificial intelligence information label for the covered product.
(b) In addition to the duties required under subdivision (a), all of the following apply:
(1) With respect to a covered product that poses a prohibited risk, the developer shall take reasonable steps to ensure that children are not able to access the product.
(2) With respect to a high-risk covered product, the developer shall conduct predeployment and postdeployment assessments pursuant to the requirements established by the board.
(c) With respect to incident reports, a developer shall do all of the following:
(1) Within 30 days of learning of an incident, file a report with the board with any required information.
(2) Within 30 days of the substantiation of the incident by the board, file a description of the incident on the developer's internet website.
(d) With respect to licensing the covered product to deployers, a developer shall do both of the following:
(1) Ensure that the terms of the license require the covered product to be used in a manner that would not change the covered product's risk level to a higher level of risk.
(2) Revoke the license if the developer knows, or should know, that the deployer is using the covered product in a manner that is inconsistent with the terms required under paragraph (1).
(e) A developer shall not train a covered product with the personal information of a child unless the child's parent or guardian has affirmatively provided written consent to the developer to use the child's personal information for that specific purpose.
(f) (1) On or after July 1, 2027, a developer shall submit a covered product it develops to an independent third-party audit on a schedule determined by the board according to the risk level posed by the covered product.
(2) A developer whose covered product is subject to an audit shall provide the auditor with all necessary documentation and information for the auditor to perform the audit.
(3) If an auditor discovers substantial noncompliance with this chapter, the auditor shall promptly notify the board.
22757.24. (a) A deployer of a prohibited risk covered product shall implement any applicable procedures adopted by the developer to ensure that a child is not able to access the product.
(b) A deployer of a covered product shall publicly display developer license usage requirements.
A deployer's usage requirements shall not change the covered product's risk level to a higher level of risk.
(c) With respect to incident reports, a deployer shall do both of the following:
(1) Within 30 days of learning of the incident, file a report with the board with any required information.
(2) Within 30 days of the substantiation of the incident by the board, file a description of the incident on the deployer's internet website.
(d) A deployer shall not opt in to a data sharing agreement that allows the developer to train a covered product with the personal information of a child unless the child's parent or guardian has affirmatively provided written consent to the deployer to use the child's personal information for that specific purpose.
22757.25.
A developer or deployer, or any contractor or subcontractor of a developer or deployer, shall not do any of the following:
(a) Prevent an employee from disclosing information to the Attorney General pertaining to a reasonable belief supporting the existence of a potential violation of this chapter.
(b) Retaliate against an employee for disclosing information under subdivision (a).
(c) Make false or materially misleading statements related to its compliance with obligations imposed under this chapter.
22757.26. (a) The board may refer violations of this chapter to the Attorney General.
(b) With respect to violations related to the risk level classification of a covered product, the board may allow the developer to take corrective action if the board determines that the circumstances indicate that the erroneous classification was neither unreasonable nor in bad faith.
If the developer fails to take corrective action within 30 days, the board may refer the matter to the Attorney General.
(c) Upon receiving a referral from the board, the Attorney General may bring an action for all of the following:
(1) A civil penalty of twenty-five thousand dollars ($25,000) for each violation.
(2) Injunctive or declaratory relief.
(3) Reasonable attorney's fees.
(d) A child who suffers actual harm as a result of a violation of this chapter, or a parent or guardian acting on behalf of that child, may bring a civil action to recover all of the following:
(1) Actual damages.
(2) Punitive damages.
(3) Reasonable attorney's fees and costs.
(4) Injunctive or declaratory relief.
(5) Any other relief the court deems proper.
22757.27.
(a) There is hereby created in the State Treasury the LEAD for Kids AI Fund into which any civil penalty recovered by the Attorney General pursuant to Section 22757.26 shall be deposited.
(b) Moneys in the fund shall be available, only upon appropriation by the Legislature, for the purpose of administering this chapter.