California 2025-2026 Regular Session

California Assembly Bill AB1064 Compare Versions

Amended IN Assembly April 10, 2025

CALIFORNIA LEGISLATURE 2025–2026 REGULAR SESSION

Assembly Bill No. 1064

Introduced by Assembly Member Bauer-Kahan
February 20, 2025

An act to add Chapter 25.1 (commencing with Section 22757.20) to Division 8 of the Business and Professions Code, relating to artificial intelligence.

LEGISLATIVE COUNSEL'S DIGEST

AB 1064, as amended, Bauer-Kahan. Leading Ethical AI Development (LEAD) for Kids Act.

The California AI Transparency Act requires a person that creates, codes, or otherwise produces a generative artificial intelligence system that has over 1,000,000 monthly visitors or users and is publicly accessible within the geographic boundaries of the state to make available an AI detection tool at no cost to the user that, among other things, allows a user to assess whether image, video, or audio content, or content that is any combination thereof, was created or altered by the covered provider's generative artificial intelligence system.

This bill, the Leading Ethical AI Development (LEAD) for Kids Act, would establish, and provide for the membership of, the LEAD for Kids Standards Board in the Government Operations Agency and require the Governor to appoint an executive officer of the board, subject to Senate confirmation, who would hold the office at the pleasure of the Governor. The act would require, on or before January 1, 2028, the board to adopt regulations governing, among other things, criteria for determining the level of estimated risk of a covered product based on an analysis that weighs the likelihood and severity of reasonably foreseeable adverse impacts against the anticipated benefits of the covered product and denominating the risk levels, as prescribed. The act would define covered product to mean an artificial intelligence system that is used by children, used to process a child's personal information, or applied directly to a child, but the bill would also require the board to adopt regulations governing criteria for developers to determine if an artificial intelligence system is a covered product.

The act would, among other things, require, on or before July 1, 2028, a developer to do certain things with respect to a covered product, including preparing and submitting to the board any risk level assessment required by regulation in order to determine the appropriate risk classification of the covered product. The act would authorize the board to refer violations of the act to the Attorney General and would authorize the Attorney General to recover a certain civil penalty, as prescribed. The act would authorize a child who suffers actual harm as a result of the use of a covered product, or a parent or guardian acting on behalf of that child, to bring a civil action to recover, among other relief, actual damages.

The act would create in the State Treasury the LEAD for Kids AI Fund into which any civil penalty recovered by the Attorney General pursuant to the act is deposited and would make the moneys in the fund available, only upon appropriation by the Legislature, for the purpose of administering the act.

Digest Key
Vote: MAJORITY   Appropriation: NO   Fiscal Committee: YES   Local Program: NO

Bill Text

The people of the State of California do enact as follows:

SECTION 1. Chapter 25.1 (commencing with Section 22757.20) is added to Division 8 of the Business and Professions Code, to read:

CHAPTER 25.1.
Leading Ethical AI Development (LEAD) for Kids

22757.20. This chapter shall be known as the Leading Ethical AI Development (LEAD) for Kids Act.

22757.21. For purposes of this chapter:
(a) "Adverse impact" means a significant negative impact to a child's health, safety, privacy, educational opportunities or outcomes, or access to essential services or benefits.
(b) "Artificial intelligence" means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.
(c) "Biometric information" has the meaning defined in Section 1798.140 of the Civil Code.
(d) "Board" means the LEAD for Kids Standards Board created pursuant to this chapter.
(e) "Child" means a natural person under 18 years of age who resides in this state.
(f) "Companion chatbot" means a generative artificial intelligence system with a natural language interface that provides adaptive, human-like responses to user inputs and is intended to, or foreseeably will, be used to meet a user's social needs, exhibits anthropomorphic features, and is able to sustain a relationship with the user across multiple interactions.
(g) "Consent" means affirmative, written agreement to a specific purpose that is disclosed in clear and conspicuous terms to the parent or guardian.
(h) "Covered product" means an artificial intelligence system that is any of the following:
(1) Used by children.
(2) Used to process a child's personal information.
(3) Applied directly to a child.
(i) "Deployer" means a person, partnership, state or local governmental agency, corporation, or developer that uses a covered product for a commercial or public purpose.
(j) "Developer" means a person, partnership, state or local governmental agency, corporation, or deployer that designs, codes, substantially modifies, or otherwise produces a covered product.
(k) "Generative artificial intelligence" means artificial intelligence that can generate derived synthetic content, including text, images, video, and audio, that emulates the structure and characteristics of the artificial intelligence's training data.
(l) "Incident" means a discrete occurrence of an adverse impact to a child caused by a covered product.
(m) "Personal information" has the meaning defined in Section 1798.140 of the Civil Code.
(n) "Risk" means the composite measure of an event's likelihood of occurring and the magnitude or degree of any adverse impact of the corresponding event.
(o) "Risk level assessment" means a structured evaluation of a covered product's known or reasonably foreseeable risks to children.
(p) "Social score" means an evaluation or classification of a child or group of children based on social behavior or personal characteristics for a purpose that is likely to result in an adverse impact to the child or children and is either of the following:
(1) Unrelated to the context in which the information relating to the social behavior or personal characteristics was gathered.
(2) Disproportionate or unjustified relative to the social behavior.
(q) "Substantially modify" means to create a new version, release, update, or other modification to a covered product that materially changes its uses or outputs.
(r) "System information label" means a consumer-facing label that includes information about a covered product's purpose, functioning, data sources, and risk level.
(s) "Trade secrets" has the meaning defined in Section 3426.1 of the Civil Code.

22757.22. (a) (1) There is hereby established the LEAD for Kids Standards Board in the Government Operations Agency. The Governor shall appoint an executive officer of the board, subject to Senate confirmation, who shall hold the office at the pleasure of the Governor. The executive officer shall be the administrative head of the board and shall exercise all duties and functions necessary to ensure that the responsibilities of the board are successfully discharged.
(2) The board shall be composed of the following nine members:
(A) A member of academia appointed by the Governor and subject to Senate confirmation.
(B) An artificial intelligence developer or representative of a company that develops artificial intelligence systems appointed by the Governor and subject to Senate confirmation.
(C) A member of civil society appointed by the Governor and subject to Senate confirmation.
(D) An expert in technology ethics appointed by the Governor and subject to Senate confirmation.
(E) An expert in education appointed by the Governor and subject to Senate confirmation.
(F) A member of academia with expertise in artificial intelligence appointed by the Speaker of the Assembly.
(G) A member of academia with expertise in social science appointed by the Speaker of the Assembly.
(H) Two members appointed by the Senate Committee on Rules.
(3) A member of the board shall meet all of the following criteria:
(A) (i) A member shall be free of direct and indirect external influence and shall not seek or take instructions from another.
(ii) A member's employment by a company that develops artificial intelligence does not by itself constitute a violation of this subparagraph.
(B) A member shall not take an action or engage in an occupation, whether gainful or not, that is incompatible with the member's duties.
(C) A member shall not, either at the time of the member's appointment or during the member's term, have a financial interest in an entity that is subject to regulation by the board.
(4) A member of the board shall serve at the pleasure of the appointing authority but shall serve for no longer than eight consecutive years.
(b) (1) The board shall ensure that regulations adopted pursuant to this chapter are consistent with widely accepted standards for governance of artificial intelligence, taking into account technological standards, technological advances, scientific literature and advances, and societal changes as they pertain to risks posed to children by covered products.
(2) The board shall consult with individuals from the public and state agencies who possess expertise directly related to the board's functions, including technical, ethical, regulatory, and other relevant issue areas.
(c) On or before January 1, 2028, the board shall adopt regulations governing all of the following:
(1) Criteria for developers to determine if an artificial intelligence system is a covered product subject to this chapter.
(2) Criteria for determining the level of estimated risk of a covered product based on an analysis that weighs the likelihood and severity of reasonably foreseeable adverse impacts against the anticipated benefits of the covered product and denominating the risk levels
pursuant to all of the following:
(A) Prohibited risk, which shall be applied to a covered product for which the costs of foreseeable adverse impacts likely outweigh the benefits and includes, but is not limited to, all of the following:
(i) A companion chatbot that can foreseeably do any of the following:
(I) Attempt to provide mental health therapy to the child.
(II) Cause the child to develop a harmful ongoing emotional attachment to the companion chatbot.
(III) Manipulate the child to engage in harmful behavior.
(ii) A covered product used to do any of the following:
(I) Collect or process a child's biometric information for any purpose other than confirming a child's identity, with the consent of the child's parent or guardian, in order to grant access to a service, unlock a device, or provide physical access to an educational institution.
(II) Generate a social score.
(III) (ia) Assess the emotional state of a child.
(ib) This subclause does not apply to an assessment of the emotional state of a child in a medical setting with the consent of the child's parent or guardian or that is needed to provide emergency care if the child's parent or guardian is unavailable.
(IV) Scrape an image that the developer or deployer knows, or reasonably should know, is a child's face from the internet or from surveillance footage without the consent of the child's parent or guardian.
(B) High risk, which shall be applied to a covered product for which the benefits may outweigh the costs of foreseeable adverse impacts and includes, but is not limited to, a covered product that does any of the following:
(i) Performs a function related to pupil assessment or discipline, including, but not limited to, a covered product that determines access or admission, assigns children to educational institutions or programs, evaluates learning outcomes of children, assesses the appropriate level of education for a child, materially influences the level of education a child will receive or be able to access, or monitors and detects prohibited behavior of students during tests.
(ii) Target advertisements to children.
(iii) For a specific purpose that would otherwise qualify as a prohibited risk, as set forth in regulations adopted by the board, if the use is strictly necessary to ensure a child's mental or physical health or safety.
(C) Moderate risk, which shall be applied to a covered product for which the benefits reasonably outweigh the costs of foreseeable adverse impacts.
(D) Low risk, which shall be applied to a covered product for which there are few, if any, foreseeable adverse impacts.
(3) Guidance for developers to classify covered products according to risk level, as described in paragraph (2).
(4) Reasonable steps a developer of a prohibited risk covered product is required to take to ensure that children are not able to access the product.
(5) Requirements for predeployment and postdeployment assessments, including, but not limited to, an assessment of the purpose for which the covered product is intended, technical capabilities, limitations and functionality, specific adverse impacts, internal governance, and the timing for the development and submission to the board of those evaluations and assessments. The board shall also provide guidance consistent with Section 22757.28 to avoid duplication of efforts with respect to any other state or federal laws that require similar documentation.
(6) Requirements for artificial intelligence information labels to ensure that, for each covered product, the public is able to access baseline information on the covered product, including the covered product's purpose, a description of how it works, its risk level, potential adverse impacts, and any other information necessary to assess the impact of the system on children.
(7) Standards for audits of covered products, including the timing of audits, qualifications and training of auditors, rules governing auditor independence and oversight, and audit reports that auditors are required to provide to the board. The board shall also establish rules for the protection of trade secrets in connection with the performance of audits.
(8) The creation of an incident reporting mechanism that enables third parties to report potential incidents of adverse impacts resulting from the use of a covered product directly to a developer or the board.
(9) The creation of a publicly accessible registry for covered products that contains high-level summaries of audit reports, incident reports, system information labels, and any additional information specified by the board.
(10) (A) Registration fees for developers that do not exceed the reasonable regulatory costs incident to administering this chapter.
(B) A registration fee described by this paragraph shall be deposited into the LEAD for Kids AI Fund established pursuant to Section 22757.27.

22757.23.
(a) On or before July 1, 2028, a developer shall do all of the following with respect to a covered product:
(1) Register the covered product using the registry developed by the board.
(2) Prepare and submit to the board any risk level assessment required by regulation in order to determine the appropriate risk classification of the covered product.
(3) Develop and implement an artificial intelligence system information label for the covered product, as required by regulation.
(b) In addition to the duties required under subdivision (a), all of the following apply:
(1) With respect to a covered product that poses a prohibited risk, the developer shall take reasonable steps to prevent children from accessing the product.
(2) With respect to a high-risk covered product, the developer shall conduct predeployment and postdeployment assessments pursuant to the requirements established by the board.
(c) With respect to incident reports, a developer shall do all of the following:
(1) Within 30 days of learning of an incident, file a report with the board with any required information.
(2) Within 30 days of the substantiation of the incident by the board, file a description of the incident on the developer's internet website.
(d) With respect to licensing the covered product to deployers, a developer shall do both of the following:
(1) Ensure that the terms of the license require it to be used in a manner that would not elevate the covered product's risk level to a higher level of risk.
(2) Revoke the license if the developer knows, or should know, that the deployer is using the covered product in a manner that is inconsistent with the terms required under paragraph (1).
(e) A developer shall not knowingly or recklessly use the personal information of a child to train a covered product unless the child, if the child is at least 13 years of age and less than 16 years of age, or the child's parent or guardian, if the child is less than 13 years of age, has affirmatively provided written consent to the developer to use the child's personal information for that specific purpose.
(f) (1) On or after July 1, 2028, a developer shall submit a covered product it develops to an independent third party audit on a schedule determined by the board according to the risk level posed by the covered product.
(2) A developer whose covered product is subject to an audit shall provide the auditor with all necessary documentation and information for the auditor to perform the audit.
(3) If an auditor discovers substantial noncompliance with this chapter, the auditor shall promptly notify the board.

22757.24. (a) A deployer of a prohibited risk covered product shall implement any applicable procedures adopted by the developer to prevent a child from accessing the product.
(b) A deployer of a covered product shall publicly display developer license usage requirements.
A deployer's usage requirements shall not change the covered product's risk level to a higher level of risk.
(c) With respect to incident reports, a deployer shall do both of the following:
(1) Within 30 days of learning of the incident, file a report with the board with any required information.
(2) Within 30 days of the substantiation of the incident by the board, file a description of the incident on the deployer's internet website.
(d) A deployer shall not enter a data sharing agreement that allows the developer to train a covered product with the personal information of a child unless the child, if the child is at least 13 years of age and less than 16 years of age, or the child's parent or guardian, if the child is less than 13 years of age, has affirmatively provided written consent to the deployer to use the child's personal information for that specific purpose.

22757.25. A developer or deployer, or any contractor or subcontractor of a developer or deployer, shall not do any of the following:
(a) Prevent an employee from disclosing information to the Attorney General pertaining to a reasonable belief supporting the existence of a potential violation of this chapter.
(b) Retaliate against an employee for disclosing information under subdivision (a).
(c) Make false or materially misleading statements related to its compliance with obligations imposed under this chapter.

22757.26. (a) The board may refer violations of this chapter to the Attorney General.
(b) With respect to violations related to the risk level classification of a covered product, the board may allow the developer to take corrective action if the board determines that the circumstances indicate that the erroneous classification was neither unreasonable nor in bad faith. If the developer fails to do so within 30 days, the board may refer the matter to the Attorney General.
(c) Upon receiving a referral from the board, the Attorney General may bring an action for all of the following:
(1) A civil penalty of twenty-five thousand dollars ($25,000) for each violation.
(2) Injunctive or declaratory relief.
(3) Reasonable attorney's fees.
(d) A child who suffers actual harm as a result of the use of a covered product, or a parent or guardian acting on behalf of that child, may bring a civil action to recover all of the following:
(1) Actual damages.
(2) Punitive damages.
(3) Reasonable attorney's fees and costs.
(4) Injunctive or declaratory relief.
(5) Any other relief the court deems proper.

22757.27. (a) There is hereby created in the State Treasury the LEAD for Kids AI Fund into which any civil penalty recovered by the Attorney General pursuant to Section 22757.26 shall be deposited.
(b) Moneys in the fund shall be available, only upon appropriation by the Legislature, for the purpose of administering this chapter.

22757.28. (a) A developer or deployer who is required to comply with another law of this state that requires risk assessment of a covered product that is equally or more stringent than this chapter need not comply with any duplicative requirements under this chapter.
(b) Before January 1, 2028, the board shall publish a description of the laws described by subdivision (a) and provide guidance to developers and deployers regarding compliance with subdivision (a).
(c) A developer or deployer that relies on the guidance provided under subdivision (b) is presumed to be compliant with subdivision (a).
CALIFORNIA LEGISLATURE 2025–2026 REGULAR SESSION

Assembly Bill No. 1064

Introduced by Assembly Member Bauer-Kahan
February 20, 2025

An act to add Chapter 25.1 (commencing with Section 22757.20) to Division 8 of the Business and Professions Code, relating to artificial intelligence.

LEGISLATIVE COUNSEL'S DIGEST

AB 1064, as introduced, Bauer-Kahan. Leading Ethical AI Development (LEAD) for Kids Act.

The California AI Transparency Act requires a person that creates, codes, or otherwise produces a generative artificial intelligence system that has over 1,000,000 monthly visitors or users and is publicly accessible within the geographic boundaries of the state to make available an AI detection tool at no cost to the user that, among other things, allows a user to assess whether image, video, or audio content, or content that is any combination thereof, was created or altered by the covered provider's generative artificial intelligence system.

This bill, the Leading Ethical AI Development (LEAD) for Kids Act, would establish, and provide for the membership of, the LEAD for Kids Standards Board in the Government Operations Agency and require the Governor to appoint an executive officer of the board, subject to Senate confirmation, who would hold the office at the pleasure of the Governor. The act would require, on or before January 1, 2027, the board to adopt regulations governing, among other things, criteria for determining the level of estimated risk of a covered product based on an analysis that weighs the likelihood and severity of reasonably foreseeable adverse impacts against the anticipated benefits of the covered product and denominating the risk levels, as prescribed. The act would define covered product to mean an artificial intelligence system that is intended to, or highly likely to, be used by children, pursuant to regulations adopted by the board.

The act would, among other things, require, on or before July 1, 2027, a developer to do certain things with respect to a covered product, including preparing and submitting to the board a risk level assessment in order to determine the appropriate risk classification of the covered product. The act would authorize the board to refer violations of the act to the Attorney General and would authorize the Attorney General to recover a certain civil penalty, as prescribed. The act would authorize a child who suffers actual harm as a result of a violation of the act, or a parent or guardian acting on behalf of that child, to bring a civil action to recover, among other relief, actual damages.

The act would create in the State Treasury the LEAD for Kids AI Fund into which any civil penalty recovered by the Attorney General pursuant to the act is deposited and would make the moneys in the fund available, only upon appropriation by the Legislature, for the purpose of administering the act.

Digest Key
Vote: MAJORITY   Appropriation: NO   Fiscal Committee: YES   Local Program: NO

Bill Text

The people of the State of California do enact as follows:

SECTION 1. Chapter 25.1 (commencing with Section 22757.20) is added to Division 8 of the Business and Professions Code, to read:

CHAPTER 25.1. Leading Ethical AI Development (LEAD) for Kids

22757.20. This chapter shall be known as the Leading Ethical AI Development (LEAD) for Kids Act.

22757.21. For purposes of this chapter:
(a) "Adverse impacts" are significant negative impacts to a child's health, safety, privacy, educational opportunities or outcomes, or access to essential services or benefits.
(b) "Artificial intelligence" means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.
(c) "Board" means the LEAD for Kids Standards Board created pursuant to this chapter.
(d) "Child" means a natural person under 18 years of age who resides in this state.
(e) "Covered product" means an artificial intelligence system that is intended to, or highly likely to, be used by children, pursuant to regulations adopted by the board.
(f) "Deployer" means a person, partnership, state or local governmental agency, corporation, or developer, or any contractor or agent of those entities, that uses a covered product for a commercial or public purpose.
(g) "Developer" means a person, partnership, state or local governmental agency, corporation, or deployer that designs, codes, substantially modifies, or otherwise produces a covered product.
(h) "Incident" means a discrete occurrence of an adverse impact to a child caused by a covered product.
(i) "Personal information" has the meaning defined in Section 1798.140 of the Civil Code.
(j) "Prohibited covered product" means a product that poses a prohibited risk pursuant to regulations adopted by the board.
(k) "Risk" means the composite measure of an event's likelihood of occurring and the magnitude or degree of the consequences of the corresponding event.
(l) "Risk level assessment" means a structured evaluation of an artificial intelligence's known or reasonably foreseeable risks to children.
(m) "Substantially modifies" means to create a new version, release, update, or other modification to a covered product that materially changes its uses or outputs.
(n) "System information label" means a consumer-facing label that includes information about a covered product's purpose, functioning, data sources, and risk level.
(o) "Trade secrets" has the meaning defined in Section 3426.1 of the Civil Code.

22757.22. (a) (1) There is hereby established the LEAD for Kids Standards Board in the Government Operations Agency. The Governor shall appoint an executive officer of the board, subject to Senate confirmation, who shall hold the office at the pleasure of the Governor. The executive officer shall be the administrative head of the board and shall exercise all duties and functions necessary to ensure that the responsibilities of the board are successfully discharged.
(2) The board shall be composed of the following nine members:
(A) A member of academia appointed by the Governor and subject to Senate confirmation.
(B) A technologist appointed by the Governor and subject to Senate confirmation.
(C) A member of civil society appointed by the Governor and subject to Senate confirmation.
(D) An expert in technology ethics appointed by the Governor and subject to Senate confirmation.
(E) An expert in education appointed by the Governor and subject to Senate confirmation.
(F) A member of academia with expertise in artificial intelligence appointed by the Speaker of the Assembly.
(G) A member of academia with expertise in social science appointed by the Speaker of the Assembly.
(H) Two members appointed by the Senate Committee on Rules.
(3) A member of the board shall meet all of the following criteria:
(A) A member shall be free of direct and indirect external influence and shall not seek or take instructions from another.
(B) A member shall not take an action or engage in an occupation, whether gainful or not, that is incompatible with the member's duties.
(C) A member shall not, either at the time of the member's appointment or during the member's term, have a financial interest in an entity that is subject to regulation by the board.
(4) A member of the board shall serve at the pleasure of the appointing authority but shall serve for no longer than eight consecutive years.
(b) (1) The board shall ensure that regulations adopted pursuant to this chapter are consistent with widely accepted standards for governance of artificial intelligence, taking into account technological standards, technological advances, scientific literature and advances, and societal changes as they pertain to risks posed to children by covered products.
(2) The board shall consult with individuals from the public who possess expertise directly related to the board's functions, including technical, ethical, regulatory, and other relevant areas.
(c) On or before January 1, 2027, the board shall adopt regulations governing all of the following:
(1) Criteria for developers to determine if an artificial intelligence system is subject to this chapter.
(2) Criteria for determining the level of estimated risk of a covered product based on an analysis that weighs the likelihood and severity of reasonably foreseeable adverse impacts against the anticipated benefits of the covered product and denominating the risk levels pursuant to all of the following:
(A) Prohibited risk, which shall be applied to a covered product for which the costs of foreseeable adverse impacts likely outweigh the benefits and includes, but is not limited to, all of the following:
(i) Anthropomorphic chatbots that offer companionship and are likely to cause the child to develop an ongoing emotional attachment or to manipulate the child's behavior in harmful ways.
(ii) Artificial intelligence used in educational settings that collects or processes biometric data of children.
(iii) Social scoring systems based on a child's behavior or personal characteristics.
(iv) Artificial intelligence that detects the emotions of children.
(v) Artificial intelligence used to develop facial recognition databases through untargeted scraping of children's facial images from the internet or surveillance footage.
(B) High risk, which shall be applied to a covered product for which the benefits may outweigh the costs of foreseeable adverse impacts and includes, but is not limited to, using artificial intelligence to do any of the following:
(i) Perform a function related to pupil assessment or discipline.
(ii) Target advertisements to children.
(iii) For a specific purpose that would otherwise qualify as a prohibited risk, as set forth in regulations adopted by the board, provided that the use is strictly necessary to prevent threats to health or safety.
(C) Moderate risk, which shall be applied to a covered product for which the benefits convincingly outweigh the costs of foreseeable adverse impacts.
(D) Low risk, which shall be applied to a covered product for which there are few, if any, foreseeable adverse impacts.
(3) Guidance for developers to classify covered products according to risk level, as described in paragraph (2).
(4) Reasonable steps a developer of a prohibited risk covered product is required to take to ensure that children are not able to access the product.
(5) Requirements for predeployment and postdeployment assessments, including, but not limited to, the purpose for which the covered product is intended, technical capabilities, limitations and functionality, specific adverse impacts, internal governance, and the timing for the development and submission to the board of those evaluations and assessments. The board shall also provide guidance to avoid duplication of efforts with respect to any other state or federal laws that require similar documentation.
(6) Requirements for artificial intelligence information labels to ensure that, for each covered product, the public is able to access baseline information on the covered product, including the covered product's purpose, a description of how it works, its risk level, potential adverse impacts, and any other information necessary to assess the impact of the system on children.
(7) Standards for audits of covered products, including the timing of audits, qualifications and training of auditors, rules governing auditor independence and oversight, and audit reports that auditors are required to provide to the board. The board shall also establish rules for the protection of trade secrets in connection with the performance of audits.
(8) The creation of an incident reporting mechanism that enables third parties to report incidents of adverse impacts resulting from the use of a covered product directly to a developer or the board.
(9) The creation of a publicly accessible registry for covered products that contains high-level summaries of audit reports, incident reports, system information labels, and any additional information specified by the board.
(10) (A) Registration fees for developers that do not exceed the reasonable regulatory costs incident to administering this chapter.
(B) A registration fee described by this paragraph shall be deposited into the LEAD for Kids AI Fund established pursuant to Section 22757.27.

22757.23. (a) On or before July 1, 2027, a developer shall do all of the following with respect to a covered product:
(1) Register the covered product using the registry developed by the board.
(2) Prepare and submit to the board a risk level assessment in order to determine the appropriate risk classification of the covered product.
(3) Develop and implement an artificial intelligence information label for the covered product.
(b) In addition to the duties required under subdivision (a), all of the following apply:
(1) With respect to a covered product that poses a prohibited risk, the developer shall take reasonable steps to ensure that children are not able to access the product.
(2) With respect to a high-risk covered product, the developer shall conduct predeployment and postdeployment assessments pursuant to the requirements established by the board.
(c) With respect to incident reports, a developer shall do all of the following:
(1) Within 30 days of learning of an incident, file a report with the board with any required information.
(2) Within 30 days of the substantiation of the incident by the board, file a description of the incident on the developer's internet website.
(d) With respect to licensing the covered product to deployers, a developer shall do both of the following:
(1) Ensure that the terms of the license require it to be used in a manner that would not change the covered product's risk level to a higher level of risk.
(2) Revoke the license if the developer knows, or should know, that the deployer is using the covered product in a manner that is inconsistent with the terms required under paragraph (1).
(e) A developer shall not train a covered product with the personal information of a child unless the child's parent or guardian has affirmatively provided written consent to the developer to use the child's personal information for that specific purpose.
(f) (1) On or after July 1, 2027, a developer shall submit a covered product it develops to an independent third party audit on a schedule determined by the board according to the risk level posed by the covered product.
(2) A developer whose covered product is subject to an audit shall provide the auditor with all necessary documentation and information for the auditor to perform the audit.
(3) If an auditor discovers substantial noncompliance with this chapter, the auditor shall promptly notify the board.

22757.24. (a) A deployer of a prohibited risk covered product shall implement any applicable procedures adopted by the developer to ensure that a child is not able to access the product.
(b) A deployer of a covered product shall publicly display developer license usage requirements. A deployer's usage requirements shall not change the covered product's risk level to a higher level of risk.
(c) With respect to incident reports, a deployer shall do both of the following:
(1) Within 30 days of learning of the incident, file a report with the board with any required information.
(2) Within 30 days of the substantiation of the incident by the board, file a description of the incident on the deployer's internet website.
(d) A deployer shall not opt in to a data sharing agreement that allows the developer to train a covered product with the personal information of a child unless the child's parent or guardian has affirmatively provided written consent to the deployer to use the child's personal information for that specific purpose.

22757.25. A developer or deployer, or any contractor or subcontractor of a developer or deployer, shall not do any of the following:
(a) Prevent an employee from disclosing information to the Attorney General pertaining to a reasonable belief supporting the existence of a potential violation of this chapter.
(b) Retaliate against an employee for disclosing information under subdivision (a).
(c) Make false or materially misleading statements related to its compliance with obligations imposed under this chapter.

22757.26. (a) The board may refer violations of this chapter to the Attorney General.
(b) With respect to violations related to the risk level classification of a covered product, the board may allow the developer to take corrective action if the board determines that the circumstances indicate that the erroneous classification was neither unreasonable nor in bad faith. If the developer fails to do so within 30 days, the board may refer the matter to the Attorney General.
(c) Upon receiving a referral from the board, the Attorney General may bring an action for all of the following:
(1) A civil penalty of twenty-five thousand dollars ($25,000) for each violation.
(2) Injunctive or declaratory relief.
(3) Reasonable attorney's fees.
(d) A child who suffers actual harm as a result of a violation of this chapter, or a parent or guardian acting on behalf of that child, may bring a civil action to recover all of the following:
(1) Actual damages.
(2) Punitive damages.
(3) Reasonable attorney's fees and costs.
(4) Injunctive or declaratory relief.
(5) Any other relief the court deems proper.

22757.27. (a) There is hereby created in the State Treasury the LEAD for Kids AI Fund into which any civil penalty recovered by the Attorney General pursuant to Section 22757.26 shall be deposited.
(b) Moneys in the fund shall be available, only upon appropriation by the Legislature, for the purpose of administering this chapter.
22
3- Amended IN Assembly April 10, 2025 CALIFORNIA LEGISLATURE 20252026 REGULAR SESSION Assembly Bill No. 1064Introduced by Assembly Member Bauer-KahanFebruary 20, 2025 An act to add Chapter 25.1 (commencing with Section 22757.20) to Division 8 of the Business and Professions Code, relating to artificial intelligence.LEGISLATIVE COUNSEL'S DIGESTAB 1064, as amended, Bauer-Kahan. Leading Ethical AI Development (LEAD) for Kids Act.The California AI Transparency Act requires a person that creates, codes, or otherwise produces a generative artificial intelligence system that has over 1,000,000 monthly visitors or users and is publicly accessible within the geographic boundaries of the state to make available an AI detection tool at no cost to the user that, among other things, allows a user to assess whether image, video, or audio content, or content that is any combination thereof, was created or altered by the covered providers generative artificial intelligence system.This bill, the Leading Ethical AI Development (LEAD) for Kids Act, would establish, and provide for the membership of, the LEAD for Kids Standards Board in the Government Operations Agency and require the Governor to appoint an executive officer of the board, subject to Senate confirmation, who would hold the office at the pleasure of the Governor. The act would require, on or before January 1, 2027, 2028, the board to adopt regulations governing, among other things, criteria for determining the level of estimated risk of a covered product based on an analysis that weighs the likelihood and severity of reasonably foreseeable adverse impacts against the anticipated benefits of the covered product and denominating the risk levels, as prescribed. The act would define covered product to mean an artificial intelligence system that is intended to, or highly likely to, be used by children, pursuant to regulations adopted by the board. children, used to process a childs personal information, or applied directly to a child, but the bill would also require the board to adopt regulations governing criteria for developers to determine if an artificial intelligence system is a covered product.The act would, among other things, require, on or before July 1, 2027, 2028, a developer to do certain things with respect to a covered product, including preparing and submitting to the board a any risk level assessment required by regulation in order to determine the appropriate risk classification of the covered product. The act would authorize the board to refer violations of the act to the Attorney General and would authorize the Attorney General to recover a certain civil penalty, as prescribed. The act would authorize a child who suffers actual harm as a result of a violation of the act, the use of a covered product, or a parent or guardian acting on behalf of that child, to bring a civil action to recover, among other relief, actual damages.The act would create in the State Treasury the LEAD for Kids AI Fund into which any civil penalty recovered by the Attorney General pursuant to the act is deposited and would make the moneys in the fund available, only upon appropriation by the Legislature, for the purpose of administering the act.Digest Key Vote: MAJORITY Appropriation: NO Fiscal Committee: YES Local Program: NO
3+ CALIFORNIA LEGISLATURE 20252026 REGULAR SESSION Assembly Bill No. 1064Introduced by Assembly Member Bauer-KahanFebruary 20, 2025 An act to add Chapter 25.1 (commencing with Section 22757.20) to Division 8 of the Business and Professions Code, relating to artificial intelligence.LEGISLATIVE COUNSEL'S DIGESTAB 1064, as introduced, Bauer-Kahan. Leading Ethical AI Development (LEAD) for Kids Act.The California AI Transparency Act requires a person that creates, codes, or otherwise produces a generative artificial intelligence system that has over 1,000,000 monthly visitors or users and is publicly accessible within the geographic boundaries of the state to make available an AI detection tool at no cost to the user that, among other things, allows a user to assess whether image, video, or audio content, or content that is any combination thereof, was created or altered by the covered providers generative artificial intelligence system.This bill, the Leading Ethical AI Development (LEAD) for Kids Act, would establish, and provide for the membership of, the LEAD for Kids Standards Board in the Government Operations Agency and require the Governor to appoint an executive officer of the board, subject to Senate confirmation, who would hold the office at the pleasure of the Governor. The act would require, on or before January 1, 2027, the board to adopt regulations governing, among other things, criteria for determining the level of estimated risk of a covered product based on an analysis that weighs the likelihood and severity of reasonably foreseeable adverse impacts against the anticipated benefits of the covered product and denominating the risk levels, as prescribed. The act would define covered product to mean an artificial intelligence system that is intended to, or highly likely to, be used by children, pursuant to regulations adopted by the board.The act would, among other things, require, on or before July 1, 2027, a developer to do certain things with respect to a covered product, including preparing and submitting to the board a risk level assessment in order to determine the appropriate risk classification of the covered product. The act would authorize the board to refer violations of the act to the Attorney General and would authorize the Attorney General to recover a certain civil penalty, as prescribed. The act would authorize a child who suffers actual harm as a result of a violation of the act, or a parent or guardian acting on behalf of that child, to bring a civil action to recover, among other relief, actual damages.The act would create in the State Treasury the LEAD for Kids AI Fund into which any civil penalty recovered by the Attorney General pursuant to the act is deposited and would make the moneys in the fund available, only upon appropriation by the Legislature, for the purpose of administering the act.Digest Key Vote: MAJORITY Appropriation: NO Fiscal Committee: YES Local Program: NO
44
5- Amended IN Assembly April 10, 2025
65
7-Amended IN Assembly April 10, 2025
6+
7+
88
99 CALIFORNIA LEGISLATURE 20252026 REGULAR SESSION
1010
1111 Assembly Bill
1212
1313 No. 1064
1414
1515 Introduced by Assembly Member Bauer-KahanFebruary 20, 2025
1616
1717 Introduced by Assembly Member Bauer-Kahan
1818 February 20, 2025
1919
2020 An act to add Chapter 25.1 (commencing with Section 22757.20) to Division 8 of the Business and Professions Code, relating to artificial intelligence.
2121
2222 LEGISLATIVE COUNSEL'S DIGEST
2323
2424 ## LEGISLATIVE COUNSEL'S DIGEST
2525
26-AB 1064, as amended, Bauer-Kahan. Leading Ethical AI Development (LEAD) for Kids Act.
26+AB 1064, as introduced, Bauer-Kahan. Leading Ethical AI Development (LEAD) for Kids Act.
2727
28-The California AI Transparency Act requires a person that creates, codes, or otherwise produces a generative artificial intelligence system that has over 1,000,000 monthly visitors or users and is publicly accessible within the geographic boundaries of the state to make available an AI detection tool at no cost to the user that, among other things, allows a user to assess whether image, video, or audio content, or content that is any combination thereof, was created or altered by the covered providers generative artificial intelligence system.This bill, the Leading Ethical AI Development (LEAD) for Kids Act, would establish, and provide for the membership of, the LEAD for Kids Standards Board in the Government Operations Agency and require the Governor to appoint an executive officer of the board, subject to Senate confirmation, who would hold the office at the pleasure of the Governor. The act would require, on or before January 1, 2027, 2028, the board to adopt regulations governing, among other things, criteria for determining the level of estimated risk of a covered product based on an analysis that weighs the likelihood and severity of reasonably foreseeable adverse impacts against the anticipated benefits of the covered product and denominating the risk levels, as prescribed. The act would define covered product to mean an artificial intelligence system that is intended to, or highly likely to, be used by children, pursuant to regulations adopted by the board. children, used to process a childs personal information, or applied directly to a child, but the bill would also require the board to adopt regulations governing criteria for developers to determine if an artificial intelligence system is a covered product.The act would, among other things, require, on or before July 1, 2027, 2028, a developer to do certain things with respect to a covered product, including preparing and submitting to the board a any risk level assessment required by regulation in order to determine the appropriate risk classification of the covered product. The act would authorize the board to refer violations of the act to the Attorney General and would authorize the Attorney General to recover a certain civil penalty, as prescribed. The act would authorize a child who suffers actual harm as a result of a violation of the act, the use of a covered product, or a parent or guardian acting on behalf of that child, to bring a civil action to recover, among other relief, actual damages.The act would create in the State Treasury the LEAD for Kids AI Fund into which any civil penalty recovered by the Attorney General pursuant to the act is deposited and would make the moneys in the fund available, only upon appropriation by the Legislature, for the purpose of administering the act.
28+The California AI Transparency Act requires a person that creates, codes, or otherwise produces a generative artificial intelligence system that has over 1,000,000 monthly visitors or users and is publicly accessible within the geographic boundaries of the state to make available an AI detection tool at no cost to the user that, among other things, allows a user to assess whether image, video, or audio content, or content that is any combination thereof, was created or altered by the covered providers generative artificial intelligence system.This bill, the Leading Ethical AI Development (LEAD) for Kids Act, would establish, and provide for the membership of, the LEAD for Kids Standards Board in the Government Operations Agency and require the Governor to appoint an executive officer of the board, subject to Senate confirmation, who would hold the office at the pleasure of the Governor. The act would require, on or before January 1, 2027, the board to adopt regulations governing, among other things, criteria for determining the level of estimated risk of a covered product based on an analysis that weighs the likelihood and severity of reasonably foreseeable adverse impacts against the anticipated benefits of the covered product and denominating the risk levels, as prescribed. The act would define covered product to mean an artificial intelligence system that is intended to, or highly likely to, be used by children, pursuant to regulations adopted by the board.The act would, among other things, require, on or before July 1, 2027, a developer to do certain things with respect to a covered product, including preparing and submitting to the board a risk level assessment in order to determine the appropriate risk classification of the covered product. The act would authorize the board to refer violations of the act to the Attorney General and would authorize the Attorney General to recover a certain civil penalty, as prescribed. The act would authorize a child who suffers actual harm as a result of a violation of the act, or a parent or guardian acting on behalf of that child, to bring a civil action to recover, among other relief, actual damages.The act would create in the State Treasury the LEAD for Kids AI Fund into which any civil penalty recovered by the Attorney General pursuant to the act is deposited and would make the moneys in the fund available, only upon appropriation by the Legislature, for the purpose of administering the act.
2929
3030 The California AI Transparency Act requires a person that creates, codes, or otherwise produces a generative artificial intelligence system that has over 1,000,000 monthly visitors or users and is publicly accessible within the geographic boundaries of the state to make available an AI detection tool at no cost to the user that, among other things, allows a user to assess whether image, video, or audio content, or content that is any combination thereof, was created or altered by the covered providers generative artificial intelligence system.
3131
32-This bill, the Leading Ethical AI Development (LEAD) for Kids Act, would establish, and provide for the membership of, the LEAD for Kids Standards Board in the Government Operations Agency and require the Governor to appoint an executive officer of the board, subject to Senate confirmation, who would hold the office at the pleasure of the Governor. The act would require, on or before January 1, 2027, 2028, the board to adopt regulations governing, among other things, criteria for determining the level of estimated risk of a covered product based on an analysis that weighs the likelihood and severity of reasonably foreseeable adverse impacts against the anticipated benefits of the covered product and denominating the risk levels, as prescribed. The act would define covered product to mean an artificial intelligence system that is intended to, or highly likely to, be used by children, pursuant to regulations adopted by the board. children, used to process a childs personal information, or applied directly to a child, but the bill would also require the board to adopt regulations governing criteria for developers to determine if an artificial intelligence system is a covered product.
32+This bill, the Leading Ethical AI Development (LEAD) for Kids Act, would establish, and provide for the membership of, the LEAD for Kids Standards Board in the Government Operations Agency and require the Governor to appoint an executive officer of the board, subject to Senate confirmation, who would hold the office at the pleasure of the Governor. The act would require, on or before January 1, 2027, the board to adopt regulations governing, among other things, criteria for determining the level of estimated risk of a covered product based on an analysis that weighs the likelihood and severity of reasonably foreseeable adverse impacts against the anticipated benefits of the covered product and denominating the risk levels, as prescribed. The act would define covered product to mean an artificial intelligence system that is intended to, or highly likely to, be used by children, pursuant to regulations adopted by the board.
3333
34-The act would, among other things, require, on or before July 1, 2027, 2028, a developer to do certain things with respect to a covered product, including preparing and submitting to the board a any risk level assessment required by regulation in order to determine the appropriate risk classification of the covered product. The act would authorize the board to refer violations of the act to the Attorney General and would authorize the Attorney General to recover a certain civil penalty, as prescribed. The act would authorize a child who suffers actual harm as a result of a violation of the act, the use of a covered product, or a parent or guardian acting on behalf of that child, to bring a civil action to recover, among other relief, actual damages.
34+The act would, among other things, require, on or before July 1, 2027, a developer to do certain things with respect to a covered product, including preparing and submitting to the board a risk level assessment in order to determine the appropriate risk classification of the covered product. The act would authorize the board to refer violations of the act to the Attorney General and would authorize the Attorney General to recover a certain civil penalty, as prescribed. The act would authorize a child who suffers actual harm as a result of a violation of the act, or a parent or guardian acting on behalf of that child, to bring a civil action to recover, among other relief, actual damages.
3535
3636 The act would create in the State Treasury the LEAD for Kids AI Fund into which any civil penalty recovered by the Attorney General pursuant to the act is deposited and would make the moneys in the fund available, only upon appropriation by the Legislature, for the purpose of administering the act.
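The digest above turns on a single decision procedure: weigh the likelihood and severity of reasonably foreseeable adverse impacts against the anticipated benefits of the covered product, then denominate the result into one of four risk levels (prohibited, high, moderate, low). As a reading aid only, here is a minimal Python sketch of that weighing; the four tier names come from the bill, but the numeric scales, thresholds, and every function and class name below are hypothetical assumptions of this sketch, not anything the act or the board prescribes.

```python
# Hypothetical sketch only: the tier names come from the bill's digest, but the
# numeric scoring, thresholds, and names below are invented for illustration
# and are NOT prescribed by AB 1064 or the LEAD for Kids Standards Board.
from dataclasses import dataclass

@dataclass
class AdverseImpact:
    likelihood: float  # estimated probability of the impact occurring, 0.0-1.0
    severity: float    # estimated magnitude of the impact, 0.0-1.0

def denominate_risk(impacts: list[AdverseImpact], anticipated_benefit: float) -> str:
    """Return one of the bill's four risk tiers for a covered product.

    Weighs the likelihood and severity of reasonably foreseeable adverse
    impacts against the anticipated benefit (all scales hypothetical).
    """
    expected_cost = sum(i.likelihood * i.severity for i in impacts)
    if not impacts or expected_cost < 0.05:
        return "low"         # few, if any, foreseeable adverse impacts
    if expected_cost > anticipated_benefit:
        return "prohibited"  # costs likely outweigh the benefits
    if anticipated_benefit > 2 * expected_cost:
        return "moderate"    # benefits clearly outweigh the costs
    return "high"            # benefits may outweigh the costs

if __name__ == "__main__":
    impacts = [AdverseImpact(likelihood=0.3, severity=0.6),
               AdverseImpact(likelihood=0.1, severity=0.9)]
    print(denominate_risk(impacts, anticipated_benefit=0.5))  # -> "high"
```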
3737
3838 ## Digest Key
3939
4040 ## Bill Text
4141
4343
4444 The people of the State of California do enact as follows:
4545
48-SECTION 1. Chapter 25.1 (commencing with Section 22757.20) is added to Division 8 of the Business and Professions Code, to read: CHAPTER 25.1. Leading Ethical AI Development (LEAD) for Kids

22757.20. This chapter shall be known as the Leading Ethical AI Development (LEAD) for Kids Act.

22757.21. For purposes of this chapter:(a) Adverse impact means a significant negative impact to a child's health, safety, privacy, educational opportunities or outcomes, or access to essential services or benefits.(b) Artificial intelligence means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.(c) Biometric information has the meaning defined in Section 1798.140 of the Civil Code.(d) Board means the LEAD for Kids Standards Board created pursuant to this chapter.(e) Child means a natural person under 18 years of age who resides in this state.(f) Companion chatbot means a generative artificial intelligence system with a natural language interface that provides adaptive, human-like responses to user inputs and is intended to, or foreseeably will, be used to meet a user's social needs, exhibits anthropomorphic features, and is able to sustain a relationship with the user across multiple interactions.(g) Consent means affirmative, written agreement to a specific purpose that is disclosed in clear and conspicuous terms to the parent or guardian.(h) Covered product means an artificial intelligence system that is any of the following:(1) Used by children.(2) Used to process a child's personal information.(3) Applied directly to a child.(i) Deployer means a person, partnership, state or local governmental agency, corporation, or developer that uses a covered product for a commercial or public purpose.(j) Developer means a person, partnership, state or local governmental agency, corporation, or deployer that designs, codes, substantially modifies, or otherwise produces a covered product.(k) Generative artificial intelligence means artificial intelligence that can generate derived synthetic content, including text, images, video, and audio, that emulates the structure and characteristics of the artificial intelligence's training data.(l) Incident means a discrete occurrence of an adverse impact to a child caused by a covered product.(m) Personal information has the meaning defined in Section 1798.140 of the Civil Code.(n) Risk means the composite measure of an event's likelihood of occurring and the magnitude or degree of any adverse impact of the corresponding event.(o) Risk level assessment means a structured evaluation of a covered product's known or reasonably foreseeable risks to children.(p) Social score means an evaluation or classification of a child or group of children based on social behavior or personal characteristics for a purpose that is likely to result in an adverse impact to the child or children and is either of the following:(1) Unrelated to the context in which the information relating to the social behavior or personal characteristics was gathered.(2) Disproportionate or unjustified relative to the social behavior.(q) Substantially modify means to create a new version, release, update, or other modification to a covered product that materially changes its uses or outputs.(r) System information label means a consumer-facing label that includes information about a covered product's purpose, functioning, data sources, and risk level.(s) Trade secrets has the meaning defined in Section 3426.1 of the Civil Code.

22757.22. (a) (1) There is hereby established the LEAD for Kids Standards Board in the Government Operations Agency. The Governor shall appoint an executive officer of the board, subject to Senate confirmation, who shall hold the office at the pleasure of the Governor. The executive officer shall be the administrative head of the board and shall exercise all duties and functions necessary to ensure that the responsibilities of the board are successfully discharged.(2) The board shall be composed of the following nine members:(A) A member of academia appointed by the Governor and subject to Senate confirmation.(B) An artificial intelligence developer or representative of a company that develops artificial intelligence systems appointed by the Governor and subject to Senate confirmation.(C) A member of civil society appointed by the Governor and subject to Senate confirmation.(D) An expert in technology ethics appointed by the Governor and subject to Senate confirmation.(E) An expert in education appointed by the Governor and subject to Senate confirmation.(F) A member of academia with expertise in artificial intelligence appointed by the Speaker of the Assembly.(G) A member of academia with expertise in social science appointed by the Speaker of the Assembly.(H) Two members appointed by the Senate Committee on Rules.(3) A member of the board shall meet all of the following criteria:(A) (i) A member shall be free of direct and indirect external influence and shall not seek or take instructions from another.(ii) A member's employment by a company that develops artificial intelligence does not by itself constitute a violation of this subparagraph.(B) A member shall not take an action or engage in an occupation, whether gainful or not, that is incompatible with the member's duties.(C) A member shall not, either at the time of the member's appointment or during the member's term, have a financial interest in an entity that is subject to regulation by the board.(4) A member of the board shall serve at the pleasure of the appointing authority but shall serve for no longer than eight consecutive years.(b) (1) The board shall ensure that regulations adopted pursuant to this chapter are consistent with widely accepted standards for governance of artificial intelligence, taking into account technological standards, technological advances, scientific literature and advances, and societal changes as they pertain to risks posed to children by covered products.(2) The board shall consult with individuals from the public and state agencies who possess expertise directly related to the board's functions, including technical, ethical, regulatory, and other relevant issue areas.(c) On or before January 1, 2028, the board shall adopt regulations governing all of the following:(1) Criteria for developers to determine if an artificial intelligence system is a covered product subject to this chapter.(2) Criteria for determining the level of estimated risk of a covered product based on an analysis that weighs the likelihood and severity of reasonably foreseeable adverse impacts against the anticipated benefits of the covered product and denominating the risk levels pursuant to all of the following:(A) Prohibited risk, which shall be applied to a covered product for which the costs of foreseeable adverse impacts likely outweigh the benefits and includes, but is not limited to, all of the following:(i) A companion chatbot that can foreseeably do any of the following:(I) Attempt to provide mental health therapy to the child.(II) Cause the child to develop a harmful ongoing emotional attachment to the companion chatbot.(III) Manipulate the child to engage in harmful behavior.(ii) A covered product used to do any of the following:(I) Collect or process a child's biometric information for any purpose other than confirming a child's identity, with the consent of the child's parent or guardian, in order to grant access to a service, unlock a device, or provide physical access to an educational institution.(II) Generate a social score.(III) (ia) Assess the emotional state of a child. (ib) This subclause does not apply to an assessment of the emotional state of a child in a medical setting with the consent of the child's parent or guardian or that is needed to provide emergency care if the child's parent or guardian is unavailable.(IV) Scrape an image that the developer or deployer knows, or reasonably should know, is a child's face from the internet or from surveillance footage without the consent of the child's parent or guardian.(B) High risk, which shall be applied to a covered product for which the benefits may outweigh the costs of foreseeable adverse impacts and includes, but is not limited to, a covered product that does any of the following:(i) Performs a function related to pupil assessment or discipline, including, but not limited to, a covered product that determines access or admission, assigns children to educational institutions or programs, evaluates learning outcomes of children, assesses the appropriate level of education for a child, materially influences the level of education a child will receive or be able to access, or monitors and detects prohibited behavior of students during tests.(ii) Target advertisements to children.(iii) For a specific purpose that would otherwise qualify as a prohibited risk, as set forth in regulations adopted by the board, if the use is strictly necessary to ensure a child's mental or physical health or safety.(C) Moderate risk, which shall be applied to a covered product for which the benefits reasonably outweigh the costs of foreseeable adverse impacts.(D) Low risk, which shall be applied to a covered product for which there are few, if any, foreseeable adverse impacts.(3) Guidance for developers to classify covered products according to risk level, as described in paragraph (2).(4) Reasonable steps a developer of a prohibited risk covered product is required to take to ensure that children are not able to access the product.(5) Requirements for predeployment and postdeployment assessments, including, but not limited to, an assessment of the purpose for which the covered product is intended, technical capabilities, limitations and functionality, specific adverse impacts, internal governance, and the timing for the development and submission to the board of those evaluations and assessments. The board shall also provide guidance consistent with Section 22757.28 to avoid duplication of efforts with respect to any other state or federal laws that require similar documentation.(6) Requirements for artificial intelligence information labels to ensure that, for each covered product, the public is able to access baseline information on the covered product, including the covered product's purpose, a description of how it works, its risk level, potential adverse impacts, and any other information necessary to assess the impact of the system on children.(7) Standards for audits of covered products, including the timing of audits, qualifications and training of auditors, rules governing auditor independence and oversight, and audit reports that auditors are required to provide to the board. The board shall also establish rules for the protection of trade secrets in connection with the performance of audits.(8) The creation of an incident reporting mechanism that enables third parties to report potential incidents of adverse impacts resulting from the use of a covered product directly to a developer or the board.(9) The creation of a publicly accessible registry for covered products that contains high-level summaries of audit reports, incident reports, system information labels, and any additional information specified by the board.(10) (A) Registration fees for developers that do not exceed the reasonable regulatory costs incident to administering this chapter.(B) A registration fee described by this paragraph shall be deposited into the LEAD for Kids AI Fund established pursuant to Section 22757.27.

22757.23. (a) On or before July 1, 2028, a developer shall do all of the following with respect to a covered product:(1) Register the covered product using the registry developed by the board.(2) Prepare and submit to the board any risk level assessment required by regulation in order to determine the appropriate risk classification of the covered product.(3) Develop and implement an artificial intelligence system information label for the covered product, as required by regulation.(b) In addition to the duties required under subdivision (a), all of the following apply:(1) With respect to a covered product that poses a prohibited risk, the developer shall take reasonable steps to prevent children from accessing the product.(2) With respect to a high-risk covered product, the developer shall conduct predeployment and postdeployment assessments pursuant to the requirements established by the board.(c) With respect to incident reports, a developer shall do all of the following:(1) Within 30 days of learning of an incident, file a report with the board with any required information.(2) Within 30 days of the substantiation of the incident by the board, file a description of the incident on the developer's internet website.(d) With respect to licensing the covered product to deployers, a developer shall do both of the following:(1) Ensure that the terms of the license require it to be used in a manner that would not elevate the covered product's risk level to a higher level of risk.(2) Revoke the license if the developer knows, or should know, that the deployer is using the covered product in a manner that is inconsistent with the terms required under paragraph (1).(e) A developer shall not knowingly or recklessly use the personal information of a child to train a covered product unless the child, if the child is at least 13 years of age and less than 16 years of age, or the child's parent or guardian, if the child is less than 13 years of age, has affirmatively provided written consent to the developer to use the child's personal information for that specific purpose.(f) (1) On or after July 1, 2028, a developer shall submit a covered product it develops to an independent third party audit on a schedule determined by the board according to the risk level posed by the covered product.(2) A developer whose covered product is subject to an audit shall provide the auditor with all necessary documentation and information for the auditor to perform the audit.(3) If an auditor discovers substantial noncompliance with this chapter, the auditor shall promptly notify the board.

22757.24. (a) A deployer of a prohibited risk covered product shall implement any applicable procedures adopted by the developer to prevent a child from accessing the product.(b) A deployer of a covered product shall publicly display developer license usage requirements. A deployer's usage requirements shall not change the covered product's risk level to a higher level of risk.(c) With respect to incident reports, a deployer shall do both of the following:(1) Within 30 days of learning of the incident, file a report with the board with any required information.(2) Within 30 days of the substantiation of the incident by the board, file a description of the incident on the deployer's internet website.(d) A deployer shall not enter a data sharing agreement that allows the developer to train a covered product with the personal information of a child unless the child, if the child is at least 13 years of age and less than 16 years of age, or the child's parent or guardian, if the child is less than 13 years of age, has affirmatively provided written consent to the deployer to use the child's personal information for that specific purpose.

22757.25. A developer or deployer, or any contractor or subcontractor of a developer or deployer, shall not do any of the following:(a) Prevent an employee from disclosing information to the Attorney General pertaining to a reasonable belief supporting the existence of a potential violation of this chapter.(b) Retaliate against an employee for disclosing information under subdivision (a).(c) Make false or materially misleading statements related to its compliance with obligations imposed under this chapter.

22757.26. (a) The board may refer violations of this chapter to the Attorney General.(b) With respect to violations related to the risk level classification of a covered product, the board may allow the developer to take corrective action if the board determines that the circumstances indicate that the erroneous classification was neither unreasonable nor in bad faith. If the developer fails to do so within 30 days, the board may refer the matter to the Attorney General.(c) Upon receiving a referral from the board, the Attorney General may bring an action for all of the following: (1) A civil penalty of twenty-five thousand dollars ($25,000) for each violation.(2) Injunctive or declaratory relief.(3) Reasonable attorney's fees.(d) A child who suffers actual harm as a result of the use of a covered product, or a parent or guardian acting on behalf of that child, may bring a civil action to recover all of the following:(1) Actual damages.(2) Punitive damages.(3) Reasonable attorney's fees and costs.(4) Injunctive or declaratory relief.(5) Any other relief the court deems proper.

22757.27. (a) There is hereby created in the State Treasury the LEAD for Kids AI Fund into which any civil penalty recovered by the Attorney General pursuant to Section 22757.26 shall be deposited.(b) Moneys in the fund shall be available, only upon appropriation by the Legislature, for the purpose of administering this chapter.

22757.28. (a) A developer or deployer who is required to comply with another law of this state that requires risk assessment of a covered product that is equally or more stringent than this chapter need not comply with any duplicative requirements under this chapter.(b) Before January 1, 2028, the board shall publish a description of the laws described by subdivision (a) and provide guidance to developers and deployers regarding compliance with subdivision (a).(c) A developer or deployer that relies on the guidance provided under subdivision (b) is presumed to be compliant with subdivision (a).
48+SECTION 1. Chapter 25.1 (commencing with Section 22757.20) is added to Division 8 of the Business and Professions Code, to read: CHAPTER 25.1. Leading Ethical AI Development (LEAD) for Kids22757.20. This chapter shall be known as the Leading Ethical AI Development (LEAD) for Kids Act.22757.21. For purposes of this chapter:(a) Adverse impacts are significant negative impacts to a childs health, safety, privacy, educational opportunities or outcomes, or access to essential services or benefits.(b) Artificial intelligence means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.(c) Board means the LEAD for Kids Standards Board created pursuant to this chapter.(d) Child means a natural person under 18 years of age who resides in this state.(e) Covered product means an artificial intelligence system that is intended to, or highly likely to, be used by children, pursuant to regulations adopted by the board.(f) Deployer means a person, partnership, state or local governmental agency, corporation, or developer, or any contract or agent of those entities, that uses a covered product for a commercial or public purpose.(g) Developer means a person, partnership, state or local governmental agency, corporation, or deployer that designs, codes, substantially modifies, or otherwise produces a covered product.(h) Incident means a discreet occurrence of an adverse impact to a child caused by a covered product.(i) Personal information has the meaning defined in Section 1798.140 of the Civil Code.(j) Prohibited covered product means a product that poses a prohibited risk pursuant to regulations adopted by the board.(k) Risk means the composite measure of an events likelihood of occurring and the magnitude or degree of the consequences of the corresponding event.(l) Risk level assessment means a structured evaluation of an artificial intelligences known or reasonably foreseeable risks to children.(m) Substantially modifies means to create a new version, release, update, or other modification to a covered product that materially changes its uses or outputs.(n) System information label means a consumer-facing label that includes information about a covered products purpose, functioning, data sources, and risk level.(o) Trade secrets has the meaning defined in Section 3426.1 of the Civil Code.22757.22. (a) (1) There is hereby established the LEAD for Kids Standards Board in the Government Operations Agency. The Governor shall appoint an executive officer of the board, subject to Senate confirmation, who shall hold the office at the pleasure of the Governor. 
The executive officer shall be the administrative head of the board and shall exercise all duties and functions necessary to ensure that the responsibilities of the board are successfully discharged.(2) The board shall be composed of the following nine members:(A) A member of academia appointed by the Governor and subject to Senate confirmation.(B) A technologist appointed by the Governor and subject to Senate confirmation.(C) A member of civil society appointed by the Governor and subject to Senate confirmation.(D) An expert in technology ethics appointed by the Governor and subject to Senate confirmation.(E) An expert in education appointed by the Governor and subject to Senate confirmation.(F) A member of academia with expertise in artificial intelligence appointed by the Speaker of the Assembly.(G) A member of academia with expertise in social science appointed by the Speaker of the Assembly.(H) Two members appointed by the Senate Committee on Rules.(3) A member of the board shall meet all of the following criteria:(A) A member shall be free of direct and indirect external influence and shall not seek or take instructions from another.(B) A member shall not take an action or engage in an occupation, whether gainful or not, that is incompatible with the members duties.(C) A member shall not, either at the time of the members appointment or during the members term, have a financial interest in an entity that is subject to regulation by the board.(4) A member of the board shall serve at the pleasure of the appointing authority but shall serve for no longer than eight consecutive years.(b) (1) The board shall ensure that regulations adopted pursuant to this chapter are consistent with widely accepted standards for governance of artificial intelligence, taking into account technological standards, technological advances, scientific literature and advances, and societal changes as they pertain to risks posed to children by covered products.(2) The board shall consult with individuals from the public who possess expertise directly related to the boards functions, including technical, ethical, regulatory, and other relevant areas.(c) On or before January 1, 2027, the board shall adopt regulations governing all of the following:(1) Criteria for developers to determine if an artificial intelligence system is subject to this chapter.(2) Criteria for determining the level of estimated risk of a covered product based on an analysis that weighs the likelihood and severity of reasonably foreseeable adverse impacts against the anticipated benefits of the covered product and denominating the risk levels pursuant to all of the following:(A) Prohibited risk, which shall be applied to a covered product for which the costs of foreseeable adverse impacts likely outweigh the benefits and includes, but is not limited to, all of the following:(i) Anthropomorphic chatbots that offer companionship and are likely to cause the child to develop an ongoing emotional attachment or to manipulate the childs behavior in harmful ways.(ii) Artificial intelligence used in educational settings that collects or processes biometric data of children.(iii) Social scoring systems based on a childs behavior or personal characteristics.(iv) Artificial intelligence that detects the emotions of children.(v) Artificial intelligence used to develop facial recognition databases through untargeted scraping of childrens facial images from the internet or surveillance footage.(B) High risk, which shall be applied to a covered product for 
which the benefits may outweigh the costs of foreseeable adverse impacts and includes, but is not limited to, using artificial intelligence to do any of the following:(i) Perform a function related to pupil assessment or discipline.(ii) Target advertisements to children.(iii) For a specific purpose that would otherwise qualify as a prohibited risk, as set forth in regulations adopted by the board, provided that the use is strictly necessary to prevent threats to health or safety.(C) Moderate risk, which shall be applied to a covered product for which the benefits convincingly outweigh the costs of foreseeable adverse impacts.(D) Low risk, which shall be applied to a covered product for which there are few, if any, foreseeable adverse impacts.(3) Guidance for developers to classify covered products according to risk level, as described in paragraph (2).(4) Reasonable steps a developer of a prohibited risk covered product is required to take to ensure that children are not able to access the product.(5) Requirements for predeployment and postdeployment assessments, including, but not limited to, the purpose for which the covered product is intended, technical capabilities, limitations and functionality, specific adverse impacts, internal governance, and the timing for the development and submission to the board of those evaluations and assessments. The board shall also provide guidance to avoid duplication of efforts with respect to any other state or federal laws that require similar documentation.(6) Requirements for artificial intelligence information labels to ensure that, for each covered product, the public is able to access baseline information on the covered product, including the covered product's purpose, a description of how it works, its risk level, potential adverse impacts, and any other information necessary to assess the impact of the system on children.(7) Standards for audits of covered products, including the timing of audits, qualifications and training of auditors, rules governing auditor independence and oversight, and audit reports that auditors are required to provide to the board. The board shall also establish rules for the protection of trade secrets in connection with the performance of audits.(8) The creation of an incident reporting mechanism that enables third parties to report incidents of adverse impacts resulting from the use of a covered product directly to a developer or the board.(9) The creation of a publicly accessible registry for covered products that contains high-level summaries of audit reports, incident reports, system information labels, and any additional information specified by the board.(10) (A) Registration fees for developers that do not exceed the reasonable regulatory costs incident to administering this chapter.(B) A registration fee described by this paragraph shall be deposited into the LEAD for Kids AI Fund established pursuant to Section 22757.27.
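Paragraph (2) of subdivision (c) above states the classification mechanic in prose: weigh the likelihood and severity of reasonably foreseeable adverse impacts against the anticipated benefits, then denominate one of four risk levels. A minimal sketch of how a developer's internal tooling might operationalize that weighing is below. The function name, numeric scales, and tier thresholds are all hypothetical — the act leaves the actual criteria to the board's regulations — and the act's per-se prohibited categories (companion chatbots, social scoring, and so on) would override any numeric weighing; the sketch covers only the cost-benefit prong.

```python
from enum import Enum


class RiskLevel(Enum):
    PROHIBITED = "prohibited"  # costs of foreseeable adverse impacts likely outweigh benefits
    HIGH = "high"              # benefits may outweigh costs
    MODERATE = "moderate"      # benefits convincingly outweigh costs
    LOW = "low"                # few, if any, foreseeable adverse impacts


def classify_risk(likelihood: float, severity: float, benefit: float) -> RiskLevel:
    """Hypothetical weighing under Section 22757.22(c)(2).

    All inputs are assumed normalized to [0, 1]; the act prescribes no
    scales or thresholds, so the cutoffs below are illustrative only.
    """
    # The act defines "risk" as a composite of likelihood and magnitude,
    # so a simple expected-harm product stands in for that composite here.
    expected_harm = likelihood * severity
    if expected_harm > benefit:
        return RiskLevel.PROHIBITED
    if expected_harm > 0.5 * benefit:
        return RiskLevel.HIGH
    if expected_harm > 0.1:
        return RiskLevel.MODERATE
    return RiskLevel.LOW
```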
22757.23. (a) On or before July 1, 2027, a developer shall do all of the following with respect to a covered product:(1) Register the covered product using the registry developed by the board.(2) Prepare and submit to the board a risk level assessment in order to determine the appropriate risk classification of the covered product.(3) Develop and implement an artificial intelligence information label for the covered product.(b) In addition to the duties required under subdivision (a), all of the following apply:(1) With respect to a covered product that poses a prohibited risk, the developer shall take reasonable steps to ensure that children are not able to access the product.(2) With respect to a high-risk covered product, the developer shall conduct predeployment and postdeployment assessments pursuant to the requirements established by the board.(c) With respect to incident reports, a developer shall do all of the following:(1) Within 30 days of learning of an incident, file a report with the board with any required information.(2) Within 30 days of the substantiation of the incident by the board, file a description of the incident on the developer's internet website.(d) With respect to licensing the covered product to deployers, a developer shall do both of the following:(1) Ensure that the terms of the license require it to be used in a manner that would not change the covered product's risk level to a higher level of risk.(2) Revoke the license if the developer knows, or should know, that the deployer is using the covered product in a manner that is inconsistent with the terms required under paragraph (1).(e) A developer shall not train a covered product with the personal information of a child unless the child's parent or guardian has affirmatively provided written consent to the developer to use the child's personal information for that specific purpose.(f) (1) On or after July 1, 2027, a developer shall submit a covered product it develops to an independent third party audit on a schedule determined by the board according to the risk level posed by the covered product.(2) A developer whose covered product is subject to an audit shall provide the auditor with all necessary documentation and information for the auditor to perform the audit.(3) If an auditor discovers substantial noncompliance with this chapter, the auditor shall promptly notify the board. 22757.24. (a) A deployer of a prohibited risk covered product shall implement any applicable procedures adopted by the developer to ensure that a child is not able to access the product.(b) A deployer of a covered product shall publicly display developer license usage requirements. A deployer's usage requirements shall not change the covered product's risk level to a higher level of risk.(c) With respect to incident reports, a deployer shall do both of the following:(1) Within 30 days of learning of the incident, file a report with the board with any required information.(2) Within 30 days of the substantiation of the incident by the board, file a description of the incident on the deployer's internet website.(d) A deployer shall not opt in to a data sharing agreement that allows the developer to train a covered product with the personal information of a child unless the child's parent or guardian has affirmatively provided written consent to the deployer to use the child's personal information for that specific purpose.
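The incident-reporting duties in Sections 22757.23(c) and 22757.24(c) run on two separate 30-day clocks: one from the day the developer or deployer learns of the incident (filing with the board), and one from the board's substantiation of it (posting on the filer's internet website). A trivial sketch of that deadline arithmetic, with hypothetical function names:

```python
from datetime import date, timedelta

# Both windows in 22757.23(c) and 22757.24(c) are 30 days.
REPORTING_WINDOW = timedelta(days=30)


def board_report_deadline(learned_of_incident: date) -> date:
    """Last day to file the incident report with the board."""
    return learned_of_incident + REPORTING_WINDOW


def website_posting_deadline(substantiated_by_board: date) -> date:
    """Last day to post the incident description on the filer's website.

    Note this clock starts at substantiation by the board, not at the
    original filing, so the two deadlines are independent.
    """
    return substantiated_by_board + REPORTING_WINDOW
```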
22757.25. A developer or deployer, or any contractor or subcontractor of a developer or deployer, shall not do any of the following:(a) Prevent an employee from disclosing information to the Attorney General pertaining to a reasonable belief supporting the existence of a potential violation of this chapter.(b) Retaliate against an employee for disclosing information under subdivision (a).(c) Make false or materially misleading statements related to its compliance with obligations imposed under this chapter. 22757.26. (a) The board may refer violations of this chapter to the Attorney General.(b) With respect to violations related to the risk level classification of a covered product, the board may allow the developer to take corrective action if the board determines that the circumstances indicate that the erroneous classification was neither unreasonable nor in bad faith. If the developer fails to do so within 30 days, the board may refer the matter to the Attorney General.(c) Upon receiving a referral from the board, the Attorney General may bring an action for all of the following: (1) A civil penalty of twenty-five thousand dollars ($25,000) for each violation.(2) Injunctive or declaratory relief.(3) Reasonable attorney's fees.(d) A child who suffers actual harm as a result of a violation of this chapter, or a parent or guardian acting on behalf of that child, may bring a civil action to recover all of the following:(1) Actual damages.(2) Punitive damages.(3) Reasonable attorney's fees and costs.(4) Injunctive or declaratory relief.(5) Any other relief the court deems proper. 22757.27. (a) There is hereby created in the State Treasury the LEAD for Kids AI Fund into which any civil penalty recovered by the Attorney General pursuant to Section 22757.26 shall be deposited.(b) Moneys in the fund shall be available, only upon appropriation by the Legislature, for the purpose of administering this chapter.
50 SECTION 1. Chapter 25.1 (commencing with Section 22757.20) is added to Division 8 of the Business and Professions Code, to read:
52 ### SECTION 1.
54- CHAPTER 25.1. Leading Ethical AI Development (LEAD) for Kids 22757.20. This chapter shall be known as the Leading Ethical AI Development (LEAD) for Kids Act. 22757.21. For purposes of this chapter:(a) Adverse impacts are impact means a significant negative impacts impact to a child's health, safety, privacy, educational opportunities or outcomes, or access to essential services or benefits.(b) Artificial intelligence means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.(c) Biometric information has the meaning defined in Section 1798.140 of the Civil Code.(c)(d) Board means the LEAD for Kids Standards Board created pursuant to this chapter.(d)(e) Child means a natural person under 18 years of age who resides in this state.(f) Companion chatbot means a generative artificial intelligence system with a natural language interface that provides adaptive, human-like responses to user inputs and is intended to, or foreseeably will, be used to meet a user's social needs, exhibits anthropomorphic features, and is able to sustain a relationship with the user across multiple interactions.(g) Consent means affirmative, written agreement to a specific purpose that is disclosed in clear and conspicuous terms to the parent or guardian.(e)(h) Covered product means an artificial intelligence system that is intended to, or highly likely to, be used by children, pursuant to regulations adopted by the board. any of the following:(1) Used by children.(2) Used to process a child's personal information.(3) Applied directly to a child.(f)(i) Deployer means a person, partnership, state or local governmental agency, corporation, or developer, or any contractor or agent of those entities, developer that uses a covered product for a commercial or public purpose.(g)(j) Developer means a person, partnership, state or local governmental agency, corporation, or deployer that designs, codes, substantially modifies, or otherwise produces a covered product.(k) Generative artificial intelligence means artificial intelligence that can generate derived synthetic content, including text, images, video, and audio, that emulates the structure and characteristics of the artificial intelligence's training data.(h)(l) Incident means a discrete occurrence of an adverse impact to a child caused by a covered product.(i)(m) Personal information has the meaning defined in Section 1798.140 of the Civil Code.(j) Prohibited covered product means a product that poses a prohibited risk pursuant to regulations adopted by the board.(k)(n) Risk means the composite measure of an event's likelihood of occurring and the magnitude or degree of the consequences any adverse impact of the corresponding event.(l)(o) Risk level assessment means a structured evaluation of an artificial intelligence's a covered product's known or reasonably foreseeable risks to children.(p) Social score means an evaluation or classification of a child or group of children based on social behavior or personal characteristics for a purpose that is likely to result in an adverse impact to the child or children and is either of the following:(1) Unrelated to the context in which the information relating to the social behavior or personal characteristics was gathered.(2) Disproportionate or unjustified relative to the social behavior.(m)(q) Substantially modifies modify means to create a new version, 
release, update, or other modification to a covered product that materially changes its uses or outputs.(n)(r) System information label means a consumer-facing label that includes information about a covered product's purpose, functioning, data sources, and risk level.(o)(s) Trade secrets has the meaning defined in Section 3426.1 of the Civil Code. 22757.22. (a) (1) There is hereby established the LEAD for Kids Standards Board in the Government Operations Agency. The Governor shall appoint an executive officer of the board, subject to Senate confirmation, who shall hold the office at the pleasure of the Governor. The executive officer shall be the administrative head of the board and shall exercise all duties and functions necessary to ensure that the responsibilities of the board are successfully discharged.(2) The board shall be composed of the following nine members:(A) A member of academia appointed by the Governor and subject to Senate confirmation.(B) A technologist An artificial intelligence developer or representative of a company that develops artificial intelligence systems appointed by the Governor and subject to Senate confirmation.(C) A member of civil society appointed by the Governor and subject to Senate confirmation.(D) An expert in technology ethics appointed by the Governor and subject to Senate confirmation.(E) An expert in education appointed by the Governor and subject to Senate confirmation.(F) A member of academia with expertise in artificial intelligence appointed by the Speaker of the Assembly.(G) A member of academia with expertise in social science appointed by the Speaker of the Assembly.(H) Two members appointed by the Senate Committee on Rules.(3) A member of the board shall meet all of the following criteria:(A) (i) A member shall be free of direct and indirect external influence and shall not seek or take instructions from another.(ii) A member's employment by a company that develops artificial intelligence does not by itself constitute a violation of this subparagraph.(B) A member shall not take an action or engage in an occupation, whether gainful or not, that is incompatible with the member's duties.(C) A member shall not, either at the time of the member's appointment or during the member's term, have a financial interest in an entity that is subject to regulation by the board.(4) A member of the board shall serve at the pleasure of the appointing authority but shall serve for no longer than eight consecutive years.(b) (1) The board shall ensure that regulations adopted pursuant to this chapter are consistent with widely accepted standards for governance of artificial intelligence, taking into account technological standards, technological advances, scientific literature and advances, and societal changes as they pertain to risks posed to children by covered products.(2) The board shall consult with individuals from the public and state agencies who possess expertise directly related to the board's functions, including technical, ethical, regulatory, and other relevant issue areas.(c) On or before January 1, 2027, 2028, the board shall adopt regulations governing all of the following:(1) Criteria for developers to determine if an artificial intelligence system is a covered product subject to this chapter.(2) Criteria for determining the level of estimated risk of a covered product based on an analysis that weighs the likelihood and severity of reasonably foreseeable adverse impacts against the anticipated benefits of the covered product and denominating the 
risk levels pursuant to all of the following:(A) Prohibited risk, which shall be applied to a covered product for which the costs of foreseeable adverse impacts likely outweigh the benefits and includes, but is not limited to, all of the following:(i) Anthropomorphic chatbots that offer companionship and are likely to cause the child to develop an ongoing emotional attachment or to manipulate the child's behavior in harmful ways.(ii) Artificial intelligence used in educational settings that collects or processes biometric data of children.(iii) Social scoring systems based on a child's behavior or personal characteristics.(iv) Artificial intelligence that detects the emotions of children.(v) Artificial intelligence used to develop facial recognition databases through untargeted scraping of children's facial images from the internet or surveillance footage.(i) A companion chatbot that can foreseeably do any of the following:(I) Attempt to provide mental health therapy to the child.(II) Cause the child to develop a harmful ongoing emotional attachment to the companion chatbot.(III) Manipulate the child to engage in harmful behavior.(ii) A covered product used to do any of the following:(I) Collect or process a child's biometric information for any purpose other than confirming a child's identity, with the consent of the child's parent or guardian, in order to grant access to a service, unlock a device, or provide physical access to an educational institution.(II) Generate a social score.(III) (ia) Assess the emotional state of a child. (ib) This subclause does not apply to an assessment of the emotional state of a child in a medical setting with the consent of the child's parent or guardian or that is needed to provide emergency care if the child's parent or guardian is unavailable.(IV) Scrape an image that the developer or deployer knows, or reasonably should know, is a child's face from the internet or from surveillance footage without the consent of the child's parent or guardian.(B) High risk, which shall be applied to a covered product for which the benefits may outweigh the costs of foreseeable adverse impacts and includes, but is not limited to, using artificial intelligence to do a covered product that does any of the following:(i) Perform Performs a function related to pupil assessment or discipline. 
discipline, including, but not limited to, a covered product that determines access or admission, assigns children to educational institutions or programs, evaluates learning outcomes of children, assesses the appropriate level of education for a child, materially influences the level of education a child will receive or be able to access, or monitors and detects prohibited behavior of students during tests.(ii) Target advertisements to children.(iii) For a specific purpose that would otherwise qualify as a prohibited risk, as set forth in regulations adopted by the board, provided that if the use is strictly necessary to prevent threats to ensure a child's mental or physical health or safety.(C) Moderate risk, which shall be applied to a covered product for which the benefits convincingly reasonably outweigh the costs of foreseeable adverse impacts.(D) Low risk, which shall be applied to a covered product for which there are few, if any, foreseeable adverse impacts.(3) Guidance for developers to classify covered products according to risk level, as described in paragraph (2).(4) Reasonable steps a developer of a prohibited risk covered product is required to take to ensure that children are not able to access the product.(5) Requirements for predeployment and postdeployment assessments, including, but not limited to, an assessment of the purpose for which the covered product is intended, technical capabilities, limitations and functionality, specific adverse impacts, internal governance, and the timing for the development and submission to the board of those evaluations and assessments. The board shall also provide guidance consistent with Section 22757.28 to avoid duplication of efforts with respect to any other state or federal laws that require similar documentation.(6) Requirements for artificial intelligence information labels to ensure that, for each covered product, the public is able to access baseline information on the covered product, including the covered product's purpose, a description of how it works, its risk level, potential adverse impacts, and any other information necessary to assess the impact of the system on children.(7) Standards for audits of covered products, including the timing of audits, qualifications and training of auditors, rules governing auditor independence and oversight, and audit reports that auditors are required to provide to the board. The board shall also establish rules for the protection of trade secrets in connection with the performance of audits.(8) The creation of an incident reporting mechanism that enables third parties to report potential incidents of adverse impacts resulting from the use of a covered product directly to a developer or the board.(9) The creation of a publicly accessible registry for covered products that contains high-level summaries of audit reports, incident reports, system information labels, and any additional information specified by the board.(10) (A) Registration fees for developers that do not exceed the reasonable regulatory costs incident to administering this chapter.(B) A registration fee described by this paragraph shall be deposited into the LEAD for Kids AI Fund established pursuant to Section 22757.27. 22757.23. 
(a) On or before July 1, 2027, 2028, a developer shall do all of the following with respect to a covered product:(1) Register the covered product using the registry developed by the board.(2) Prepare and submit to the board a any risk level assessment required by regulation in order to determine the appropriate risk classification of the covered product.(3) Develop and implement an artificial intelligence system information label for the covered product. product, as required by regulation.(b) In addition to the duties required under subdivision (a), all of the following apply:(1) With respect to a covered product that poses a prohibited risk, the developer shall take reasonable steps to ensure that prevent children are not able to access from accessing the product.(2) With respect to a high-risk covered product, the developer shall conduct predeployment and postdeployment assessments pursuant to the requirements established by the board.(c) With respect to incident reports, a developer shall do all of the following:(1) Within 30 days of learning of an incident, file a report with the board with any required information.(2) Within 30 days of the substantiation of the incident by the board, file a description of the incident on the developer's internet website.(d) With respect to licensing the covered product to deployers, a developer shall do both of the following:(1) Ensure that the terms of the license require it to be used in a manner that would not change elevate the covered product's risk level to a higher level of risk.(2) Revoke the license if the developer knows, or should know, that the deployer is using the covered product in a manner that is inconsistent with the terms required under paragraph (1).(e) A developer shall not knowingly or recklessly use the personal information of a child to train a covered product with the personal information of a child unless the child's parent or guardian unless the child, if the child is at least 13 years of age and less than 16 years of age, or the child's parent or guardian, if the child is less than 13 years of age, has affirmatively provided written consent to the developer to use the child's personal information for that specific purpose.(f) (1) On or after July 1, 2027, 2028, a developer shall submit a covered product it develops to an independent third party audit on a schedule determined by the board according to the risk level posed by the covered product.(2) A developer whose covered product is subject to an audit shall provide the auditor with all necessary documentation and information for the auditor to perform the audit.(3) If an auditor discovers substantial noncompliance with this chapter, the auditor shall promptly notify the board. 22757.24. (a) A deployer of a prohibited risk covered product shall implement any applicable procedures adopted by the developer to ensure that prevent a child is not able to access from accessing the product.(b) A deployer of a covered product shall publicly display developer license usage requirements. 
A deployer's usage requirements shall not change the covered product's risk level to a higher level of risk.(c) With respect to incident reports, a deployer shall do both of the following:(1) Within 30 days of learning of the incident, file a report with the board with any required information.(2) Within 30 days of the substantiation of the incident by the board, file a description of the incident on the deployer's internet website.(d) A deployer shall not opt in to enter a data sharing agreement that allows the developer to train a covered product with the personal information of a child unless the child's parent or guardian child, if the child is at least 13 years of age and less than 16 years of age, or the child's parent or guardian, if the child is less than 13 years of age, has affirmatively provided written consent to the deployer to use the child's personal information for that specific purpose. 22757.25. A developer or deployer, or any contractor or subcontractor of a developer or deployer, shall not do any of the following:(a) Prevent an employee from disclosing information to the Attorney General pertaining to a reasonable belief supporting the existence of a potential violation of this chapter.(b) Retaliate against an employee for disclosing information under subdivision (a).(c) Make false or materially misleading statements related to its compliance with obligations imposed under this chapter. 22757.26. (a) The board may refer violations of this chapter to the Attorney General.(b) With respect to violations related to the risk level classification of a covered product, the board may allow the developer to take corrective action if the board determines that the circumstances indicate that the erroneous classification was neither unreasonable nor in bad faith. If the developer fails to do so within 30 days, the board may refer the matter to the Attorney General.(c) Upon receiving a referral from the board, the Attorney General may bring an action for all of the following: (1) A civil penalty of twenty-five thousand dollars ($25,000) for each violation.(2) Injunctive or declaratory relief.(3) Reasonable attorney's fees.(d) A child who suffers actual harm as a result of a violation of this chapter, the use of a covered product, or a parent or guardian acting on behalf of that child, may bring a civil action to recover all of the following:(1) Actual damages.(2) Punitive damages.(3) Reasonable attorney's fees and costs.(4) Injunctive or declaratory relief.(5) Any other relief the court deems proper. 22757.27. (a) There is hereby created in the State Treasury the LEAD for Kids AI Fund into which any civil penalty recovered by the Attorney General pursuant to Section 22757.26 shall be deposited.(b) Moneys in the fund shall be available, only upon appropriation by the Legislature, for the purpose of administering this chapter. 22757.28. (a) A developer or deployer who is required to comply with another law of this state that requires risk assessment of a covered product that is equally or more stringent than this chapter need not comply with any duplicative requirements under this chapter.(b) Before January 1, 2028, the board shall publish a description of the laws described by subdivision (a) and provide guidance to developers and deployers regarding compliance with subdivision (a).(c) A developer or deployer that relies on the guidance provided under subdivision (b) is presumed to be compliant with subdivision (a).
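As amended, the consent rules in Sections 22757.23(e) and 22757.24(d) are age-tiered: a child at least 13 and under 16 years of age may give the written consent personally, while consent for a child under 13 must come from a parent or guardian. A minimal sketch of that gating logic follows; the function and enum names are hypothetical, and because the amended text quoted above states no rule for 16- and 17-year-olds, the sketch flags that bracket as unaddressed rather than inventing one.

```python
from enum import Enum


class ConsentSource(Enum):
    CHILD = "child"                   # 13 <= age < 16: the child's own written consent
    PARENT_OR_GUARDIAN = "parent"     # age < 13: a parent or guardian's written consent
    UNSPECIFIED = "unspecified"       # 16 <= age < 18: the amended text is silent


def required_consent_source(age: int) -> ConsentSource:
    """Who must give affirmative written consent before a child's personal
    information may be used to train a covered product (22757.23(e)) or
    shared for that purpose under a data sharing agreement (22757.24(d))."""
    if age < 13:
        return ConsentSource.PARENT_OR_GUARDIAN
    if age < 16:
        return ConsentSource.CHILD
    return ConsentSource.UNSPECIFIED
```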
56- CHAPTER 25.1. Leading Ethical AI Development (LEAD) for Kids22757.20. This chapter shall be known as the Leading Ethical AI Development (LEAD) for Kids Act.22757.21. For purposes of this chapter:(a) Adverse impacts are impact means a significant negative impacts impact to a childs health, safety, privacy, educational opportunities or outcomes, or access to essential services or benefits.(b) Artificial intelligence means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.(c) Biometric information has the meaning defined in Section 1798.140 of the Civil Code.(c)(d) Board means the LEAD for Kids Standards Board created pursuant to this chapter.(d)(e) Child means a natural person under 18 years of age who resides in this state.(f) Companion chatbot means a generative artificial intelligence system with a natural language interface that provides adaptive, human-like responses to user inputs and is intended to, or foreseeably will, be used to meet a users social needs, exhibits anthropomorphic features, and is able to sustain a relationship with the user across multiple interactions.(g) Consent means affirmative, written agreement to a specific purpose that is disclosed in clear and conspicuous terms to the parent or guardian.(e)(h) Covered product means an artificial intelligence system that is intended to, or highly likely to, be used by children, pursuant to regulations adopted by the board. any of the following:(1) Used by children.(2) Used to process a childs personal information.(3) Applied directly to a child.(f)(i) Deployer means a person, partnership, state or local governmental agency, corporation, or developer, or any contract or agent of those entities, developer that uses a covered product for a commercial or public purpose.(g)(j) Developer means a person, partnership, state or local governmental agency, corporation, or deployer that designs, codes, substantially modifies, or otherwise produces a covered product.(k) Generative artificial intelligence means artificial intelligence that can generate derived synthetic content, including text, images, video, and audio, that emulates the structure and characteristics of the artificial intelligences training data.(h)(l) Incident means a discreet occurrence of an adverse impact to a child caused by a covered product.(i)(m) Personal information has the meaning defined in Section 1798.140 of the Civil Code.(j)Prohibited covered product means a product that poses a prohibited risk pursuant to regulations adopted by the board.(k)(n) Risk means the composite measure of an events likelihood of occurring and the magnitude or degree of the consequences any adverse impact of the corresponding event.(l)(o) Risk level assessment means a structured evaluation of an artificial intelligences a covered products known or reasonably foreseeable risks to children.(p) Social score means an evaluation or classification of a child or group of children based on social behavior or personal characteristics for a purpose that is likely to result in an adverse impact to the child or children and is either of the following:(1) Unrelated to the context in which the information relating to the social behavior or personal characteristics was gathered.(2) Disproportionate or unjustified relative to the social behavior.(m)(q) Substantially modifies modify means to create a new version, 
release, update, or other modification to a covered product that materially changes its uses or outputs.(n)(r) System information label means a consumer-facing label that includes information about a covered products purpose, functioning, data sources, and risk level.(o)(s) Trade secrets has the meaning defined in Section 3426.1 of the Civil Code.22757.22. (a) (1) There is hereby established the LEAD for Kids Standards Board in the Government Operations Agency. The Governor shall appoint an executive officer of the board, subject to Senate confirmation, who shall hold the office at the pleasure of the Governor. The executive officer shall be the administrative head of the board and shall exercise all duties and functions necessary to ensure that the responsibilities of the board are successfully discharged.(2) The board shall be composed of the following nine members:(A) A member of academia appointed by the Governor and subject to Senate confirmation.(B) A technologist An artificial intelligence developer or representative of a company that develops artificial intelligence systems appointed by the Governor and subject to Senate confirmation.(C) A member of civil society appointed by the Governor and subject to Senate confirmation.(D) An expert in technology ethics appointed by the Governor and subject to Senate confirmation.(E) An expert in education appointed by the Governor and subject to Senate confirmation.(F) A member of academia with expertise in artificial intelligence appointed by the Speaker of the Assembly.(G) A member of academia with expertise in social science appointed by the Speaker of the Assembly.(H) Two members appointed by the Senate Committee on Rules.(3) A member of the board shall meet all of the following criteria:(A) (i) A member shall be free of direct and indirect external influence and shall not seek or take instructions from another.(ii) A members employment by a company that develops artificial intelligence does not by itself constitute a violation of this subparagraph.(B) A member shall not take an action or engage in an occupation, whether gainful or not, that is incompatible with the members duties.(C) A member shall not, either at the time of the members appointment or during the members term, have a financial interest in an entity that is subject to regulation by the board.(4) A member of the board shall serve at the pleasure of the appointing authority but shall serve for no longer than eight consecutive years.(b) (1) The board shall ensure that regulations adopted pursuant to this chapter are consistent with widely accepted standards for governance of artificial intelligence, taking into account technological standards, technological advances, scientific literature and advances, and societal changes as they pertain to risks posed to children by covered products.(2) The board shall consult with individuals from the public and state agencies who possess expertise directly related to the boards functions, including technical, ethical, regulatory, and other relevant issue areas.(c) On or before January 1, 2027, 2028, the board shall adopt regulations governing all of the following:(1) Criteria for developers to determine if an artificial intelligence system is a covered product subject to this chapter.(2) Criteria for determining the level of estimated risk of a covered product based on an analysis that weighs the likelihood and severity of reasonably foreseeable adverse impacts against the anticipated benefits of the covered product and denominating the 
risk levels pursuant to all of the following:(A) Prohibited risk, which shall be applied to a covered product for which the costs of foreseeable adverse impacts likely outweigh the benefits and includes, but is not limited to, all of the following:(i)Anthropomorphic chatbots that offer companionship and are likely to cause the child to develop an ongoing emotional attachment or to manipulate the childs behavior in harmful ways.(ii)Artificial intelligence used in educational settings that collects or processes biometric data of children.(iii)Social scoring systems based on a childs behavior or personal characteristics.(iv)Artificial intelligence that detects the emotions of children.(v)Artificial intelligence used to develop facial recognition databases through untargeted scraping of childrens facial images from the internet or surveillance footage.(i) A companion chatbot that can foreseeably do any of the following:(I) Attempt to provide mental health therapy to the child.(II) Cause the child to develop a harmful ongoing emotional attachment to the companion chatbot.(III) Manipulate the child to engage in harmful behavior.(ii) A covered product used to do any of the following:(I) Collect or process a childs biometric information for any purpose other than confirming a childs identity, with the consent of the childs parent or guardian, in order to grant access to a service, unlock a device, or provide physical access to an educational institution.(II) Generate a social score.(III) (ia) Assess the emotional state of a child. (ib) This subclause does not apply to an assessment of the emotional state of a child in a medical setting with the consent of the childs parent or guardian or that is needed to provide emergency care if the childs parent or guardian is unavailable.(IV) Scrape an image that the developer or deployer knows, or reasonably should know, is a childs face from the internet or from surveillance footage without the consent of the childs parent or guardian.(B) High risk, which shall be applied to a covered product for which the benefits may outweigh the costs of foreseeable adverse impacts and includes, but is not limited to, using artificial intelligence to do a covered product that does any of the following:(i) Perform Performs a function related to pupil assessment or discipline. 
discipline, including, but not limited to, a covered product that determines access or admission, assigns children to educational institutions or programs, evaluates learning outcomes of children, assesses the appropriate level of education for a child, materially influences the level of education a child will receive or be able to access, or monitors and detects prohibited behavior of students during tests.(ii) Target advertisements to children.(iii) For a specific purpose that would otherwise qualify as a prohibited risk, as set forth in regulations adopted by the board, provided that if the use is strictly necessary to prevent threats to ensure a childs mental or physical health or safety.(C) Moderate risk, which shall be applied to a covered product for which the benefits convincingly reasonably outweigh the costs of foreseeable adverse impacts.(D) Low risk, which shall be applied to a covered product for which there are few, if any, foreseeable adverse impacts.(3) Guidance for developers to classify covered products according to risk level, as described in paragraph (2).(4) Reasonable steps a developer of a prohibited risk covered product is required to take to ensure that children are not able to access the product.(5) Requirements for predeployment and postdeployment assessments, including, but not limited to, an assessment of the purpose for which the covered product is intended, technical capabilities, limitations and functionality, specific adverse impacts, internal governance, and the timing for the development and submission to the board of those evaluations and assessments. The board shall also provide guidance consistent with Section 22757.28 to avoid duplication of efforts with respect to any other state or federal laws that require similar documentation.(6) Requirements for artificial intelligence information labels to ensure that, for each covered product, the public is able to access baseline information on the covered product, including the covered products purpose, a description of how it works, its risk level, potential adverse impacts, and any other information necessary to assess the impact of the system on children.(7) Standards for audits of covered products, including the timing of audits, qualifications and training of auditors, rules governing auditor independence and oversight, and audit reports that auditors are required to provide to the board. The board shall also establish rules for the protection of trade secrets in connection with the performance of audits.(8) The creation of an incident reporting mechanism that enables third parties to report potential incidents of adverse impacts resulting from the use of a covered product directly to a developer or the board.(9) The creation of a publicly accessible registry for covered products that contains high-level summaries of audit reports, incident reports, system information labels, and any additional information specified by the board.(10) (A) Registration fees for developers that do not exceed the reasonable regulatory costs incident to administering this chapter.(B) A registration fee described by this paragraph shall be deposited into the LEAD for Kids AI Fund established pursuant to Section 22757.27.22757.23. 
(a) On or before July 1, 2027, 2028, a developer shall do all of the following with respect to a covered product:(1) Register the covered product using the registry developed by the board.(2) Prepare and submit to the board a any risk level assessment required by regulation in order to determine the appropriate risk classification of the covered product.(3) Develop and implement an artificial intelligence system information label for the covered product. product, as required by regulation.(b) In addition to the duties required under subdivision (a), all of the following apply:(1) With respect to a covered product that poses a prohibited risk, the developer shall take reasonable steps to ensure that prevent children are not able to access from accessing the product.(2) With respect to a high-risk covered product, the developer shall conduct predeployment and postdeployment assessments pursuant to the requirements established by the board.(c) With respect to incident reports, a developer shall do all of the following:(1) Within 30 days of learning of an incident, file a report with the board with any required information.(2) Within 30 days of the substantiation of the incident by the board, file a description of the incident on the developers internet website.(d) With respect to licensing the covered product to deployers, a developer shall do both of the following:(1) Ensure that the terms of the license require it to be used in a manner that would not change elevate the covered products risk level to a higher level of risk.(2) Revoke the license if the developer knows, or should know, that the deployer is using the covered product in a manner that is inconsistent with the terms required under paragraph (1).(e) A developer shall not knowingly or recklessly use the personal information of a child to train a covered product with the personal information of a child unless the childs parent or guardian unless the child, if the child is at least 13 years of age and less than 16 years of age, or the childs parent or guardian, if the child is less than 13 years of age, has affirmatively provided written consent to the developer to use the childs personal information for that specific purpose.(f) (1) On or after July 1, 2027, 2028, a developer shall submit a covered product it develops to an independent third party audit on a schedule determined by the board according to the risk level posed by the covered product.(2) A developer whose covered product is subject to an audit shall provide the auditor with all necessary documentation and information for the auditor to perform the audit.(3) If an auditor discovers substantial noncompliance with this chapter, the auditor shall promptly notify the board.22757.24. (a) A deployer of a prohibited risk covered product shall implement any applicable procedures adopted by the developer to ensure that prevent a child is not able to access from accessing the product.(b) A deployer of a covered product shall publicly display developer license usage requirements. 
A deployers usage requirements shall not change the covered products risk level to a higher level of risk.(c) With respect to incident reports, a deployer shall do both of the following:(1) Within 30 days of learning of the incident, file a report with the board with any required information.(2) Within 30 days of the substantiation of the incident by the board, file a description of the incident on the deployers internet website.(d) A deployer shall not opt in to enter a data sharing agreement that allows the developer to train a covered product with the personal information of a child unless the childs parent or guardian child, if the child is at least 13 years of age and less than 16 years of age, or the childs parent or guardian, if the child is less than 13 years of age, has affirmatively provided written consent to the deployer to use the childs personal information for that specific purpose.22757.25. A developer or deployer, or any contractor or subcontractor of a developer or deployer, shall not do any of the following:(a) Prevent an employee from disclosing information to the Attorney General pertaining to a reasonable belief supporting the existence of a potential violation of this chapter.(b) Retaliate against an employee for disclosing information under subdivision (a).(c) Make false or materially misleading statements related to its compliance with obligations imposed under this chapter.22757.26. (a) The board may refer violations of this chapter to the Attorney General.(b) With respect to violations related to the risk level classification of a covered product, the board may allow the developer to take corrective action if the board determines that the circumstances indicate that the erroneous classification was neither unreasonable nor in bad faith. If the developer fails to do so within 30 days, the board may refer the matter to the Attorney General.(c) Upon receiving a referral from the board, the Attorney General may bring an action for all of the following: (1) A civil penalty of twenty-five thousand dollars ($25,000) for each violation.(2) Injunctive or declaratory relief.(3) Reasonable attorneys fees.(d) A child who suffers actual harm as a result of a violation of this chapter, the use of a covered product, or a parent or guardian acting on behalf of that child, may bring a civil action to recover all of the following:(1) Actual damages.(2) Punitive damages.(3) Reasonable attorneys fees and costs.(4) Injunctive or declaratory relief.(5) Any other relief the court deems proper.22757.27. (a) There is hereby created in the State Treasury the LEAD for Kids AI Fund into which any civil penalty recovered by the Attorney General pursuant to Section 22757.26 shall be deposited.(b) Moneys in the fund shall be available, only upon appropriation by the Legislature, for the purpose of administering this chapter.22757.28. (a) A developer or deployer who is required to comply with another law of this state that requires risk assessment of a covered product that is equally or more stringent than this chapter need not comply with any duplicative requirements under this chapter.(b) Before January 1, 2028, the board shall publish a description of the laws described by subdivision (a) and provide guidance to developers and deployers regarding compliance with subdivision (a).(c) A developer or deployer that relies on the guidance provided under subdivision (b) is presumed to be compliant with subdivision (a).
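Read as a compliance-engineering problem, the four risk denominations in Section 22757.22(c)(2) of the amended chapter above, together with the definition of risk in Section 22757.21, amount to a cost-benefit screen: weigh the likelihood and severity of reasonably foreseeable adverse impacts against the anticipated benefits of the covered product, then denominate the result prohibited, high, moderate, or low. The following sketch illustrates only that reading; the 0-to-1 scales, the thresholds, and the classify_risk_level helper are hypothetical, since the bill leaves the operative criteria to regulations adopted by the board.

    # Hypothetical sketch of the Section 22757.22(c)(2) taxonomy; the
    # scales and thresholds below are invented for illustration and are
    # not taken from the bill or from any board regulation.
    from dataclasses import dataclass

    @dataclass
    class RiskLevelAssessment:
        likelihood: float           # chance of a foreseeable adverse impact, 0 to 1
        severity: float             # magnitude of that impact if it occurs, 0 to 1
        anticipated_benefit: float  # anticipated benefit of the covered product, 0 to 1

        @property
        def composite_risk(self) -> float:
            # "Risk" per Section 22757.21: a composite measure of an
            # event's likelihood and the magnitude of its consequences.
            return self.likelihood * self.severity

    def classify_risk_level(a: RiskLevelAssessment) -> str:
        # Map the cost-benefit comparison onto the four denominations.
        if a.composite_risk > a.anticipated_benefit:
            return "prohibited"  # costs of adverse impacts likely outweigh benefits
        if a.composite_risk < 0.05:
            return "low"         # few, if any, foreseeable adverse impacts
        if a.anticipated_benefit > 2 * a.composite_risk:
            return "moderate"    # benefits convincingly outweigh the costs
        return "high"            # benefits may outweigh the costs

    # Worked example: moderate likelihood, serious impact, middling benefit.
    print(classify_risk_level(RiskLevelAssessment(0.5, 0.6, 0.5)))  # -> high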
56+ CHAPTER 25.1. Leading Ethical AI Development (LEAD) for Kids22757.20. This chapter shall be known as the Leading Ethical AI Development (LEAD) for Kids Act.22757.21. For purposes of this chapter:(a) Adverse impacts are significant negative impacts to a childs health, safety, privacy, educational opportunities or outcomes, or access to essential services or benefits.(b) Artificial intelligence means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.(c) Board means the LEAD for Kids Standards Board created pursuant to this chapter.(d) Child means a natural person under 18 years of age who resides in this state.(e) Covered product means an artificial intelligence system that is intended to, or highly likely to, be used by children, pursuant to regulations adopted by the board.(f) Deployer means a person, partnership, state or local governmental agency, corporation, or developer, or any contractor or agent of those entities, that uses a covered product for a commercial or public purpose.(g) Developer means a person, partnership, state or local governmental agency, corporation, or deployer that designs, codes, substantially modifies, or otherwise produces a covered product.(h) Incident means a discrete occurrence of an adverse impact to a child caused by a covered product.(i) Personal information has the meaning defined in Section 1798.140 of the Civil Code.(j) Prohibited covered product means a product that poses a prohibited risk pursuant to regulations adopted by the board.(k) Risk means the composite measure of an events likelihood of occurring and the magnitude or degree of the consequences of the corresponding event.(l) Risk level assessment means a structured evaluation of an artificial intelligences known or reasonably foreseeable risks to children.(m) Substantially modifies means to create a new version, release, update, or other modification to a covered product that materially changes its uses or outputs.(n) System information label means a consumer-facing label that includes information about a covered products purpose, functioning, data sources, and risk level.(o) Trade secrets has the meaning defined in Section 3426.1 of the Civil Code.22757.22. (a) (1) There is hereby established the LEAD for Kids Standards Board in the Government Operations Agency. The Governor shall appoint an executive officer of the board, subject to Senate confirmation, who shall hold the office at the pleasure of the Governor. 
The executive officer shall be the administrative head of the board and shall exercise all duties and functions necessary to ensure that the responsibilities of the board are successfully discharged.(2) The board shall be composed of the following nine members:(A) A member of academia appointed by the Governor and subject to Senate confirmation.(B) A technologist appointed by the Governor and subject to Senate confirmation.(C) A member of civil society appointed by the Governor and subject to Senate confirmation.(D) An expert in technology ethics appointed by the Governor and subject to Senate confirmation.(E) An expert in education appointed by the Governor and subject to Senate confirmation.(F) A member of academia with expertise in artificial intelligence appointed by the Speaker of the Assembly.(G) A member of academia with expertise in social science appointed by the Speaker of the Assembly.(H) Two members appointed by the Senate Committee on Rules.(3) A member of the board shall meet all of the following criteria:(A) A member shall be free of direct and indirect external influence and shall not seek or take instructions from another.(B) A member shall not take an action or engage in an occupation, whether gainful or not, that is incompatible with the members duties.(C) A member shall not, either at the time of the members appointment or during the members term, have a financial interest in an entity that is subject to regulation by the board.(4) A member of the board shall serve at the pleasure of the appointing authority but shall serve for no longer than eight consecutive years.(b) (1) The board shall ensure that regulations adopted pursuant to this chapter are consistent with widely accepted standards for governance of artificial intelligence, taking into account technological standards, technological advances, scientific literature and advances, and societal changes as they pertain to risks posed to children by covered products.(2) The board shall consult with individuals from the public who possess expertise directly related to the boards functions, including technical, ethical, regulatory, and other relevant areas.(c) On or before January 1, 2027, the board shall adopt regulations governing all of the following:(1) Criteria for developers to determine if an artificial intelligence system is subject to this chapter.(2) Criteria for determining the level of estimated risk of a covered product based on an analysis that weighs the likelihood and severity of reasonably foreseeable adverse impacts against the anticipated benefits of the covered product and denominating the risk levels pursuant to all of the following:(A) Prohibited risk, which shall be applied to a covered product for which the costs of foreseeable adverse impacts likely outweigh the benefits and includes, but is not limited to, all of the following:(i) Anthropomorphic chatbots that offer companionship and are likely to cause the child to develop an ongoing emotional attachment or to manipulate the childs behavior in harmful ways.(ii) Artificial intelligence used in educational settings that collects or processes biometric data of children.(iii) Social scoring systems based on a childs behavior or personal characteristics.(iv) Artificial intelligence that detects the emotions of children.(v) Artificial intelligence used to develop facial recognition databases through untargeted scraping of childrens facial images from the internet or surveillance footage.(B) High risk, which shall be applied to a covered product for 
which the benefits may outweigh the costs of foreseeable adverse impacts and includes, but is not limited to, using artificial intelligence to do any of the following:(i) Perform a function related to pupil assessment or discipline.(ii) Target advertisements to children.(iii) For a specific purpose that would otherwise qualify as a prohibited risk, as set forth in regulations adopted by the board, provided that the use is strictly necessary to prevent threats to health or safety.(C) Moderate risk, which shall be applied to a covered product for which the benefits convincingly outweigh the costs of foreseeable adverse impacts.(D) Low risk, which shall be applied to a covered product for which there are few, if any, foreseeable adverse impacts.(3) Guidance for developers to classify covered products according to risk level, as described in paragraph (2).(4) Reasonable steps a developer of a prohibited risk covered product is required to take to ensure that children are not able to access the product.(5) Requirements for predeployment and postdeployment assessments, including, but not limited to, the purpose for which the covered product is intended, technical capabilities, limitations and functionality, specific adverse impacts, internal governance, and the timing for the development and submission to the board of those evaluations and assessments. The board shall also provide guidance to avoid duplication of efforts with respect to any other state or federal laws that require similar documentation.(6) Requirements for artificial intelligence information labels to ensure that, for each covered product, the public is able to access baseline information on the covered product, including the covered products purpose, a description of how it works, its risk level, potential adverse impacts, and any other information necessary to assess the impact of the system on children.(7) Standards for audits of covered products, including the timing of audits, qualifications and training of auditors, rules governing auditor independence and oversight, and audit reports that auditors are required to provide to the board. The board shall also establish rules for the protection of trade secrets in connection with the performance of audits.(8) The creation of an incident reporting mechanism that enables third parties to report incidents of adverse impacts resulting from the use of a covered product directly to a developer or the board.(9) The creation of a publicly accessible registry for covered products that contains high-level summaries of audit reports, incident reports, system information labels, and any additional information specified by the board.(10) (A) Registration fees for developers that do not exceed the reasonable regulatory costs incident to administering this chapter.(B) A registration fee described by this paragraph shall be deposited into the LEAD for Kids AI Fund established pursuant to Section 22757.27.22757.23. 
(a) On or before July 1, 2027, a developer shall do all of the following with respect to a covered product:(1) Register the covered product using the registry developed by the board.(2) Prepare and submit to the board a risk level assessment in order to determine the appropriate risk classification of the covered product.(3) Develop and implement an artificial intelligence information label for the covered product.(b) In addition to the duties required under subdivision (a), all of the following apply:(1) With respect to a covered product that poses a prohibited risk, the developer shall take reasonable steps to ensure that children are not able to access the product.(2) With respect to a high-risk covered product, the developer shall conduct predeployment and postdeployment assessments pursuant to the requirements established by the board.(c) With respect to incident reports, a developer shall do all of the following:(1) Within 30 days of learning of an incident, file a report with the board with any required information.(2) Within 30 days of the substantiation of the incident by the board, file a description of the incident on the developers internet website.(d) With respect to licensing the covered product to deployers, a developer shall do both of the following:(1) Ensure that the terms of the license require it to be used in a manner that would not change the covered products risk level to a higher level of risk.(2) Revoke the license if the developer knows, or should know, that the deployer is using the covered product in a manner that is inconsistent with the terms required under paragraph (1).(e) A developer shall not train a covered product with the personal information of a child unless the childs parent or guardian has affirmatively provided written consent to the developer to use the childs personal information for that specific purpose.(f) (1) On or after July 1, 2027, a developer shall submit a covered product it develops to an independent third party audit on a schedule determined by the board according to the risk level posed by the covered product.(2) A developer whose covered product is subject to an audit shall provide the auditor with all necessary documentation and information for the auditor to perform the audit.(3) If an auditor discovers substantial noncompliance with this chapter, the auditor shall promptly notify the board.22757.24. (a) A deployer of a prohibited risk covered product shall implement any applicable procedures adopted by the developer to ensure that a child is not able to access the product.(b) A deployer of a covered product shall publicly display developer license usage requirements. A deployers usage requirements shall not change the covered products risk level to a higher level of risk.(c) With respect to incident reports, a deployer shall do both of the following:(1) Within 30 days of learning of the incident, file a report with the board with any required information.(2) Within 30 days of the substantiation of the incident by the board, file a description of the incident on the deployers internet website.(d) A deployer shall not opt in to a data sharing agreement that allows the developer to train a covered product with the personal information of a child unless the childs parent or guardian has affirmatively provided written consent to the deployer to use the childs personal information for that specific purpose.22757.25. 
A developer or deployer, or any contractor or subcontractor of a developer or deployer, shall not do any of the following:(a) Prevent an employee from disclosing information to the Attorney General pertaining to a reasonable belief supporting the existence of a potential violation of this chapter.(b) Retaliate against an employee for disclosing information under subdivision (a).(c) Make false or materially misleading statements related to its compliance with obligations imposed under this chapter.22757.26. (a) The board may refer violations of this chapter to the Attorney General.(b) With respect to violations related to the risk level classification of a covered product, the board may allow the developer to take corrective action if the board determines that the circumstances indicate that the erroneous classification was neither unreasonable nor in bad faith. If the developer fails to do so within 30 days, the board may refer the matter to the Attorney General.(c) Upon receiving a referral from the board, the Attorney General may bring an action for all of the following: (1) A civil penalty of twenty-five thousand dollars ($25,000) for each violation.(2) Injunctive or declaratory relief.(3) Reasonable attorneys fees.(d) A child who suffers actual harm as a result of a violation of this chapter, or a parent or guardian acting on behalf of that child, may bring a civil action to recover all of the following:(1) Actual damages.(2) Punitive damages.(3) Reasonable attorneys fees and costs.(4) Injunctive or declaratory relief.(5) Any other relief the court deems proper.22757.27. (a) There is hereby created in the State Treasury the LEAD for Kids AI Fund into which any civil penalty recovered by the Attorney General pursuant to Section 22757.26 shall be deposited.(b) Moneys in the fund shall be available, only upon appropriation by the Legislature, for the purpose of administering this chapter.
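One substantive change the comparison surfaces is the consent rule for training on a child's personal information: the earlier Section 22757.23(e) required the parent or guardian to consent in every case, while the amended text ties the consenting party to the child's age. A minimal sketch of the amended rule follows, with hypothetical function and parameter names; note that the amended text names no consenting party for children 16 or 17 years of age, so the sketch refuses that case rather than guessing.

    # Hedged reading of amended Section 22757.23(e); the function name
    # and parameters are hypothetical, not from the bill.
    def may_train_on_child_data(age: int, child_consented: bool,
                                guardian_consented: bool) -> bool:
        # Affirmative written consent must cover the specific training purpose.
        if age < 13:
            # Under 13: the child's parent or guardian must consent.
            return guardian_consented
        if age < 16:
            # At least 13 and less than 16: the child's own consent suffices.
            return child_consented
        # 16 and 17: the amended text names no consenting party for this
        # band, so this sketch declines to authorize rather than guess.
        return False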
5757
5858 CHAPTER 25.1. Leading Ethical AI Development (LEAD) for Kids
5959
6060 CHAPTER 25.1. Leading Ethical AI Development (LEAD) for Kids
6161
6262 22757.20. This chapter shall be known as the Leading Ethical AI Development (LEAD) for Kids Act.
6363
6464
6565
6666 22757.20. This chapter shall be known as the Leading Ethical AI Development (LEAD) for Kids Act.
6767
68-22757.21. For purposes of this chapter:(a) Adverse impacts are impact means a significant negative impacts impact to a childs health, safety, privacy, educational opportunities or outcomes, or access to essential services or benefits.(b) Artificial intelligence means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.(c) Biometric information has the meaning defined in Section 1798.140 of the Civil Code.(c)(d) Board means the LEAD for Kids Standards Board created pursuant to this chapter.(d)(e) Child means a natural person under 18 years of age who resides in this state.(f) Companion chatbot means a generative artificial intelligence system with a natural language interface that provides adaptive, human-like responses to user inputs and is intended to, or foreseeably will, be used to meet a users social needs, exhibits anthropomorphic features, and is able to sustain a relationship with the user across multiple interactions.(g) Consent means affirmative, written agreement to a specific purpose that is disclosed in clear and conspicuous terms to the parent or guardian.(e)(h) Covered product means an artificial intelligence system that is intended to, or highly likely to, be used by children, pursuant to regulations adopted by the board. any of the following:(1) Used by children.(2) Used to process a childs personal information.(3) Applied directly to a child.(f)(i) Deployer means a person, partnership, state or local governmental agency, corporation, or developer, or any contractor or agent of those entities, developer that uses a covered product for a commercial or public purpose.(g)(j) Developer means a person, partnership, state or local governmental agency, corporation, or deployer that designs, codes, substantially modifies, or otherwise produces a covered product.(k) Generative artificial intelligence means artificial intelligence that can generate derived synthetic content, including text, images, video, and audio, that emulates the structure and characteristics of the artificial intelligences training data.(h)(l) Incident means a discrete occurrence of an adverse impact to a child caused by a covered product.(i)(m) Personal information has the meaning defined in Section 1798.140 of the Civil Code.(j)Prohibited covered product means a product that poses a prohibited risk pursuant to regulations adopted by the board.(k)(n) Risk means the composite measure of an events likelihood of occurring and the magnitude or degree of the consequences any adverse impact of the corresponding event.(l)(o) Risk level assessment means a structured evaluation of an artificial intelligences a covered products known or reasonably foreseeable risks to children.(p) Social score means an evaluation or classification of a child or group of children based on social behavior or personal characteristics for a purpose that is likely to result in an adverse impact to the child or children and is either of the following:(1) Unrelated to the context in which the information relating to the social behavior or personal characteristics was gathered.(2) Disproportionate or unjustified relative to the social behavior.(m)(q) Substantially modifies modify means to create a new version, release, update, or other modification to a covered product that materially changes its uses or outputs.(n)(r) System information label means a 
consumer-facing label that includes information about a covered products purpose, functioning, data sources, and risk level.(o)(s) Trade secrets has the meaning defined in Section 3426.1 of the Civil Code.
68+22757.21. For purposes of this chapter:(a) Adverse impacts are significant negative impacts to a childs health, safety, privacy, educational opportunities or outcomes, or access to essential services or benefits.(b) Artificial intelligence means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.(c) Board means the LEAD for Kids Standards Board created pursuant to this chapter.(d) Child means a natural person under 18 years of age who resides in this state.(e) Covered product means an artificial intelligence system that is intended to, or highly likely to, be used by children, pursuant to regulations adopted by the board.(f) Deployer means a person, partnership, state or local governmental agency, corporation, or developer, or any contractor or agent of those entities, that uses a covered product for a commercial or public purpose.(g) Developer means a person, partnership, state or local governmental agency, corporation, or deployer that designs, codes, substantially modifies, or otherwise produces a covered product.(h) Incident means a discrete occurrence of an adverse impact to a child caused by a covered product.(i) Personal information has the meaning defined in Section 1798.140 of the Civil Code.(j) Prohibited covered product means a product that poses a prohibited risk pursuant to regulations adopted by the board.(k) Risk means the composite measure of an events likelihood of occurring and the magnitude or degree of the consequences of the corresponding event.(l) Risk level assessment means a structured evaluation of an artificial intelligences known or reasonably foreseeable risks to children.(m) Substantially modifies means to create a new version, release, update, or other modification to a covered product that materially changes its uses or outputs.(n) System information label means a consumer-facing label that includes information about a covered products purpose, functioning, data sources, and risk level.(o) Trade secrets has the meaning defined in Section 3426.1 of the Civil Code.
6969
7070
7171
7272 22757.21. For purposes of this chapter:
7373
74-(a) Adverse impacts are impact means a significant negative impacts impact to a childs health, safety, privacy, educational opportunities or outcomes, or access to essential services or benefits.
74+(a) Adverse impacts are significant negative impacts to a childs health, safety, privacy, educational opportunities or outcomes, or access to essential services or benefits.
7575
7676 (b) Artificial intelligence means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.
7777
78-(c) Biometric information has the meaning defined in Section 1798.140 of the Civil Code.
78+(c) Board means the LEAD for Kids Standards Board created pursuant to this chapter.
7979
80-(c)
80+(d) Child means a natural person under 18 years of age who resides in this state.
8181
82+(e) Covered product means an artificial intelligence system that is intended to, or highly likely to, be used by children, pursuant to regulations adopted by the board.
8283
84+(f) Deployer means a person, partnership, state or local governmental agency, corporation, or developer, or any contractor or agent of those entities, that uses a covered product for a commercial or public purpose.
8385
84-(d) Board means the LEAD for Kids Standards Board created pursuant to this chapter.
86+(g) Developer means a person, partnership, state or local governmental agency, corporation, or deployer that designs, codes, substantially modifies, or otherwise produces a covered product.
8587
86-(d)
88+(h) Incident means a discrete occurrence of an adverse impact to a child caused by a covered product.
8789
88-
89-
90-(e) Child means a natural person under 18 years of age who resides in this state.
91-
92-(f) Companion chatbot means a generative artificial intelligence system with a natural language interface that provides adaptive, human-like responses to user inputs and is intended to, or foreseeably will, be used to meet a users social needs, exhibits anthropomorphic features, and is able to sustain a relationship with the user across multiple interactions.
93-
94-(g) Consent means affirmative, written agreement to a specific purpose that is disclosed in clear and conspicuous terms to the parent or guardian.
95-
96-(e)
97-
98-
99-
100-(h) Covered product means an artificial intelligence system that is intended to, or highly likely to, be used by children, pursuant to regulations adopted by the board. any of the following:
101-
102-(1) Used by children.
103-
104-(2) Used to process a childs personal information.
105-
106-(3) Applied directly to a child.
107-
108-(f)
109-
110-
111-
112-(i) Deployer means a person, partnership, state or local governmental agency, corporation, or developer, or any contractor or agent of those entities, developer that uses a covered product for a commercial or public purpose.
113-
114-(g)
115-
116-
117-
118-(j) Developer means a person, partnership, state or local governmental agency, corporation, or deployer that designs, codes, substantially modifies, or otherwise produces a covered product.
119-
120-(k) Generative artificial intelligence means artificial intelligence that can generate derived synthetic content, including text, images, video, and audio, that emulates the structure and characteristics of the artificial intelligences training data.
121-
122-(h)
123-
124-
125-
126-(l) Incident means a discrete occurrence of an adverse impact to a child caused by a covered product.
127-
128-(i)
129-
130-
131-
132-(m) Personal information has the meaning defined in Section 1798.140 of the Civil Code.
90+(i) Personal information has the meaning defined in Section 1798.140 of the Civil Code.
13391
13492 (j) Prohibited covered product means a product that poses a prohibited risk pursuant to regulations adopted by the board.
13593
94+(k) Risk means the composite measure of an events likelihood of occurring and the magnitude or degree of the consequences of the corresponding event.
13695
96+(l) Risk level assessment means a structured evaluation of an artificial intelligences known or reasonably foreseeable risks to children.
13797
138-(k)
98+(m) Substantially modifies means to create a new version, release, update, or other modification to a covered product that materially changes its uses or outputs.
13999
100+(n) System information label means a consumer-facing label that includes information about a covered products purpose, functioning, data sources, and risk level.
140101
102+(o) Trade secrets has the meaning defined in Section 3426.1 of the Civil Code.
141103
142-(n) Risk means the composite measure of an events likelihood of occurring and the magnitude or degree of the consequences any adverse impact of the corresponding event.
143-
144-(l)
145-
146-
147-
148-(o) Risk level assessment means a structured evaluation of an artificial intelligences a covered products known or reasonably foreseeable risks to children.
149-
150-(p) Social score means an evaluation or classification of a child or group of children based on social behavior or personal characteristics for a purpose that is likely to result in an adverse impact to the child or children and is either of the following:
151-
152-(1) Unrelated to the context in which the information relating to the social behavior or personal characteristics was gathered.
153-
154-(2) Disproportionate or unjustified relative to the social behavior.
155-
156-(m)
157-
158-
159-
160-(q) Substantially modifies modify means to create a new version, release, update, or other modification to a covered product that materially changes its uses or outputs.
161-
162-(n)
163-
164-
165-
166-(r) System information label means a consumer-facing label that includes information about a covered products purpose, functioning, data sources, and risk level.
167-
168-(o)
169-
170-
171-
172-(s) Trade secrets has the meaning defined in Section 3426.1 of the Civil Code.
173-
174-22757.22. (a) (1) There is hereby established the LEAD for Kids Standards Board in the Government Operations Agency. The Governor shall appoint an executive officer of the board, subject to Senate confirmation, who shall hold the office at the pleasure of the Governor. The executive officer shall be the administrative head of the board and shall exercise all duties and functions necessary to ensure that the responsibilities of the board are successfully discharged.(2) The board shall be composed of the following nine members:(A) A member of academia appointed by the Governor and subject to Senate confirmation.(B) A technologist An artificial intelligence developer or representative of a company that develops artificial intelligence systems appointed by the Governor and subject to Senate confirmation.(C) A member of civil society appointed by the Governor and subject to Senate confirmation.(D) An expert in technology ethics appointed by the Governor and subject to Senate confirmation.(E) An expert in education appointed by the Governor and subject to Senate confirmation.(F) A member of academia with expertise in artificial intelligence appointed by the Speaker of the Assembly.(G) A member of academia with expertise in social science appointed by the Speaker of the Assembly.(H) Two members appointed by the Senate Committee on Rules.(3) A member of the board shall meet all of the following criteria:(A) (i) A member shall be free of direct and indirect external influence and shall not seek or take instructions from another.(ii) A members employment by a company that develops artificial intelligence does not by itself constitute a violation of this subparagraph.(B) A member shall not take an action or engage in an occupation, whether gainful or not, that is incompatible with the members duties.(C) A member shall not, either at the time of the members appointment or during the members term, have a financial interest in an entity that is subject to regulation by the board.(4) A member of the board shall serve at the pleasure of the appointing authority but shall serve for no longer than eight consecutive years.(b) (1) The board shall ensure that regulations adopted pursuant to this chapter are consistent with widely accepted standards for governance of artificial intelligence, taking into account technological standards, technological advances, scientific literature and advances, and societal changes as they pertain to risks posed to children by covered products.(2) The board shall consult with individuals from the public and state agencies who possess expertise directly related to the boards functions, including technical, ethical, regulatory, and other relevant issue areas.(c) On or before January 1, 2027, 2028, the board shall adopt regulations governing all of the following:(1) Criteria for developers to determine if an artificial intelligence system is a covered product subject to this chapter.(2) Criteria for determining the level of estimated risk of a covered product based on an analysis that weighs the likelihood and severity of reasonably foreseeable adverse impacts against the anticipated benefits of the covered product and denominating the risk levels pursuant to all of the following:(A) Prohibited risk, which shall be applied to a covered product for which the costs of foreseeable adverse impacts likely outweigh the benefits and includes, but is not limited to, all of the following:(i)Anthropomorphic chatbots that offer companionship and are likely to cause the child to develop 
an ongoing emotional attachment or to manipulate the childs behavior in harmful ways.(ii)Artificial intelligence used in educational settings that collects or processes biometric data of children.(iii)Social scoring systems based on a childs behavior or personal characteristics.(iv)Artificial intelligence that detects the emotions of children.(v)Artificial intelligence used to develop facial recognition databases through untargeted scraping of childrens facial images from the internet or surveillance footage.(i) A companion chatbot that can foreseeably do any of the following:(I) Attempt to provide mental health therapy to the child.(II) Cause the child to develop a harmful ongoing emotional attachment to the companion chatbot.(III) Manipulate the child to engage in harmful behavior.(ii) A covered product used to do any of the following:(I) Collect or process a childs biometric information for any purpose other than confirming a childs identity, with the consent of the childs parent or guardian, in order to grant access to a service, unlock a device, or provide physical access to an educational institution.(II) Generate a social score.(III) (ia) Assess the emotional state of a child. (ib) This subclause does not apply to an assessment of the emotional state of a child in a medical setting with the consent of the childs parent or guardian or that is needed to provide emergency care if the childs parent or guardian is unavailable.(IV) Scrape an image that the developer or deployer knows, or reasonably should know, is a childs face from the internet or from surveillance footage without the consent of the childs parent or guardian.(B) High risk, which shall be applied to a covered product for which the benefits may outweigh the costs of foreseeable adverse impacts and includes, but is not limited to, using artificial intelligence to do a covered product that does any of the following:(i) Perform Performs a function related to pupil assessment or discipline. 
discipline, including, but not limited to, a covered product that determines access or admission, assigns children to educational institutions or programs, evaluates learning outcomes of children, assesses the appropriate level of education for a child, materially influences the level of education a child will receive or be able to access, or monitors and detects prohibited behavior of students during tests.(ii) Target advertisements to children.(iii) For a specific purpose that would otherwise qualify as a prohibited risk, as set forth in regulations adopted by the board, provided that if the use is strictly necessary to prevent threats to ensure a childs mental or physical health or safety.(C) Moderate risk, which shall be applied to a covered product for which the benefits convincingly reasonably outweigh the costs of foreseeable adverse impacts.(D) Low risk, which shall be applied to a covered product for which there are few, if any, foreseeable adverse impacts.(3) Guidance for developers to classify covered products according to risk level, as described in paragraph (2).(4) Reasonable steps a developer of a prohibited risk covered product is required to take to ensure that children are not able to access the product.(5) Requirements for predeployment and postdeployment assessments, including, but not limited to, an assessment of the purpose for which the covered product is intended, technical capabilities, limitations and functionality, specific adverse impacts, internal governance, and the timing for the development and submission to the board of those evaluations and assessments. The board shall also provide guidance consistent with Section 22757.28 to avoid duplication of efforts with respect to any other state or federal laws that require similar documentation.(6) Requirements for artificial intelligence information labels to ensure that, for each covered product, the public is able to access baseline information on the covered product, including the covered products purpose, a description of how it works, its risk level, potential adverse impacts, and any other information necessary to assess the impact of the system on children.(7) Standards for audits of covered products, including the timing of audits, qualifications and training of auditors, rules governing auditor independence and oversight, and audit reports that auditors are required to provide to the board. The board shall also establish rules for the protection of trade secrets in connection with the performance of audits.(8) The creation of an incident reporting mechanism that enables third parties to report potential incidents of adverse impacts resulting from the use of a covered product directly to a developer or the board.(9) The creation of a publicly accessible registry for covered products that contains high-level summaries of audit reports, incident reports, system information labels, and any additional information specified by the board.(10) (A) Registration fees for developers that do not exceed the reasonable regulatory costs incident to administering this chapter.(B) A registration fee described by this paragraph shall be deposited into the LEAD for Kids AI Fund established pursuant to Section 22757.27.
104+22757.22. (a) (1) There is hereby established the LEAD for Kids Standards Board in the Government Operations Agency. The Governor shall appoint an executive officer of the board, subject to Senate confirmation, who shall hold the office at the pleasure of the Governor. The executive officer shall be the administrative head of the board and shall exercise all duties and functions necessary to ensure that the responsibilities of the board are successfully discharged.(2) The board shall be composed of the following nine members:(A) A member of academia appointed by the Governor and subject to Senate confirmation.(B) A technologist appointed by the Governor and subject to Senate confirmation.(C) A member of civil society appointed by the Governor and subject to Senate confirmation.(D) An expert in technology ethics appointed by the Governor and subject to Senate confirmation.(E) An expert in education appointed by the Governor and subject to Senate confirmation.(F) A member of academia with expertise in artificial intelligence appointed by the Speaker of the Assembly.(G) A member of academia with expertise in social science appointed by the Speaker of the Assembly.(H) Two members appointed by the Senate Committee on Rules.(3) A member of the board shall meet all of the following criteria:(A) A member shall be free of direct and indirect external influence and shall not seek or take instructions from another.(B) A member shall not take an action or engage in an occupation, whether gainful or not, that is incompatible with the members duties.(C) A member shall not, either at the time of the members appointment or during the members term, have a financial interest in an entity that is subject to regulation by the board.(4) A member of the board shall serve at the pleasure of the appointing authority but shall serve for no longer than eight consecutive years.(b) (1) The board shall ensure that regulations adopted pursuant to this chapter are consistent with widely accepted standards for governance of artificial intelligence, taking into account technological standards, technological advances, scientific literature and advances, and societal changes as they pertain to risks posed to children by covered products.(2) The board shall consult with individuals from the public who possess expertise directly related to the boards functions, including technical, ethical, regulatory, and other relevant areas.(c) On or before January 1, 2027, the board shall adopt regulations governing all of the following:(1) Criteria for developers to determine if an artificial intelligence system is subject to this chapter.(2) Criteria for determining the level of estimated risk of a covered product based on an analysis that weighs the likelihood and severity of reasonably foreseeable adverse impacts against the anticipated benefits of the covered product and denominating the risk levels pursuant to all of the following:(A) Prohibited risk, which shall be applied to a covered product for which the costs of foreseeable adverse impacts likely outweigh the benefits and includes, but is not limited to, all of the following:(i) Anthropomorphic chatbots that offer companionship and are likely to cause the child to develop an ongoing emotional attachment or to manipulate the childs behavior in harmful ways.(ii) Artificial intelligence used in educational settings that collects or processes biometric data of children.(iii) Social scoring systems based on a childs behavior or personal characteristics.(iv) Artificial 
intelligence that detects the emotions of children.(v) Artificial intelligence used to develop facial recognition databases through untargeted scraping of childrens facial images from the internet or surveillance footage.(B) High risk, which shall be applied to a covered product for which the benefits may outweigh the costs of foreseeable adverse impacts and includes, but is not limited to, using artificial intelligence to do any of the following:(i) Perform a function related to pupil assessment or discipline.(ii) Target advertisements to children.(iii) For a specific purpose that would otherwise qualify as a prohibited risk, as set forth in regulations adopted by the board, provided that the use is strictly necessary to prevent threats to health or safety.(C) Moderate risk, which shall be applied to a covered product for which the benefits convincingly outweigh the costs of foreseeable adverse impacts.(D) Low risk, which shall be applied to a covered product for which there are few, if any, foreseeable adverse impacts.(3) Guidance for developers to classify covered products according to risk level, as described in paragraph (2).(4) Reasonable steps a developer of a prohibited risk covered product is required to take to ensure that children are not able to access the product.(5) Requirements for predeployment and postdeployment assessments, including, but not limited to, the purpose for which the covered product is intended, technical capabilities, limitations and functionality, specific adverse impacts, internal governance, and the timing for the development and submission to the board of those evaluations and assessments. The board shall also provide guidance to avoid duplication of efforts with respect to any other state or federal laws that require similar documentation.(6) Requirements for artificial intelligence information labels to ensure that, for each covered product, the public is able to access baseline information on the covered product, including the covered products purpose, a description of how it works, its risk level, potential adverse impacts, and any other information necessary to assess the impact of the system on children.(7) Standards for audits of covered products, including the timing of audits, qualifications and training of auditors, rules governing auditor independence and oversight, and audit reports that auditors are required to provide to the board. The board shall also establish rules for the protection of trade secrets in connection with the performance of audits.(8) The creation of an incident reporting mechanism that enables third parties to report incidents of adverse impacts resulting from the use of a covered product directly to a developer or the board.(9) The creation of a publicly accessible registry for covered products that contains high-level summaries of audit reports, incident reports, system information labels, and any additional information specified by the board.(10) (A) Registration fees for developers that do not exceed the reasonable regulatory costs incident to administering this chapter.(B) A registration fee described by this paragraph shall be deposited into the LEAD for Kids AI Fund established pursuant to Section 22757.27.
175105
176106
177107
178108 22757.22. (a) (1) There is hereby established the LEAD for Kids Standards Board in the Government Operations Agency. The Governor shall appoint an executive officer of the board, subject to Senate confirmation, who shall hold the office at the pleasure of the Governor. The executive officer shall be the administrative head of the board and shall exercise all duties and functions necessary to ensure that the responsibilities of the board are successfully discharged.
179109
180110 (2) The board shall be composed of the following nine members:
181111
182112 (A) A member of academia appointed by the Governor and subject to Senate confirmation.
183113
184-(B) A technologist An artificial intelligence developer or representative of a company that develops artificial intelligence systems appointed by the Governor and subject to Senate confirmation.
114+(B) A technologist appointed by the Governor and subject to Senate confirmation.
185115
186116 (C) A member of civil society appointed by the Governor and subject to Senate confirmation.
187117
188118 (D) An expert in technology ethics appointed by the Governor and subject to Senate confirmation.
189119
190120 (E) An expert in education appointed by the Governor and subject to Senate confirmation.
191121
192122 (F) A member of academia with expertise in artificial intelligence appointed by the Speaker of the Assembly.
193123
194124 (G) A member of academia with expertise in social science appointed by the Speaker of the Assembly.
195125
196126 (H) Two members appointed by the Senate Committee on Rules.
197127
198128 (3) A member of the board shall meet all of the following criteria:
199129
200-(A) (i) A member shall be free of direct and indirect external influence and shall not seek or take instructions from another.
201-
202-(ii) A members employment by a company that develops artificial intelligence does not by itself constitute a violation of this subparagraph.
130+(A) A member shall be free of direct and indirect external influence and shall not seek or take instructions from another.
203131
204132 (B) A member shall not take an action or engage in an occupation, whether gainful or not, that is incompatible with the members duties.
205133
206134 (C) A member shall not, either at the time of the members appointment or during the members term, have a financial interest in an entity that is subject to regulation by the board.
207135
208136 (4) A member of the board shall serve at the pleasure of the appointing authority but shall serve for no longer than eight consecutive years.
209137
210138 (b) (1) The board shall ensure that regulations adopted pursuant to this chapter are consistent with widely accepted standards for governance of artificial intelligence, taking into account technological standards, technological advances, scientific literature and advances, and societal changes as they pertain to risks posed to children by covered products.
211139
212-(2) The board shall consult with individuals from the public and state agencies who possess expertise directly related to the boards functions, including technical, ethical, regulatory, and other relevant issue areas.
140+(2) The board shall consult with individuals from the public who possess expertise directly related to the boards functions, including technical, ethical, regulatory, and other relevant areas.
213141
214-(c) On or before January 1, 2027, 2028, the board shall adopt regulations governing all of the following:
142+(c) On or before January 1, 2027, the board shall adopt regulations governing all of the following:
215143
216-(1) Criteria for developers to determine if an artificial intelligence system is a covered product subject to this chapter.
144+(1) Criteria for developers to determine if an artificial intelligence system is subject to this chapter.
217145
218146 (2) Criteria for determining the level of estimated risk of a covered product based on an analysis that weighs the likelihood and severity of reasonably foreseeable adverse impacts against the anticipated benefits of the covered product and denominating the risk levels pursuant to all of the following:
219147
220148 (A) Prohibited risk, which shall be applied to a covered product for which the costs of foreseeable adverse impacts likely outweigh the benefits and includes, but is not limited to, all of the following:
221149
222150 (i) Anthropomorphic chatbots that offer companionship and are likely to cause the child to develop an ongoing emotional attachment or to manipulate the childs behavior in harmful ways.
223151
224-
225-
226152 (ii) Artificial intelligence used in educational settings that collects or processes biometric data of children.
227-
228-
229153
230154 (iii) Social scoring systems based on a childs behavior or personal characteristics.
231155
232-
233-
234156 (iv) Artificial intelligence that detects the emotions of children.
235-
236-
237157
238158 (v) Artificial intelligence used to develop facial recognition databases through untargeted scraping of childrens facial images from the internet or surveillance footage.
239159
160+(B) High risk, which shall be applied to a covered product for which the benefits may outweigh the costs of foreseeable adverse impacts and includes, but is not limited to, using artificial intelligence to do any of the following:
240161
241-
242-(i) A companion chatbot that can foreseeably do any of the following:
243-
244-(I) Attempt to provide mental health therapy to the child.
245-
246-(II) Cause the child to develop a harmful ongoing emotional attachment to the companion chatbot.
247-
248-(III) Manipulate the child to engage in harmful behavior.
249-
250-(ii) A covered product used to do any of the following:
251-
252-(I) Collect or process a childs biometric information for any purpose other than confirming a childs identity, with the consent of the childs parent or guardian, in order to grant access to a service, unlock a device, or provide physical access to an educational institution.
253-
254-(II) Generate a social score.
255-
256-(III) (ia) Assess the emotional state of a child.
257-
258- (ib) This subclause does not apply to an assessment of the emotional state of a child in a medical setting with the consent of the childs parent or guardian or that is needed to provide emergency care if the childs parent or guardian is unavailable.
259-
260-(IV) Scrape an image that the developer or deployer knows, or reasonably should know, is a childs face from the internet or from surveillance footage without the consent of the childs parent or guardian.
261-
262-(B) High risk, which shall be applied to a covered product for which the benefits may outweigh the costs of foreseeable adverse impacts and includes, but is not limited to, using artificial intelligence to do a covered product that does any of the following:
263-
264-(i) Perform Performs a function related to pupil assessment or discipline. discipline, including, but not limited to, a covered product that determines access or admission, assigns children to educational institutions or programs, evaluates learning outcomes of children, assesses the appropriate level of education for a child, materially influences the level of education a child will receive or be able to access, or monitors and detects prohibited behavior of students during tests.
162+(i) Perform a function related to pupil assessment or discipline.
265163
266164 (ii) Target advertisements to children.
267165
268-(iii) For a specific purpose that would otherwise qualify as a prohibited risk, as set forth in regulations adopted by the board, provided that if the use is strictly necessary to prevent threats to ensure a childs mental or physical health or safety.
166+(iii) For a specific purpose that would otherwise qualify as a prohibited risk, as set forth in regulations adopted by the board, provided that the use is strictly necessary to prevent threats to health or safety.
269167
270-(C) Moderate risk, which shall be applied to a covered product for which the benefits convincingly reasonably outweigh the costs of foreseeable adverse impacts.
168+(C) Moderate risk, which shall be applied to a covered product for which the benefits convincingly outweigh the costs of foreseeable adverse impacts.
271169
272170 (D) Low risk, which shall be applied to a covered product for which there are few, if any, foreseeable adverse impacts.
273171
274172 (3) Guidance for developers to classify covered products according to risk level, as described in paragraph (2).
275173
276174 (4) Reasonable steps a developer of a prohibited risk covered product is required to take to ensure that children are not able to access the product.
277175
278-(5) Requirements for predeployment and postdeployment assessments, including, but not limited to, an assessment of the purpose for which the covered product is intended, technical capabilities, limitations and functionality, specific adverse impacts, internal governance, and the timing for the development and submission to the board of those evaluations and assessments. The board shall also provide guidance consistent with Section 22757.28 to avoid duplication of efforts with respect to any other state or federal laws that require similar documentation.
176+(5) Requirements for predeployment and postdeployment assessments, including, but not limited to, the purpose for which the covered product is intended, technical capabilities, limitations and functionality, specific adverse impacts, internal governance, and the timing for the development and submission to the board of those evaluations and assessments. The board shall also provide guidance to avoid duplication of efforts with respect to any other state or federal laws that require similar documentation.
279177
280178 (6) Requirements for artificial intelligence information labels to ensure that, for each covered product, the public is able to access baseline information on the covered product, including the covered products purpose, a description of how it works, its risk level, potential adverse impacts, and any other information necessary to assess the impact of the system on children.
281179
282180 (7) Standards for audits of covered products, including the timing of audits, qualifications and training of auditors, rules governing auditor independence and oversight, and audit reports that auditors are required to provide to the board. The board shall also establish rules for the protection of trade secrets in connection with the performance of audits.
283181
284-(8) The creation of an incident reporting mechanism that enables third parties to report potential incidents of adverse impacts resulting from the use of a covered product directly to a developer or the board.
182+(8) The creation of an incident reporting mechanism that enables third parties to report incidents of adverse impacts resulting from the use of a covered product directly to a developer or the board.
285183
286184 (9) The creation of a publicly accessible registry for covered products that contains high-level summaries of audit reports, incident reports, system information labels, and any additional information specified by the board.
287185
288186 (10) (A) Registration fees for developers that do not exceed the reasonable regulatory costs incident to administering this chapter.
289187
290188 (B) A registration fee described by this paragraph shall be deposited into the LEAD for Kids AI Fund established pursuant to Section 22757.27.
291189
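Paragraphs (2) and (3) above reduce risk classification to a weighing rule over four denominated tiers. The following sketch is purely illustrative and not part of the bill text: the enumeration tracks the statutory tiers, but the scoring inputs, thresholds, and function names are hypothetical stand-ins for criteria the board would supply by regulation.

    from enum import Enum

    class RiskLevel(Enum):
        PROHIBITED = "prohibited"  # uses the board designates as prohibited outright
        HIGH = "high"              # benefits do not clearly outweigh foreseeable adverse impacts
        MODERATE = "moderate"      # benefits convincingly outweigh foreseeable adverse impacts
        LOW = "low"                # few, if any, foreseeable adverse impacts

    def classify(benefit_score: float, adverse_impact_score: float, prohibited_use: bool) -> RiskLevel:
        """Hypothetical weighing of the likelihood and severity of adverse
        impacts against anticipated benefits; the numeric thresholds are invented."""
        if prohibited_use:
            return RiskLevel.PROHIBITED
        if adverse_impact_score < 0.1:
            return RiskLevel.LOW
        if benefit_score > 2 * adverse_impact_score:  # reads "convincingly outweigh" as a 2x margin
            return RiskLevel.MODERATE
        return RiskLevel.HIGH
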
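Similarly, the artificial intelligence information labels required by paragraph (6) describe a small structured record. A minimal sketch, assuming a JSON-style representation; the field names and example values are invented, since the statute fixes only the categories of information:

    import json

    label = {
        "product_name": "ExampleTutor",  # hypothetical covered product
        "purpose": "Homework help for middle-school students",
        "how_it_works": "Language model fine-tuned on curriculum materials",
        "risk_level": "moderate",  # one of the board's denominated tiers
        "potential_adverse_impacts": ["over-reliance", "inaccurate answers"],
    }
    print(json.dumps(label, indent=2))
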
296-22757.23. (a) On or before July 1, 2027, 2028, a developer shall do all of the following with respect to a covered product:
194+22757.23. (a) On or before July 1, 2027, a developer shall do all of the following with respect to a covered product:
297195
298196 (1) Register the covered product using the registry developed by the board.
299197
300-(2) Prepare and submit to the board a any risk level assessment required by regulation in order to determine the appropriate risk classification of the covered product.
198+(2) Prepare and submit to the board a risk level assessment in order to determine the appropriate risk classification of the covered product.
301199
302-(3) Develop and implement an artificial intelligence system information label for the covered product. product, as required by regulation.
200+(3) Develop and implement an artificial intelligence information label for the covered product.
303201
304202 (b) In addition to the duties required under subdivision (a), all of the following apply:
305203
306-(1) With respect to a covered product that poses a prohibited risk, the developer shall take reasonable steps to ensure that prevent children are not able to access from accessing the product.
204+(1) With respect to a covered product that poses a prohibited risk, the developer shall take reasonable steps to ensure that children are not able to access the product.
307205
308206 (2) With respect to a high-risk covered product, the developer shall conduct predeployment and postdeployment assessments pursuant to the requirements established by the board.
309207
310208 (c) With respect to incident reports, a developer shall do all of the following:
311209
312210 (1) Within 30 days of learning of an incident, file a report with the board with any required information.
313211
314212 (2) Within 30 days of the substantiation of the incident by the board, file a description of the incident on the developer's internet website.
315213
316214 (d) With respect to licensing the covered product to deployers, a developer shall do both of the following:
317215
318-(1) Ensure that the terms of the license require it to be used in a manner that would not change elevate the covered product's risk level to a higher level of risk.
216+(1) Ensure that the terms of the license require it to be used in a manner that would not change the covered products risk level to a higher level of risk.
319217
320218 (2) Revoke the license if the developer knows, or should know, that the deployer is using the covered product in a manner that is inconsistent with the terms required under paragraph (1).
321219
322-(e) A developer shall not knowingly or recklessly use the personal information of a child to train a covered product with the personal information of a child unless the child's parent or guardian unless the child, if the child is at least 13 years of age and less than 16 years of age, or the child's parent or guardian, if the child is less than 13 years of age, has affirmatively provided written consent to the developer to use the child's personal information for that specific purpose.
220+(e) A developer shall not train a covered product with the personal information of a child unless the child's parent or guardian has affirmatively provided written consent to the developer to use the child's personal information for that specific purpose.
323221
324-(f) (1) On or after July 1, 2027, 2028, a developer shall submit a covered product it develops to an independent third party audit on a schedule determined by the board according to the risk level posed by the covered product.
222+(f) (1) On or after July 1, 2027, a developer shall submit a covered product it develops to an independent third party audit on a schedule determined by the board according to the risk level posed by the covered product.
325223
326224 (2) A developer whose covered product is subject to an audit shall provide the auditor with all necessary documentation and information for the auditor to perform the audit.
327225
328226 (3) If an auditor discovers substantial noncompliance with this chapter, the auditor shall promptly notify the board.
329227
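Subdivision (e) of Section 22757.23 is one of the few places where the two versions encode different logic: one requires a parent or guardian's consent in every case, while the amended clause lets a child who is at least 13 and under 16 years of age consent for themselves. A hedged sketch of the amended rule; the function and parameter names are invented, and the quoted clause is silent on ages 16 and over:

    def may_train_on_childs_data(age: int, child_consented: bool, parent_consented: bool) -> bool:
        """Illustrative reading of amended subdivision (e); the booleans stand in
        for affirmative, written, purpose-specific consent."""
        if 13 <= age < 16:
            return child_consented  # the child may consent for themselves
        if age < 13:
            return parent_consented  # a parent or guardian must consent
        # The quoted clause does not address ages 16 and over.
        raise ValueError("consent rule for this age is not specified in the quoted text")
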
334-22757.24. (a) A deployer of a prohibited risk covered product shall implement any applicable procedures adopted by the developer to ensure that prevent a child is not able to access from accessing the product.
232+22757.24. (a) A deployer of a prohibited risk covered product shall implement any applicable procedures adopted by the developer to ensure that a child is not able to access the product.
335233
336234 (b) A deployer of a covered product shall publicly display developer license usage requirements. A deployer's usage requirements shall not change the covered product's risk level to a higher level of risk.
337235
338236 (c) With respect to incident reports, a deployer shall do both of the following:
339237
340238 (1) Within 30 days of learning of the incident, file a report with the board with any required information.
341239
342240 (2) Within 30 days of the substantiation of the incident by the board, file a description of the incident on the deployer's internet website.
343241
344-(d) A deployer shall not opt in to enter a data sharing agreement that allows the developer to train a covered product with the personal information of a child unless the child's parent or guardian child, if the child is at least 13 years of age and less than 16 years of age, or the child's parent or guardian, if the child is less than 13 years of age, has affirmatively provided written consent to the deployer to use the child's personal information for that specific purpose.
242+(d) A deployer shall not opt in to a data sharing agreement that allows the developer to train a covered product with the personal information of a child unless the child's parent or guardian has affirmatively provided written consent to the deployer to use the child's personal information for that specific purpose.
345243
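The incident-reporting duties in subdivision (c) of Section 22757.24 mirror those imposed on developers by Section 22757.23(c): two 30-day clocks, one running from learning of the incident and one from the board's substantiation. A small illustrative sketch of the deadline arithmetic, with invented helper names:

    from datetime import date, timedelta

    REPORTING_WINDOW = timedelta(days=30)

    def board_report_deadline(learned_of_incident: date) -> date:
        # File a report with the board within 30 days of learning of the incident.
        return learned_of_incident + REPORTING_WINDOW

    def website_posting_deadline(substantiated_by_board: date) -> date:
        # Post a description on the website within 30 days of substantiation.
        return substantiated_by_board + REPORTING_WINDOW

    # Example: an incident learned of on March 1 must be reported by March 31.
    assert board_report_deadline(date(2028, 3, 1)) == date(2028, 3, 31)
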
350248 22757.25. A developer or deployer, or any contractor or subcontractor of a developer or deployer, shall not do any of the following:
351249
352250 (a) Prevent an employee from disclosing information to the Attorney General pertaining to a reasonable belief supporting the existence of a potential violation of this chapter.
353251
354252 (b) Retaliate against an employee for disclosing information under subdivision (a).
355253
356254 (c) Make false or materially misleading statements related to its compliance with obligations imposed under this chapter.
357255
362260 22757.26. (a) The board may refer violations of this chapter to the Attorney General.
363261
364262 (b) With respect to violations related to the risk level classification of a covered product, the board may allow the developer to take corrective action if the board determines that the circumstances indicate that the erroneous classification was neither unreasonable nor in bad faith. If the developer fails to do so within 30 days, the board may refer the matter to the Attorney General.
365263
366264 (c) Upon receiving a referral from the board, the Attorney General may bring an action for all of the following:
367265
368266 (1) A civil penalty of twenty-five thousand dollars ($25,000) for each violation.
369267
370268 (2) Injunctive or declaratory relief.
371269
372270 (3) Reasonable attorney's fees.
373271
374-(d) A child who suffers actual harm as a result of a violation of this chapter, the use of a covered product, or a parent or guardian acting on behalf of that child, may bring a civil action to recover all of the following:
272+(d) A child who suffers actual harm as a result of a violation of this chapter, or a parent or guardian acting on behalf of that child, may bring a civil action to recover all of the following:
375273
376274 (1) Actual damages.
377275
378276 (2) Punitive damages.
379277
380278 (3) Reasonable attorney's fees and costs.
381279
382280 (4) Injunctive or declaratory relief.
383281
384282 (5) Any other relief the court deems proper.
385283
390288 22757.27. (a) There is hereby created in the State Treasury the LEAD for Kids AI Fund into which any civil penalty recovered by the Attorney General pursuant to Section 22757.26 shall be deposited.
391289
392290 (b) Moneys in the fund shall be available, only upon appropriation by the Legislature, for the purpose of administering this chapter.
393-
398-22757.28. (a) A developer or deployer who is required to comply with another law of this state that requires risk assessment of a covered product that is equally or more stringent than this chapter need not comply with any duplicative requirements under this chapter.
399-
400-(b) Before January 1, 2028, the board shall publish a description of the laws described by subdivision (a) and provide guidance to developers and deployers regarding compliance with subdivision (a).
401-
402-(c) A developer or deployer that relies on the guidance provided under subdivision (b) is presumed to be compliant with subdivision (a).