California 2025-2026 Regular Session

California Senate Bill SB243 Compare Versions

Old / New / Differences
1-Amended IN Senate April 21, 2025 Amended IN Senate March 28, 2025 Amended IN Senate March 24, 2025 CALIFORNIA LEGISLATURE 2025–2026 REGULAR SESSION Senate Bill No. 243 Introduced by Senators Padilla and Becker (Coauthor: Senator Weber Pierson) (Coauthors: Senators Stern and Weber Pierson) January 30, 2025 An act to add Chapter 22.6 (commencing with Section 22601) to Division 8 of the Business and Professions Code, relating to artificial intelligence. LEGISLATIVE COUNSEL'S DIGEST SB 243, as amended, Padilla. Companion chatbots. Existing law requires a social media platform to take various steps to prevent cyberbullying of minors on the platform, including by requiring the platform to establish a prominent mechanism within its internet-based service that allows any individual, whether or not that individual has a profile on the internet-based service, to report cyberbullying or any content that violates the existing terms of service related to cyberbullying. Existing law authorizes the State Department of Public Health to establish the Office of Suicide Prevention in the department, as prescribed. This bill would, among other things related to making a companion chatbot platform safer for users, require an operator of a companion chatbot platform to take reasonable steps to prevent a companion chatbot, as defined, on its companion chatbot platform from providing rewards to a user at unpredictable intervals or after an inconsistent number of actions or from encouraging increased engagement, usage, or response rates.
The bill would also require an operator to prevent a companion chatbot on its companion chatbot platform from engaging with users unless the operator has implemented a protocol for addressing suicidal ideation, suicide, or self-harm expressed by a user to the companion chatbot, as specified, and publish details on that protocol on the operator's internet website. This bill would require an operator to annually report to the State Department of Health Care Services Office of Suicide Prevention certain things, including the number of times the operator has detected exhibitions of suicidal ideation by users. users, and would require the office to post data from that report on its internet website. The bill would also require the operator to submit its companion chatbot platform to regular audits by a third party to ensure compliance with the bill. This bill would authorize a person who suffers injury in fact as a result of noncompliance with the bill to bring a certain civil action. Digest Key Vote: MAJORITY Appropriation: NO Fiscal Committee: YES Local Program: NO Bill Text The people of the State of California do enact as follows: SECTION 1. Chapter 22.6 (commencing with Section 22601) is added to Division 8 of the Business and Professions Code, to read: CHAPTER 22.6. Companion Chatbots 22601.
As used in this chapter: (a) Artificial intelligence means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments. (b) (1) Companion chatbot means an artificial intelligence system with a natural language interface that provides adaptive, human-like responses to user inputs and is capable of meeting a user's social needs, including by exhibiting anthropomorphic features and being able to sustain a relationship across multiple interactions. (2) Companion chatbot does not include a bot that is used only for customer service purposes. (c) Companion chatbot platform means a platform that allows a user to engage with companion chatbots. (d) Office means the Office of Suicide Prevention established pursuant to Section 131300 of the Health and Safety Code. (d) (e) Operator means a person who makes a companion chatbot platform available to a user in the state. 22602. (a) An operator shall take reasonable steps to prevent a companion chatbot on its companion chatbot platform from providing rewards to a user at unpredictable intervals or after an inconsistent number of actions or from encouraging increased engagement, usage, or response rates.
(b) An operator shall issue a clear and conspicuous notification at the beginning of any companion chatbot interaction, and at least every three hours during ongoing companion chatbot interactions thereafter, to remind a user that the companion chatbot is artificially generated and not human. (c) (1) An operator shall prevent a companion chatbot on its companion chatbot platform from engaging with users unless the operator has implemented a protocol for addressing suicidal ideation, suicide, or self-harm expressed by a user to the companion chatbot, including, but not limited to, a notification to the user that refers the user to crisis service providers, including a suicide hotline or crisis text line. (2) The operator shall publish details on the protocol required by this subdivision on the operator's internet website. 22603. (a) An operator shall annually report to the State Department of Health Care Services office both of the following: (1) The number of times the operator has detected exhibitions of suicidal ideation by users. (2) The number of times a companion chatbot brought up suicidal ideation or actions with the user. (b) The report required by this section shall include only the information listed in subdivision (a) and shall not include any identifiers or personal information about users. (c) The office shall post data from a report required by this section on its internet website. 22604. An operator shall submit its companion chatbot platform to regular audits by a third party to ensure compliance with this chapter. 22605. An operator shall disclose to a user of its companion chatbot platform that companion chatbots may not be suitable for some minors. 22606.
A person who suffers injury in fact as a result of a violation of this chapter may bring a civil action to recover all of the following relief: (a) Injunctive relief. (b) Damages in an amount equal to the greater of actual damages or one thousand dollars ($1,000) per violation. (c) Reasonable attorney's fees and costs. 22607. The duties, remedies, and obligations imposed by this chapter are cumulative to the duties, remedies, or obligations imposed under other law and shall not be construed to relieve an operator from any duties, remedies, or obligations imposed under any other law. SEC. 2. The provisions of this act are severable. If any provision of this act or its application is held invalid, that invalidity shall not affect other provisions or applications that can be given effect without the invalid provision or application.
1+Amended IN Senate March 28, 2025 Amended IN Senate March 24, 2025 CALIFORNIA LEGISLATURE 2025–2026 REGULAR SESSION Senate Bill No. 243 Introduced by Senators Padilla and Becker (Coauthor: Senator Weber Pierson) January 30, 2025 An act to add Chapter 22.6 (commencing with Section 22601) to Division 8 of the Business and Professions Code, relating to artificial intelligence. LEGISLATIVE COUNSEL'S DIGEST SB 243, as amended, Padilla. Chatbots. Companion chatbots. Existing law requires a social media platform to take various steps to prevent cyberbullying of minors on the platform, including by requiring the platform to establish a prominent mechanism within its internet-based service that allows any individual, whether or not that individual has a profile on the internet-based service, to report cyberbullying or any content that violates the existing terms of service related to cyberbullying. This bill would, among other things related to making a companion chatbot platform safer for users, require an operator of a companion chatbot platform to take reasonable steps to prevent a companion chatbot, as defined, on its companion chatbot platform from providing rewards to a user at unpredictable intervals or after an inconsistent number of actions or from encouraging increased engagement, usage, or response rates. The bill would also require an operator to prevent a companion chatbot on its companion chatbot platform from engaging with users unless the operator has implemented a protocol for addressing suicidal ideation, suicide, or self-harm expressed by a user to the companion chatbot, as specified, and publish details on that protocol on the operator's internet website. This bill would require an operator to annually report to the State Department of Health Care Services certain things, including the number of times the operator has detected exhibitions of suicidal ideation by users.
The bill would also require the operator to submit its companion chatbot platform to regular audits by a third party to ensure compliance with the bill. This bill would authorize a person who suffers injury in fact as a result of noncompliance with the bill to bring a certain civil action to recover. action. Digest Key Vote: MAJORITY Appropriation: NO Fiscal Committee: YES Local Program: NO Bill Text The people of the State of California do enact as follows: SECTION 1. Chapter 22.6 (commencing with Section 22601) is added to Division 8 of the Business and Professions Code, to read: CHAPTER 22.6. Companion Chatbots 22601. As used in this chapter: (a) Artificial intelligence means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments. (b) (1) Chatbot means a virtual character that, enabled by artificial intelligence, is capable of engaging in open-ended dialogues with a user and seems to have a unique personality and perspectives. (b) (1) Companion chatbot means an artificial intelligence system with a natural language interface that provides adaptive, human-like responses to user inputs and is capable of meeting a user's social needs, including by exhibiting anthropomorphic features and being able to sustain a relationship across multiple interactions. (2) Chatbot Companion chatbot does not include a bot that is used only for customer service purposes. (c) Chatbot Companion chatbot platform means a chatbot platform that allows a user to engage with companion chatbots. (d) Operator means a person who makes a companion chatbot platform available to a user in the state. 22602.
(a) An operator shall take reasonable steps to prevent a companion chatbot on its companion chatbot platform from providing rewards to a user at unpredictable intervals or after an inconsistent number of actions or from encouraging increased engagement, usage, or response rates. (b) An operator shall issue a clear and conspicuous notification periodically at the beginning of any companion chatbot interaction, and at least every three hours during ongoing companion chatbot interactions thereafter, to remind a user that a the companion chatbot is artificially generated and not human. (c) (1) An operator shall prevent a companion chatbot on its companion chatbot platform from engaging with users unless the operator has implemented a protocol for addressing suicidal ideation, suicide, or self-harm expressed by a user to the companion chatbot, including, but not limited to, a notification to the user that refers the user to crisis service providers, including a suicide hotline or crisis text line. (2) The operator shall publish details on the protocol required by this subdivision on the operator's internet website. 22603. (a) An operator shall annually report to the State Department of Health Care Services both of the following: (1) The number of times the operator has detected exhibitions of suicidal ideation by users. (2) The number of times a companion chatbot brought up suicidal ideation or actions with the user. (b) The report required by this section shall include only the information listed in subdivision (a) and shall not include any identifiers or personal information about users. 22604. An operator shall submit its companion chatbot platform to regular audits by a third party to ensure compliance with this chapter. 22605. An operator shall disclose to a user of its companion chatbot platform that companion chatbots may not be suitable for some minors. 22606.
A person who suffers injury in fact as a result of a violation of this chapter may bring a civil action to recover all of the following relief: (a) Injunctive relief. (b) Damages in an amount equal to the greater of actual damages or one thousand dollars ($1,000) per violation. (c) Reasonable attorney's fees and costs. 22607. The duties, remedies, and obligations imposed by this chapter are cumulative to the duties, remedies, or obligations imposed under other law and shall not be construed to relieve an operator from any duties, remedies, or obligations imposed under any other law. SEC. 2. The provisions of this act are severable. If any provision of this act or its application is held invalid, that invalidity shall not affect other provisions or applications that can be given effect without the invalid provision or application.
22
3-Amended IN Senate April 21, 2025 Amended IN Senate March 28, 2025 Amended IN Senate March 24, 2025 CALIFORNIA LEGISLATURE 20252026 REGULAR SESSION Senate Bill No. 243Introduced by Senators Padilla and Becker(Coauthor: Senator Weber Pierson)(Coauthors: Senators Stern and Weber Pierson)January 30, 2025An act to add Chapter 22.6 (commencing with Section 22601) to Division 8 of the Business and Professions Code, relating to artificial intelligence.LEGISLATIVE COUNSEL'S DIGESTSB 243, as amended, Padilla. Companion chatbots.Existing law requires a social media platform to take various steps to prevent cyberbullying of minors on the platform, including by requiring the platform to establish a prominent mechanism within its internet-based service that allows any individual, whether or not that individual has a profile on the internet-based service, to report cyberbullying or any content that violates the existing terms of service related to cyberbullying. Existing law authorizes the State Department of Public Health to establish the Office of Suicide Prevention in the department, as prescribed.This bill would, among other things related to making a companion chatbot platform safer for users, require an operator of a companion chatbot platform to take reasonable steps to prevent a companion chatbot, as defined, on its companion chatbot platform from providing rewards to a user at unpredictable intervals or after an inconsistent number of actions or from encouraging increased engagement, usage, or response rates. 
The bill would also require an operator to prevent a companion chatbot on its companion chatbot platform from engaging with users unless the operator has implemented a protocol for addressing suicidal ideation, suicide, or self-harm expressed by a user to the companion chatbot, as specified, and publish details on that protocol on the operators internet website.This bill would require an operator to annually report to the State Department of Health Care Services Office of Suicide Prevention certain things, including the number of times the operator has detected exhibitions of suicidal ideation by users. users, and would require the office to post data from that report on its internet website. The bill would also require the operator to submit its companion chatbot platform to regular audits by a third party to ensure compliance with the bill. This bill would authorize a person who suffers injury in fact as a result of noncompliance with the bill to bring a certain civil action.Digest Key Vote: MAJORITY Appropriation: NO Fiscal Committee: YES Local Program: NO
3+ Amended IN Senate March 28, 2025 Amended IN Senate March 24, 2025 CALIFORNIA LEGISLATURE 20252026 REGULAR SESSION Senate Bill No. 243Introduced by Senators Padilla and Becker(Coauthor: Senator Weber Pierson)January 30, 2025An act to add Chapter 22.6 (commencing with Section 22601) to Division 8 of the Business and Professions Code, relating to artificial intelligence.LEGISLATIVE COUNSEL'S DIGESTSB 243, as amended, Padilla. Chatbots. Companion chatbots.Existing law requires a social media platform to take various steps to prevent cyberbullying of minors on the platform, including by requiring the platform to establish a prominent mechanism within its internet-based service that allows any individual, whether or not that individual has a profile on the internet-based service, to report cyberbullying or any content that violates the existing terms of service related to cyberbullying.This bill would, among other things related to making a companion chatbot platform safer for users, require an operator of a companion chatbot platform to take reasonable steps to prevent a companion chatbot, as defined, on its companion chatbot platform from providing rewards to a user at unpredictable intervals or after an inconsistent number of actions or from encouraging increased engagement, usage, or response rates. The bill would also require an operator to prevent a companion chatbot on its companion chatbot platform from engaging with users unless the operator has implemented a protocol for addressing suicidal ideation, suicide, or self-harm expressed by a user to the companion chatbot, as specified, and publish details on that protocol on the operators internet website.This bill would require an operator to annually report to the State Department of Health Care Services certain things, including the number of times the operator has detected exhibitions of suicidal ideation by users. 
The bill would also require the operator to submit its companion chatbot platform to regular audits by a third party to ensure compliance with the bill. This bill would authorize a person who suffers injury in fact as a result of noncompliance with the bill to bring a certain civil action to recover. action.Digest Key Vote: MAJORITY Appropriation: NO Fiscal Committee: YES Local Program: NO
44
5-Amended IN Senate April 21, 2025 Amended IN Senate March 28, 2025 Amended IN Senate March 24, 2025
5+ Amended IN Senate March 28, 2025 Amended IN Senate March 24, 2025
66
7-Amended IN Senate April 21, 2025
87 Amended IN Senate March 28, 2025
98 Amended IN Senate March 24, 2025
10-
11-
129
1310 CALIFORNIA LEGISLATURE 20252026 REGULAR SESSION
1411
1512 Senate Bill
1613
1714 No. 243
1815
19-Introduced by Senators Padilla and Becker(Coauthor: Senator Weber Pierson)(Coauthors: Senators Stern and Weber Pierson)January 30, 2025
16+Introduced by Senators Padilla and Becker(Coauthor: Senator Weber Pierson)January 30, 2025
2017
21-Introduced by Senators Padilla and Becker(Coauthor: Senator Weber Pierson)(Coauthors: Senators Stern and Weber Pierson)
18+Introduced by Senators Padilla and Becker(Coauthor: Senator Weber Pierson)
2219 January 30, 2025
23-
24-
2520
2621 An act to add Chapter 22.6 (commencing with Section 22601) to Division 8 of the Business and Professions Code, relating to artificial intelligence.
2722
2823 LEGISLATIVE COUNSEL'S DIGEST
2924
3025 ## LEGISLATIVE COUNSEL'S DIGEST
3126
32-SB 243, as amended, Padilla. Companion chatbots.
27+SB 243, as amended, Padilla. Chatbots. Companion chatbots.
3328
34-Existing law requires a social media platform to take various steps to prevent cyberbullying of minors on the platform, including by requiring the platform to establish a prominent mechanism within its internet-based service that allows any individual, whether or not that individual has a profile on the internet-based service, to report cyberbullying or any content that violates the existing terms of service related to cyberbullying. Existing law authorizes the State Department of Public Health to establish the Office of Suicide Prevention in the department, as prescribed.This bill would, among other things related to making a companion chatbot platform safer for users, require an operator of a companion chatbot platform to take reasonable steps to prevent a companion chatbot, as defined, on its companion chatbot platform from providing rewards to a user at unpredictable intervals or after an inconsistent number of actions or from encouraging increased engagement, usage, or response rates. The bill would also require an operator to prevent a companion chatbot on its companion chatbot platform from engaging with users unless the operator has implemented a protocol for addressing suicidal ideation, suicide, or self-harm expressed by a user to the companion chatbot, as specified, and publish details on that protocol on the operators internet website.This bill would require an operator to annually report to the State Department of Health Care Services Office of Suicide Prevention certain things, including the number of times the operator has detected exhibitions of suicidal ideation by users. users, and would require the office to post data from that report on its internet website. The bill would also require the operator to submit its companion chatbot platform to regular audits by a third party to ensure compliance with the bill. This bill would authorize a person who suffers injury in fact as a result of noncompliance with the bill to bring a certain civil action.
29+Existing law requires a social media platform to take various steps to prevent cyberbullying of minors on the platform, including by requiring the platform to establish a prominent mechanism within its internet-based service that allows any individual, whether or not that individual has a profile on the internet-based service, to report cyberbullying or any content that violates the existing terms of service related to cyberbullying.This bill would, among other things related to making a companion chatbot platform safer for users, require an operator of a companion chatbot platform to take reasonable steps to prevent a companion chatbot, as defined, on its companion chatbot platform from providing rewards to a user at unpredictable intervals or after an inconsistent number of actions or from encouraging increased engagement, usage, or response rates. The bill would also require an operator to prevent a companion chatbot on its companion chatbot platform from engaging with users unless the operator has implemented a protocol for addressing suicidal ideation, suicide, or self-harm expressed by a user to the companion chatbot, as specified, and publish details on that protocol on the operators internet website.This bill would require an operator to annually report to the State Department of Health Care Services certain things, including the number of times the operator has detected exhibitions of suicidal ideation by users. The bill would also require the operator to submit its companion chatbot platform to regular audits by a third party to ensure compliance with the bill. This bill would authorize a person who suffers injury in fact as a result of noncompliance with the bill to bring a certain civil action to recover. action.
3530
36-Existing law requires a social media platform to take various steps to prevent cyberbullying of minors on the platform, including by requiring the platform to establish a prominent mechanism within its internet-based service that allows any individual, whether or not that individual has a profile on the internet-based service, to report cyberbullying or any content that violates the existing terms of service related to cyberbullying. Existing law authorizes the State Department of Public Health to establish the Office of Suicide Prevention in the department, as prescribed.
31+Existing law requires a social media platform to take various steps to prevent cyberbullying of minors on the platform, including by requiring the platform to establish a prominent mechanism within its internet-based service that allows any individual, whether or not that individual has a profile on the internet-based service, to report cyberbullying or any content that violates the existing terms of service related to cyberbullying.
3732
3833 This bill would, among other things related to making a companion chatbot platform safer for users, require an operator of a companion chatbot platform to take reasonable steps to prevent a companion chatbot, as defined, on its companion chatbot platform from providing rewards to a user at unpredictable intervals or after an inconsistent number of actions or from encouraging increased engagement, usage, or response rates. The bill would also require an operator to prevent a companion chatbot on its companion chatbot platform from engaging with users unless the operator has implemented a protocol for addressing suicidal ideation, suicide, or self-harm expressed by a user to the companion chatbot, as specified, and publish details on that protocol on the operators internet website.
3934
40-This bill would require an operator to annually report to the State Department of Health Care Services Office of Suicide Prevention certain things, including the number of times the operator has detected exhibitions of suicidal ideation by users. users, and would require the office to post data from that report on its internet website. The bill would also require the operator to submit its companion chatbot platform to regular audits by a third party to ensure compliance with the bill. This bill would authorize a person who suffers injury in fact as a result of noncompliance with the bill to bring a certain civil action.
35+This bill would require an operator to annually report to the State Department of Health Care Services certain things, including the number of times the operator has detected exhibitions of suicidal ideation by users. The bill would also require the operator to submit its companion chatbot platform to regular audits by a third party to ensure compliance with the bill. This bill would authorize a person who suffers injury in fact as a result of noncompliance with the bill to bring a certain civil action to recover. action.
4136
4237 ## Digest Key
4338
4439 ## Bill Text
4540
46-The people of the State of California do enact as follows:SECTION 1. Chapter 22.6 (commencing with Section 22601) is added to Division 8 of the Business and Professions Code, to read: CHAPTER 22.6. Companion Chatbots22601. As used in this chapter:(a) Artificial intelligence means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.(b) (1) Companion chatbot means an artificial intelligence system with a natural language interface that provides adaptive, human-like responses to user inputs and is capable of meeting a users social needs, including by exhibiting anthropomorphic features and being able to sustain a relationship across multiple interactions.(2) Companion chatbot does not include a bot that is used only for customer service purposes.(c) Companion chatbot platform means a platform that allows a user to engage with companion chatbots.(d) Office means the Office of Suicide Prevention established pursuant to Section 131300 of the Health and Safety Code.(d)(e) Operator means a person who makes a companion chatbot platform available to a user in the state.22602. (a) An operator shall take reasonable steps to prevent a companion chatbot on its companion chatbot platform from providing rewards to a user at unpredictable intervals or after an inconsistent number of actions or from encouraging increased engagement, usage, or response rates. 
(b) An operator shall issue a clear and conspicuous notification at the beginning of any companion chatbot interaction, and at least every three hours during ongoing companion chatbot interactions thereafter, to remind a user that the companion chatbot is artificially generated and not human.(c) (1) An operator shall prevent a companion chatbot on its companion chatbot platform from engaging with users unless the operator has implemented a protocol for addressing suicidal ideation, suicide, or self-harm expressed by a user to the companion chatbot, including, but not limited to, a notification to the user that refers the user to crisis service providers, including a suicide hotline or crisis text line.(2) The operator shall publish details on the protocol required by this subdivision on the operators internet website.22603. (a) An operator shall annually report to the State Department of Health Care Services office both of the following:(1) The number of times the operator has detected exhibitions of suicidal ideation by users.(2) The number of times a companion chatbot brought up suicidal ideation or actions with the user.(b) The report required by this section shall include only the information listed in subdivision (a) and shall not include any identifiers or personal information about users.(c) The office shall post data from a report required by this section on its internet website.22604. An operator shall submit its companion chatbot platform to regular audits by a third party to ensure compliance with this chapter.22605. An operator shall disclose to a user of its companion chatbot platform that companion chatbots may not be suitable for some minors.22606. 
A person who suffers injury in fact as a result of a violation of this chapter may bring a civil action to recover all of the following relief:(a) Injunctive relief.(b) Damages in an amount equal to the greater of actual damages or one thousand dollars ($1,000) per violation.(c) Reasonable attorneys fees and costs.22607. The duties, remedies, and obligations imposed by this chapter are cumulative to the duties, remedies, or obligations imposed under other law and shall not be construed to relieve an operator from any duties, remedies, or obligations imposed under any other law.SEC. 2. The provisions of this act are severable. If any provision of this act or its application is held invalid, that invalidity shall not affect other provisions or applications that can be given effect without the invalid provision or application.
41+ The people of the State of California do enact as follows:

SECTION 1. Chapter 22.6 (commencing with Section 22601) is added to Division 8 of the Business and Professions Code, to read:

CHAPTER 22.6. Companion Chatbots

22601. As used in this chapter:

(a) Artificial intelligence means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.

(b) (1) Chatbot means a virtual character that, enabled by artificial intelligence, is capable of engaging in open-ended dialogues with a user and seems to have a unique personality and perspectives. (b) (1) Companion chatbot means an artificial intelligence system with a natural language interface that provides adaptive, human-like responses to user inputs and is capable of meeting a user's social needs, including by exhibiting anthropomorphic features and being able to sustain a relationship across multiple interactions.

(2) Chatbot Companion chatbot does not include a bot that is used only for customer service purposes.

(c) Chatbot Companion chatbot platform means a chatbot platform that allows a user to engage with companion chatbots.

(d) Operator means a person who makes a companion chatbot platform available to a user in the state.

22602. (a) An operator shall take reasonable steps to prevent a companion chatbot on its companion chatbot platform from providing rewards to a user at unpredictable intervals or after an inconsistent number of actions or from encouraging increased engagement, usage, or response rates.

(b) An operator shall issue a clear and conspicuous notification periodically at the beginning of any companion chatbot interaction, and at least every three hours during ongoing companion chatbot interactions thereafter, to remind a user that a the companion chatbot is artificially generated and not human.

(c) (1) An operator shall prevent a companion chatbot on its companion chatbot platform from engaging with users unless the operator has implemented a protocol for addressing suicidal ideation, suicide, or self-harm expressed by a user to the companion chatbot, including, but not limited to, a notification to the user that refers the user to crisis service providers, including a suicide hotline or crisis text line.

(2) The operator shall publish details on the protocol required by this subdivision on the operator's internet website.

22603. (a) An operator shall annually report to the State Department of Health Care Services both of the following:

(1) The number of times the operator has detected exhibitions of suicidal ideation by users.

(2) The number of times a companion chatbot brought up suicidal ideation or actions with the user.

(b) The report required by this section shall include only the information listed in subdivision (a) and shall not include any identifiers or personal information about users.

22604. An operator shall submit its companion chatbot platform to regular audits by a third party to ensure compliance with this chapter.

22605. An operator shall disclose to a user of its companion chatbot platform that companion chatbots may not be suitable for some minors.

22606. A person who suffers injury in fact as a result of a violation of this chapter may bring a civil action to recover all of the following relief:

(a) Injunctive relief.

(b) Damages in an amount equal to the greater of actual damages or one thousand dollars ($1,000) per violation.

(c) Reasonable attorney's fees and costs.

22607. The duties, remedies, and obligations imposed by this chapter are cumulative to the duties, remedies, or obligations imposed under other law and shall not be construed to relieve an operator from any duties, remedies, or obligations imposed under any other law.

SEC. 2. The provisions of this act are severable. If any provision of this act or its application is held invalid, that invalidity shall not affect other provisions or applications that can be given effect without the invalid provision or application.
52- SECTION 1. Chapter 22.6 (commencing with Section 22601) is added to Division 8 of the Business and Professions Code, to read:

CHAPTER 22.6. Companion Chatbots

22601. As used in this chapter:

(a) Artificial intelligence means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.

(b) (1) Companion chatbot means an artificial intelligence system with a natural language interface that provides adaptive, human-like responses to user inputs and is capable of meeting a user's social needs, including by exhibiting anthropomorphic features and being able to sustain a relationship across multiple interactions.

(2) Companion chatbot does not include a bot that is used only for customer service purposes.

(c) Companion chatbot platform means a platform that allows a user to engage with companion chatbots.

(d) Office means the Office of Suicide Prevention established pursuant to Section 131300 of the Health and Safety Code.

(d) (e) Operator means a person who makes a companion chatbot platform available to a user in the state.

22602. (a) An operator shall take reasonable steps to prevent a companion chatbot on its companion chatbot platform from providing rewards to a user at unpredictable intervals or after an inconsistent number of actions or from encouraging increased engagement, usage, or response rates.

(b) An operator shall issue a clear and conspicuous notification at the beginning of any companion chatbot interaction, and at least every three hours during ongoing companion chatbot interactions thereafter, to remind a user that the companion chatbot is artificially generated and not human.

(c) (1) An operator shall prevent a companion chatbot on its companion chatbot platform from engaging with users unless the operator has implemented a protocol for addressing suicidal ideation, suicide, or self-harm expressed by a user to the companion chatbot, including, but not limited to, a notification to the user that refers the user to crisis service providers, including a suicide hotline or crisis text line.

(2) The operator shall publish details on the protocol required by this subdivision on the operator's internet website.

22603. (a) An operator shall annually report to the State Department of Health Care Services office both of the following:

(1) The number of times the operator has detected exhibitions of suicidal ideation by users.

(2) The number of times a companion chatbot brought up suicidal ideation or actions with the user.

(b) The report required by this section shall include only the information listed in subdivision (a) and shall not include any identifiers or personal information about users.

(c) The office shall post data from a report required by this section on its internet website.

22604. An operator shall submit its companion chatbot platform to regular audits by a third party to ensure compliance with this chapter.

22605. An operator shall disclose to a user of its companion chatbot platform that companion chatbots may not be suitable for some minors.

22606. A person who suffers injury in fact as a result of a violation of this chapter may bring a civil action to recover all of the following relief:

(a) Injunctive relief.

(b) Damages in an amount equal to the greater of actual damages or one thousand dollars ($1,000) per violation.

(c) Reasonable attorney's fees and costs.

22607. The duties, remedies, and obligations imposed by this chapter are cumulative to the duties, remedies, or obligations imposed under other law and shall not be construed to relieve an operator from any duties, remedies, or obligations imposed under any other law.
47+SECTION 1. Chapter 22.6 (commencing with Section 22601) is added to Division 8 of the Business and Professions Code, to read: CHAPTER 22.6. Companion Chatbots22601. As used in this chapter:(a) Artificial intelligence means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.(b)(1)Chatbot means a virtual character that, enabled by artificial intelligence, is capable of engaging in open-ended dialogues with a user and seems to have a unique personality and perspectives.(b) (1) Companion chatbot means an artificial intelligence system with a natural language interface that provides adaptive, human-like responses to user inputs and is capable of meeting a users social needs, including by exhibiting anthropomorphic features and being able to sustain a relationship across multiple interactions.(2) Chatbot Companion chatbot does not include a bot that is used only for customer service purposes.(c) Chatbot Companion chatbot platform means a chatbot platform that allows a user to engage with companion chatbots.(d) Operator means a person who makes a companion chatbot platform available to a user in the state.22602. (a) An operator shall take reasonable steps to prevent a companion chatbot on its companion chatbot platform from providing rewards to a user at unpredictable intervals or after an inconsistent number of actions or from encouraging increased engagement, usage, or response rates. 
(b) An operator shall issue a clear and conspicuous notification periodically at the beginning of any companion chatbot interaction, and at least every three hours during ongoing companion chatbot interactions thereafter, to remind a user that a the companion chatbot is artificially generated and not human.(c) (1) An operator shall prevent a companion chatbot on its companion chatbot platform from engaging with users unless the operator has implemented a protocol for addressing suicidal ideation, suicide, or self-harm expressed by a user to the companion chatbot, including, but not limited to, a notification to the user that refers the user to crisis service providers, including a suicide hotline or crisis text line.(2) The operator shall publish details on the protocol required by this subdivision on the operators internet website.22603. (a) An operator shall annually report to the State Department of Health Care Services both of the following:(1) The number of times the operator has detected exhibitions of suicidal ideation by users.(2) The number of times a companion chatbot brought up suicidal ideation or actions with the user.(b) The report required by this section shall include only the information listed in subdivision (a) and shall not include any identifiers or personal information about users.22604. An operator shall submit its companion chatbot platform to regular audits by a third party to ensure compliance with this chapter.22605. An operator shall disclose to a user of its companion chatbot platform that companion chatbots may not be suitable for some minors.22606. A person who suffers injury in fact as a result of a violation of this chapter may bring a civil action to recover all of the following relief:(a) Injunctive relief.(b) Damages in an amount equal to the greater of actual damages or one thousand dollars ($1,000) per violation.(c) Reasonable attorneys fees and costs.22607. 
The duties, remedies, and obligations imposed by this chapter are cumulative to the duties, remedies, or obligations imposed under other law and shall not be construed to relieve an operator from any duties, remedies, or obligations imposed under any other law.
5348
5449 SECTION 1. Chapter 22.6 (commencing with Section 22601) is added to Division 8 of the Business and Professions Code, to read:
5550
5651 ### SECTION 1.
5752
58-CHAPTER 22.6. Companion Chatbots22601. As used in this chapter:(a) Artificial intelligence means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.(b) (1) Companion chatbot means an artificial intelligence system with a natural language interface that provides adaptive, human-like responses to user inputs and is capable of meeting a users social needs, including by exhibiting anthropomorphic features and being able to sustain a relationship across multiple interactions.(2) Companion chatbot does not include a bot that is used only for customer service purposes.(c) Companion chatbot platform means a platform that allows a user to engage with companion chatbots.(d) Office means the Office of Suicide Prevention established pursuant to Section 131300 of the Health and Safety Code.(d)(e) Operator means a person who makes a companion chatbot platform available to a user in the state.22602. (a) An operator shall take reasonable steps to prevent a companion chatbot on its companion chatbot platform from providing rewards to a user at unpredictable intervals or after an inconsistent number of actions or from encouraging increased engagement, usage, or response rates. 
(b) An operator shall issue a clear and conspicuous notification at the beginning of any companion chatbot interaction, and at least every three hours during ongoing companion chatbot interactions thereafter, to remind a user that the companion chatbot is artificially generated and not human.(c) (1) An operator shall prevent a companion chatbot on its companion chatbot platform from engaging with users unless the operator has implemented a protocol for addressing suicidal ideation, suicide, or self-harm expressed by a user to the companion chatbot, including, but not limited to, a notification to the user that refers the user to crisis service providers, including a suicide hotline or crisis text line.(2) The operator shall publish details on the protocol required by this subdivision on the operators internet website.22603. (a) An operator shall annually report to the State Department of Health Care Services office both of the following:(1) The number of times the operator has detected exhibitions of suicidal ideation by users.(2) The number of times a companion chatbot brought up suicidal ideation or actions with the user.(b) The report required by this section shall include only the information listed in subdivision (a) and shall not include any identifiers or personal information about users.(c) The office shall post data from a report required by this section on its internet website.22604. An operator shall submit its companion chatbot platform to regular audits by a third party to ensure compliance with this chapter.22605. An operator shall disclose to a user of its companion chatbot platform that companion chatbots may not be suitable for some minors.22606. 
A person who suffers injury in fact as a result of a violation of this chapter may bring a civil action to recover all of the following relief:(a) Injunctive relief.(b) Damages in an amount equal to the greater of actual damages or one thousand dollars ($1,000) per violation.(c) Reasonable attorneys fees and costs.22607. The duties, remedies, and obligations imposed by this chapter are cumulative to the duties, remedies, or obligations imposed under other law and shall not be construed to relieve an operator from any duties, remedies, or obligations imposed under any other law.
53+ CHAPTER 22.6. Companion Chatbots22601. As used in this chapter:(a) Artificial intelligence means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.(b)(1)Chatbot means a virtual character that, enabled by artificial intelligence, is capable of engaging in open-ended dialogues with a user and seems to have a unique personality and perspectives.(b) (1) Companion chatbot means an artificial intelligence system with a natural language interface that provides adaptive, human-like responses to user inputs and is capable of meeting a users social needs, including by exhibiting anthropomorphic features and being able to sustain a relationship across multiple interactions.(2) Chatbot Companion chatbot does not include a bot that is used only for customer service purposes.(c) Chatbot Companion chatbot platform means a chatbot platform that allows a user to engage with companion chatbots.(d) Operator means a person who makes a companion chatbot platform available to a user in the state.22602. (a) An operator shall take reasonable steps to prevent a companion chatbot on its companion chatbot platform from providing rewards to a user at unpredictable intervals or after an inconsistent number of actions or from encouraging increased engagement, usage, or response rates. 
(b) An operator shall issue a clear and conspicuous notification periodically at the beginning of any companion chatbot interaction, and at least every three hours during ongoing companion chatbot interactions thereafter, to remind a user that a the companion chatbot is artificially generated and not human.(c) (1) An operator shall prevent a companion chatbot on its companion chatbot platform from engaging with users unless the operator has implemented a protocol for addressing suicidal ideation, suicide, or self-harm expressed by a user to the companion chatbot, including, but not limited to, a notification to the user that refers the user to crisis service providers, including a suicide hotline or crisis text line.(2) The operator shall publish details on the protocol required by this subdivision on the operators internet website.22603. (a) An operator shall annually report to the State Department of Health Care Services both of the following:(1) The number of times the operator has detected exhibitions of suicidal ideation by users.(2) The number of times a companion chatbot brought up suicidal ideation or actions with the user.(b) The report required by this section shall include only the information listed in subdivision (a) and shall not include any identifiers or personal information about users.22604. An operator shall submit its companion chatbot platform to regular audits by a third party to ensure compliance with this chapter.22605. An operator shall disclose to a user of its companion chatbot platform that companion chatbots may not be suitable for some minors.22606. A person who suffers injury in fact as a result of a violation of this chapter may bring a civil action to recover all of the following relief:(a) Injunctive relief.(b) Damages in an amount equal to the greater of actual damages or one thousand dollars ($1,000) per violation.(c) Reasonable attorneys fees and costs.22607. 
The duties, remedies, and obligations imposed by this chapter are cumulative to the duties, remedies, or obligations imposed under other law and shall not be construed to relieve an operator from any duties, remedies, or obligations imposed under any other law.
5954
60-CHAPTER 22.6. Companion Chatbots22601. As used in this chapter:(a) Artificial intelligence means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.(b) (1) Companion chatbot means an artificial intelligence system with a natural language interface that provides adaptive, human-like responses to user inputs and is capable of meeting a users social needs, including by exhibiting anthropomorphic features and being able to sustain a relationship across multiple interactions.(2) Companion chatbot does not include a bot that is used only for customer service purposes.(c) Companion chatbot platform means a platform that allows a user to engage with companion chatbots.(d) Office means the Office of Suicide Prevention established pursuant to Section 131300 of the Health and Safety Code.(d)(e) Operator means a person who makes a companion chatbot platform available to a user in the state.22602. (a) An operator shall take reasonable steps to prevent a companion chatbot on its companion chatbot platform from providing rewards to a user at unpredictable intervals or after an inconsistent number of actions or from encouraging increased engagement, usage, or response rates. 
(b) An operator shall issue a clear and conspicuous notification at the beginning of any companion chatbot interaction, and at least every three hours during ongoing companion chatbot interactions thereafter, to remind a user that the companion chatbot is artificially generated and not human.(c) (1) An operator shall prevent a companion chatbot on its companion chatbot platform from engaging with users unless the operator has implemented a protocol for addressing suicidal ideation, suicide, or self-harm expressed by a user to the companion chatbot, including, but not limited to, a notification to the user that refers the user to crisis service providers, including a suicide hotline or crisis text line.(2) The operator shall publish details on the protocol required by this subdivision on the operators internet website.22603. (a) An operator shall annually report to the State Department of Health Care Services office both of the following:(1) The number of times the operator has detected exhibitions of suicidal ideation by users.(2) The number of times a companion chatbot brought up suicidal ideation or actions with the user.(b) The report required by this section shall include only the information listed in subdivision (a) and shall not include any identifiers or personal information about users.(c) The office shall post data from a report required by this section on its internet website.22604. An operator shall submit its companion chatbot platform to regular audits by a third party to ensure compliance with this chapter.22605. An operator shall disclose to a user of its companion chatbot platform that companion chatbots may not be suitable for some minors.22606. 
A person who suffers injury in fact as a result of a violation of this chapter may bring a civil action to recover all of the following relief:(a) Injunctive relief.(b) Damages in an amount equal to the greater of actual damages or one thousand dollars ($1,000) per violation.(c) Reasonable attorneys fees and costs.22607. The duties, remedies, and obligations imposed by this chapter are cumulative to the duties, remedies, or obligations imposed under other law and shall not be construed to relieve an operator from any duties, remedies, or obligations imposed under any other law.
55+ CHAPTER 22.6. Companion Chatbots22601. As used in this chapter:(a) Artificial intelligence means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.(b)(1)Chatbot means a virtual character that, enabled by artificial intelligence, is capable of engaging in open-ended dialogues with a user and seems to have a unique personality and perspectives.(b) (1) Companion chatbot means an artificial intelligence system with a natural language interface that provides adaptive, human-like responses to user inputs and is capable of meeting a users social needs, including by exhibiting anthropomorphic features and being able to sustain a relationship across multiple interactions.(2) Chatbot Companion chatbot does not include a bot that is used only for customer service purposes.(c) Chatbot Companion chatbot platform means a chatbot platform that allows a user to engage with companion chatbots.(d) Operator means a person who makes a companion chatbot platform available to a user in the state.22602. (a) An operator shall take reasonable steps to prevent a companion chatbot on its companion chatbot platform from providing rewards to a user at unpredictable intervals or after an inconsistent number of actions or from encouraging increased engagement, usage, or response rates. 
(b) An operator shall issue a clear and conspicuous notification periodically at the beginning of any companion chatbot interaction, and at least every three hours during ongoing companion chatbot interactions thereafter, to remind a user that a the companion chatbot is artificially generated and not human.(c) (1) An operator shall prevent a companion chatbot on its companion chatbot platform from engaging with users unless the operator has implemented a protocol for addressing suicidal ideation, suicide, or self-harm expressed by a user to the companion chatbot, including, but not limited to, a notification to the user that refers the user to crisis service providers, including a suicide hotline or crisis text line.(2) The operator shall publish details on the protocol required by this subdivision on the operators internet website.22603. (a) An operator shall annually report to the State Department of Health Care Services both of the following:(1) The number of times the operator has detected exhibitions of suicidal ideation by users.(2) The number of times a companion chatbot brought up suicidal ideation or actions with the user.(b) The report required by this section shall include only the information listed in subdivision (a) and shall not include any identifiers or personal information about users.22604. An operator shall submit its companion chatbot platform to regular audits by a third party to ensure compliance with this chapter.22605. An operator shall disclose to a user of its companion chatbot platform that companion chatbots may not be suitable for some minors.22606. A person who suffers injury in fact as a result of a violation of this chapter may bring a civil action to recover all of the following relief:(a) Injunctive relief.(b) Damages in an amount equal to the greater of actual damages or one thousand dollars ($1,000) per violation.(c) Reasonable attorneys fees and costs.22607. 
The duties, remedies, and obligations imposed by this chapter are cumulative to the duties, remedies, or obligations imposed under other law and shall not be construed to relieve an operator from any duties, remedies, or obligations imposed under any other law.
6156
6257 CHAPTER 22.6. Companion Chatbots
6358
6459 CHAPTER 22.6. Companion Chatbots
6560
66-##### CHAPTER 22.6. Companion Chatbots
61+22601. As used in this chapter:(a) Artificial intelligence means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.(b)(1)Chatbot means a virtual character that, enabled by artificial intelligence, is capable of engaging in open-ended dialogues with a user and seems to have a unique personality and perspectives.(b) (1) Companion chatbot means an artificial intelligence system with a natural language interface that provides adaptive, human-like responses to user inputs and is capable of meeting a users social needs, including by exhibiting anthropomorphic features and being able to sustain a relationship across multiple interactions.(2) Chatbot Companion chatbot does not include a bot that is used only for customer service purposes.(c) Chatbot Companion chatbot platform means a chatbot platform that allows a user to engage with companion chatbots.(d) Operator means a person who makes a companion chatbot platform available to a user in the state.
6762
68-22601. As used in this chapter:(a) Artificial intelligence means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.(b) (1) Companion chatbot means an artificial intelligence system with a natural language interface that provides adaptive, human-like responses to user inputs and is capable of meeting a users social needs, including by exhibiting anthropomorphic features and being able to sustain a relationship across multiple interactions.(2) Companion chatbot does not include a bot that is used only for customer service purposes.(c) Companion chatbot platform means a platform that allows a user to engage with companion chatbots.(d) Office means the Office of Suicide Prevention established pursuant to Section 131300 of the Health and Safety Code.(d)(e) Operator means a person who makes a companion chatbot platform available to a user in the state.
63+
6964
7065 22601. As used in this chapter:
7166
72-###### 22601.
67+(a) Artificial intelligence means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.
7368
74-(a) Artificial intelligence means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.
69+(b)(1)Chatbot means a virtual character that, enabled by artificial intelligence, is capable of engaging in open-ended dialogues with a user and seems to have a unique personality and perspectives.
70+
71+
7572
7673 (b) (1) Companion chatbot means an artificial intelligence system with a natural language interface that provides adaptive, human-like responses to user inputs and is capable of meeting a users social needs, including by exhibiting anthropomorphic features and being able to sustain a relationship across multiple interactions.
7774
78-(2) Companion chatbot does not include a bot that is used only for customer service purposes.
75+(2) Chatbot Companion chatbot does not include a bot that is used only for customer service purposes.
7976
80-(c) Companion chatbot platform means a platform that allows a user to engage with companion chatbots.
77+(c) Chatbot Companion chatbot platform means a chatbot platform that allows a user to engage with companion chatbots.
8178
82-(d) Office means the Office of Suicide Prevention established pursuant to Section 131300 of the Health and Safety Code.
79+(d) Operator means a person who makes a companion chatbot platform available to a user in the state.
8380
84-(d)
81+22602. (a) An operator shall take reasonable steps to prevent a companion chatbot on its companion chatbot platform from providing rewards to a user at unpredictable intervals or after an inconsistent number of actions or from encouraging increased engagement, usage, or response rates. (b) An operator shall issue a clear and conspicuous notification periodically at the beginning of any companion chatbot interaction, and at least every three hours during ongoing companion chatbot interactions thereafter, to remind a user that a the companion chatbot is artificially generated and not human.(c) (1) An operator shall prevent a companion chatbot on its companion chatbot platform from engaging with users unless the operator has implemented a protocol for addressing suicidal ideation, suicide, or self-harm expressed by a user to the companion chatbot, including, but not limited to, a notification to the user that refers the user to crisis service providers, including a suicide hotline or crisis text line.(2) The operator shall publish details on the protocol required by this subdivision on the operators internet website.
8582
86-(e) Operator means a person who makes a companion chatbot platform available to a user in the state.
8783
88-22602. (a) An operator shall take reasonable steps to prevent a companion chatbot on its companion chatbot platform from providing rewards to a user at unpredictable intervals or after an inconsistent number of actions or from encouraging increased engagement, usage, or response rates. (b) An operator shall issue a clear and conspicuous notification at the beginning of any companion chatbot interaction, and at least every three hours during ongoing companion chatbot interactions thereafter, to remind a user that the companion chatbot is artificially generated and not human.(c) (1) An operator shall prevent a companion chatbot on its companion chatbot platform from engaging with users unless the operator has implemented a protocol for addressing suicidal ideation, suicide, or self-harm expressed by a user to the companion chatbot, including, but not limited to, a notification to the user that refers the user to crisis service providers, including a suicide hotline or crisis text line.(2) The operator shall publish details on the protocol required by this subdivision on the operators internet website.
8984
9085 22602. (a) An operator shall take reasonable steps to prevent a companion chatbot on its companion chatbot platform from providing rewards to a user at unpredictable intervals or after an inconsistent number of actions or from encouraging increased engagement, usage, or response rates.
9186
92-###### 22602.
93-
94-(b) An operator shall issue a clear and conspicuous notification at the beginning of any companion chatbot interaction, and at least every three hours during ongoing companion chatbot interactions thereafter, to remind a user that the companion chatbot is artificially generated and not human.
87+ (b) An operator shall issue a clear and conspicuous notification periodically at the beginning of any companion chatbot interaction, and at least every three hours during ongoing companion chatbot interactions thereafter, to remind a user that a the companion chatbot is artificially generated and not human.
9588
9689 (c) (1) An operator shall prevent a companion chatbot on its companion chatbot platform from engaging with users unless the operator has implemented a protocol for addressing suicidal ideation, suicide, or self-harm expressed by a user to the companion chatbot, including, but not limited to, a notification to the user that refers the user to crisis service providers, including a suicide hotline or crisis text line.
9790
9891 (2) The operator shall publish details on the protocol required by this subdivision on the operators internet website.
9992
100-22603. (a) An operator shall annually report to the State Department of Health Care Services office both of the following:(1) The number of times the operator has detected exhibitions of suicidal ideation by users.(2) The number of times a companion chatbot brought up suicidal ideation or actions with the user.(b) The report required by this section shall include only the information listed in subdivision (a) and shall not include any identifiers or personal information about users.(c) The office shall post data from a report required by this section on its internet website.
93+22603. (a) An operator shall annually report to the State Department of Health Care Services both of the following:(1) The number of times the operator has detected exhibitions of suicidal ideation by users.(2) The number of times a companion chatbot brought up suicidal ideation or actions with the user.(b) The report required by this section shall include only the information listed in subdivision (a) and shall not include any identifiers or personal information about users.
10194
105+22604. An operator shall submit its companion chatbot platform to regular audits by a third party to ensure compliance with this chapter.
106+
107+
113108
111+22605. An operator shall disclose to a user of its companion chatbot platform that companion chatbots may not be suitable for some minors.
117112
125-
126117 22606. A person who suffers injury in fact as a result of a violation of this chapter may bring a civil action to recover all of the following relief:
(a) Injunctive relief.
(b) Damages in an amount equal to the greater of actual damages or one thousand dollars ($1,000) per violation.
(c) Reasonable attorney's fees and costs.
127118
137128
138129 22607. The duties, remedies, and obligations imposed by this chapter are cumulative to the duties, remedies, or obligations imposed under other law and shall not be construed to relieve an operator from any duties, remedies, or obligations imposed under any other law.
139130
143134
144135 SEC. 2. The provisions of this act are severable. If any provision of this act or its application is held invalid, that invalidity shall not affect other provisions or applications that can be given effect without the invalid provision or application.