Amended IN Senate April 21, 2025

Amended IN Senate March 28, 2025

Amended IN Senate March 24, 2025

CALIFORNIA LEGISLATURE 2025–2026 REGULAR SESSION

Senate Bill No. 243

Introduced by Senators Padilla and Becker
(Coauthor: Senator Weber Pierson)

January 30, 2025

An act to add Chapter 22.6 (commencing with Section 22601) to Division 8 of the Business and Professions Code, relating to artificial intelligence.
## LEGISLATIVE COUNSEL'S DIGEST

SB 243, as amended, Padilla. Companion chatbots.
Existing law requires a social media platform to take various steps to prevent cyberbullying of minors on the platform, including by requiring the platform to establish a prominent mechanism within its internet-based service that allows any individual, whether or not that individual has a profile on the internet-based service, to report cyberbullying or any content that violates the existing terms of service related to cyberbullying.

This bill would, among other things related to making a companion chatbot platform safer for users, require an operator of a companion chatbot platform to take reasonable steps to prevent a companion chatbot, as defined, on its companion chatbot platform from providing rewards to a user at unpredictable intervals or after an inconsistent number of actions or from encouraging increased engagement, usage, or response rates. The bill would also require an operator to prevent a companion chatbot on its companion chatbot platform from engaging with users unless the operator has implemented a protocol for addressing suicidal ideation, suicide, or self-harm expressed by a user to the companion chatbot, as specified, and publish details on that protocol on the operator's internet website.

This bill would require an operator to annually report to the State Department of Health Care Services certain things, including the number of times the operator has detected exhibitions of suicidal ideation by users. The bill would also require the operator to submit its companion chatbot platform to regular audits by a third party to ensure compliance with the bill. This bill would authorize a person who suffers injury in fact as a result of noncompliance with the bill to bring a certain civil action.

## Digest Key

Vote: MAJORITY &nbsp; Appropriation: NO &nbsp; Fiscal Committee: YES &nbsp; Local Program: NO
## Bill Text

The people of the State of California do enact as follows:

### SECTION 1.

SECTION 1. Chapter 22.6 (commencing with Section 22601) is added to Division 8 of the Business and Professions Code, to read:

CHAPTER 22.6. Companion Chatbots

22601. As used in this chapter:

(a) "Artificial intelligence" means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.

(b) (1) "Companion chatbot" means an artificial intelligence system with a natural language interface that provides adaptive, human-like responses to user inputs and is capable of meeting a user's social needs, including by exhibiting anthropomorphic features and being able to sustain a relationship across multiple interactions.

(2) "Companion chatbot" does not include a bot that is used only for customer service purposes.

(c) "Companion chatbot platform" means a platform that allows a user to engage with companion chatbots.

(d) "Operator" means a person who makes a companion chatbot platform available to a user in the state.

22602. (a) An operator shall take reasonable steps to prevent a companion chatbot on its companion chatbot platform from providing rewards to a user at unpredictable intervals or after an inconsistent number of actions or from encouraging increased engagement, usage, or response rates.

(b) An operator shall issue a clear and conspicuous notification at the beginning of any companion chatbot interaction, and at least every three hours during ongoing companion chatbot interactions thereafter, to remind a user that the companion chatbot is artificially generated and not human.

(c) (1) An operator shall prevent a companion chatbot on its companion chatbot platform from engaging with users unless the operator has implemented a protocol for addressing suicidal ideation, suicide, or self-harm expressed by a user to the companion chatbot, including, but not limited to, a notification to the user that refers the user to crisis service providers, including a suicide hotline or crisis text line.

(2) The operator shall publish details on the protocol required by this subdivision on the operator's internet website.

22603. (a) An operator shall annually report to the State Department of Health Care Services both of the following:

(1) The number of times the operator has detected exhibitions of suicidal ideation by users.

(2) The number of times a companion chatbot brought up suicidal ideation or actions with the user.

(b) The report required by this section shall include only the information listed in subdivision (a) and shall not include any identifiers or personal information about users.

22604. An operator shall submit its companion chatbot platform to regular audits by a third party to ensure compliance with this chapter.

22605. An operator shall disclose to a user of its companion chatbot platform that companion chatbots may not be suitable for some minors.

22606. A person who suffers injury in fact as a result of a violation of this chapter may bring a civil action to recover all of the following relief:

(a) Injunctive relief.

(b) Damages in an amount equal to the greater of actual damages or one thousand dollars ($1,000) per violation.

(c) Reasonable attorney's fees and costs.

22607. The duties, remedies, and obligations imposed by this chapter are cumulative to the duties, remedies, or obligations imposed under other law and shall not be construed to relieve an operator from any duties, remedies, or obligations imposed under any other law.

SEC. 2. The provisions of this act are severable. If any provision of this act or its application is held invalid, that invalidity shall not affect other provisions or applications that can be given effect without the invalid provision or application.
61 | 56 | ||
62 | 57 | CHAPTER 22.6. Companion Chatbots | |
63 | 58 | ||
64 | 59 | CHAPTER 22.6. Companion Chatbots | |
65 | 60 | ||
66 | - | ||
61 | + | 22601. As used in this chapter:(a) Artificial intelligence means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments.(b)(1)Chatbot means a virtual character that, enabled by artificial intelligence, is capable of engaging in open-ended dialogues with a user and seems to have a unique personality and perspectives.(b) (1) Companion chatbot means an artificial intelligence system with a natural language interface that provides adaptive, human-like responses to user inputs and is capable of meeting a users social needs, including by exhibiting anthropomorphic features and being able to sustain a relationship across multiple interactions.(2) Chatbot Companion chatbot does not include a bot that is used only for customer service purposes.(c) Chatbot Companion chatbot platform means a chatbot platform that allows a user to engage with companion chatbots.(d) Operator means a person who makes a companion chatbot platform available to a user in the state. | |
67 | 62 | ||
68 | - | ||
63 | + | ||
69 | 64 | ||
70 | 65 | 22601. As used in this chapter: | |
71 | 66 | ||
72 | - | ||
67 | + | (a) Artificial intelligence means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments. | |
73 | 68 | ||
74 | - | (a) Artificial intelligence means an engineered or machine-based system that varies in its level of autonomy and that can, for explicit or implicit objectives, infer from the input it receives how to generate outputs that can influence physical or virtual environments. | |
69 | + | (b)(1)Chatbot means a virtual character that, enabled by artificial intelligence, is capable of engaging in open-ended dialogues with a user and seems to have a unique personality and perspectives. | |
70 | + | ||
71 | + | ||
75 | 72 | ||
76 | 73 | (b) (1) Companion chatbot means an artificial intelligence system with a natural language interface that provides adaptive, human-like responses to user inputs and is capable of meeting a users social needs, including by exhibiting anthropomorphic features and being able to sustain a relationship across multiple interactions. | |
77 | 74 | ||
78 | - | (2) Companion chatbot does not include a bot that is used only for customer service purposes. | |
75 | + | (2) Chatbot Companion chatbot does not include a bot that is used only for customer service purposes. | |
79 | 76 | ||
80 | - | (c) Companion chatbot platform means a platform that allows a user to engage with companion chatbots. | |
77 | + | (c) Chatbot Companion chatbot platform means a chatbot platform that allows a user to engage with companion chatbots. | |
81 | 78 | ||
82 | - | (d) | |
79 | + | (d) Operator means a person who makes a companion chatbot platform available to a user in the state. | |
83 | 80 | ||
84 | - | ( | |
81 | + | 22602. (a) An operator shall take reasonable steps to prevent a companion chatbot on its companion chatbot platform from providing rewards to a user at unpredictable intervals or after an inconsistent number of actions or from encouraging increased engagement, usage, or response rates. (b) An operator shall issue a clear and conspicuous notification periodically at the beginning of any companion chatbot interaction, and at least every three hours during ongoing companion chatbot interactions thereafter, to remind a user that a the companion chatbot is artificially generated and not human.(c) (1) An operator shall prevent a companion chatbot on its companion chatbot platform from engaging with users unless the operator has implemented a protocol for addressing suicidal ideation, suicide, or self-harm expressed by a user to the companion chatbot, including, but not limited to, a notification to the user that refers the user to crisis service providers, including a suicide hotline or crisis text line.(2) The operator shall publish details on the protocol required by this subdivision on the operators internet website. | |
85 | 82 | ||
86 | - | (e) Operator means a person who makes a companion chatbot platform available to a user in the state. | |
87 | 83 | ||
88 | - | 22602. (a) An operator shall take reasonable steps to prevent a companion chatbot on its companion chatbot platform from providing rewards to a user at unpredictable intervals or after an inconsistent number of actions or from encouraging increased engagement, usage, or response rates. (b) An operator shall issue a clear and conspicuous notification at the beginning of any companion chatbot interaction, and at least every three hours during ongoing companion chatbot interactions thereafter, to remind a user that the companion chatbot is artificially generated and not human.(c) (1) An operator shall prevent a companion chatbot on its companion chatbot platform from engaging with users unless the operator has implemented a protocol for addressing suicidal ideation, suicide, or self-harm expressed by a user to the companion chatbot, including, but not limited to, a notification to the user that refers the user to crisis service providers, including a suicide hotline or crisis text line.(2) The operator shall publish details on the protocol required by this subdivision on the operators internet website. | |
89 | 84 | ||
90 | 85 | 22602. (a) An operator shall take reasonable steps to prevent a companion chatbot on its companion chatbot platform from providing rewards to a user at unpredictable intervals or after an inconsistent number of actions or from encouraging increased engagement, usage, or response rates. | |
91 | 86 | ||
92 | - | ###### 22602. | |
93 | - | ||
94 | - | (b) An operator shall issue a clear and conspicuous notification at the beginning of any companion chatbot interaction, and at least every three hours during ongoing companion chatbot interactions thereafter, to remind a user that the companion chatbot is artificially generated and not human. | |
87 | + | (b) An operator shall issue a clear and conspicuous notification periodically at the beginning of any companion chatbot interaction, and at least every three hours during ongoing companion chatbot interactions thereafter, to remind a user that a the companion chatbot is artificially generated and not human. | |
95 | 88 | ||
96 | 89 | (c) (1) An operator shall prevent a companion chatbot on its companion chatbot platform from engaging with users unless the operator has implemented a protocol for addressing suicidal ideation, suicide, or self-harm expressed by a user to the companion chatbot, including, but not limited to, a notification to the user that refers the user to crisis service providers, including a suicide hotline or crisis text line. | |
97 | 90 | ||
98 | 91 | (2) The operator shall publish details on the protocol required by this subdivision on the operators internet website. | |
99 | 92 | ||
22603. (a) An operator shall annually report to the State Department of Health Care Services both of the following:

(1) The number of times the operator has detected exhibitions of suicidal ideation by users.

(2) The number of times a companion chatbot brought up suicidal ideation or actions with the user.

(b) The report required by this section shall include only the information listed in subdivision (a) and shall not include any identifiers or personal information about users.

22604. An operator shall submit its companion chatbot platform to regular audits by a third party to ensure compliance with this chapter.

22605. An operator shall disclose to a user of its companion chatbot platform that companion chatbots may not be suitable for some minors.

22606. A person who suffers injury in fact as a result of a violation of this chapter may bring a civil action to recover all of the following relief:

(a) Injunctive relief.

(b) Damages in an amount equal to the greater of actual damages or one thousand dollars ($1,000) per violation.

(c) Reasonable attorney's fees and costs.

22607. The duties, remedies, and obligations imposed by this chapter are cumulative to the duties, remedies, or obligations imposed under other law and shall not be construed to relieve an operator from any duties, remedies, or obligations imposed under any other law.

SEC. 2. The provisions of this act are severable. If any provision of this act or its application is held invalid, that invalidity shall not affect other provisions or applications that can be given effect without the invalid provision or application.