California 2023-2024 Regular Session

California Senate Bill SB1235 Compare Versions

The text of the bill as amended in the Senate on April 15, 2024, is shown first, followed by the text of the bill as introduced on February 15, 2024.

Amended IN Senate April 15, 2024

CALIFORNIA LEGISLATURE 2023-2024 REGULAR SESSION

Senate Bill No. 1235

Introduced by Senator Gonzalez
February 15, 2024

An act to add Chapter 8 (commencing with Section 99500) to Part 65 of Division 14 of Title 3 of the Education Code, relating to public postsecondary education.

LEGISLATIVE COUNSEL'S DIGEST

SB 1235, as amended, Gonzalez. Public postsecondary education: Artificial Intelligence and Deepfake Working Group.

Existing law requires the Secretary of Government Operations, upon appropriation by the Legislature, to evaluate, among other things, the impact the proliferation of deepfakes, as defined, has on state government, California-based businesses, and residents of the state, and the risks, including privacy risks, associated with the deployment of digital content forgery technologies and deepfakes on state and local government, California-based businesses, and residents of the state. Existing law establishes the California Community Colleges, the California State University, and the University of California as the public segments of postsecondary education in the state.

This bill would require California State University, Long Beach, in consultation with other public institutions of higher education, to establish the Artificial Intelligence and Deepfake Working Group, and authorize California State University, Long Beach to develop a scoping plan in the first year to establish the topics that may be evaluated by, and the stakeholders that may be included in, the working group. The bill would authorize California State University, Long Beach to include, as part of the working group, at least one representative of 10 specified interests, the Secretary of the Government Operations Agency, the Executive Director of the California Privacy Protection Agency, and the Secretary of State, or their designees. The bill would require the working group, on or before January 1, 2026, and annually thereafter, to submit a report to the Legislature on the working group's research and findings related to the relevant issues and impacts of artificial intelligence and deepfakes evaluated by the working group, as specified.

Digest Key
Vote: MAJORITY  Appropriation: NO  Fiscal Committee: YES  Local Program: NO

Bill Text

The people of the State of California do enact as follows:

SECTION 1. Chapter 8 (commencing with Section 99500) is added to Part 65 of Division 14 of Title 3 of the Education Code, to read:

CHAPTER 8. Artificial Intelligence and Deepfake Working Group

99500. As used in this article, the following definitions apply:
(a) Artificial intelligence means a machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments. Artificial intelligence systems use machine- and human-based inputs to do all of the following:
(1) Perceive real and virtual environments.
(2) Abstract those perceptions into models through analysis in an automated manner.
(3) Use model inferences to formulate options for information or action.
(b) Deepfake means audio or visual content that has been generated or manipulated by artificial intelligence that would falsely appear to be authentic or truthful and that features depictions of people appearing to say or do things they did not say or do without their consent.
(c) Digital content forgery means the use of technologies, including artificial intelligence and machine learning techniques, to fabricate or manipulate audio, visual, or text content with the intent to mislead.
(d) Digital content provenance means the verifiable chronology of the original piece of digital content, such as an image, video, audio recording, or electronic document.

99501. (a) The California State University, Long Beach, in consultation with other public institutions of higher education, shall establish the Artificial Intelligence and Deepfake Working Group.
(b) The California State University, Long Beach may develop a scoping plan in the first year to establish the topics that may be evaluated by, and the stakeholders that may be included in, the working group. The scoping plan may address the following topics:
(1) The impact of the proliferation of artificial intelligence and deepfakes on state and local government, California-based businesses and the workforce, elementary, secondary, and postsecondary education, and residents of the state.
(2) The risks, including privacy risks, associated with artificial intelligence and the deployment of digital content forgery technologies and deepfakes on state and local government, California-based businesses, institutions of higher education, and residents of the state.
(3) The potential impact on the workforce and strategies to protect employees and to prevent potential job loss due to artificial intelligence proliferation.
(4) The impact of artificial intelligence, digital content forgery technologies, and deepfakes on civic engagement, including voting and elections.
(5) The legal implications and privacy impacts associated with the use of artificial intelligence, digital content forgery technologies, deepfakes, and technologies allowing public verification of digital content provenance.
(c) The California State University, Long Beach shall, if developing the scoping plan described in subdivision (b), solicit input from a broad range of stakeholders with a diverse range of interests affected by emerging artificial intelligence and deepfake technologies. The diverse range of interests shall include, but not be limited to, stakeholders representing privacy, business, consumer protection, courts, the legal community, academia, organized labor, the workforce, education, and state government.
(d) (1) The California State University, Long Beach may include, as part of the working group, at least one representative from all of the following:
(A) Those in the workforce impacted by potential job loss due to artificial intelligence adoption.
(B) Organized labor unions.
(C) Nontechnology-related industries.
(D) The legal community who can advise on the legal implications of artificial intelligence.
(E) Privacy rights organizations.
(F) Consumer protection organizations.
(G) The technology industry, with a technical focus that includes digital content, media manipulation, or related subjects.
(H) Elementary, secondary, and postsecondary education, including staff, teachers, faculty, and students.
(I) Experts in elections and election safety.
(J) Ethicists or specialists in ethics.
(2) The California State University, Long Beach may also include, as part of the working group, all of the following:
(A) The Secretary of Government Operations or their designee.
(B) The Executive Director of the California Privacy Protection Agency or their designee.
(C) The Secretary of State or their designee.
(e) (1) Notwithstanding Section 10231.5 of the Government Code, on or before January 1, 2026, and annually thereafter, the working group shall submit a report to the Legislature on the working group's findings, including, but not limited to, research and findings related to the issues and impacts identified pursuant to subdivision (b). The working group shall make the report publicly available.
(2) A report submitted pursuant to paragraph (1) shall be submitted in compliance with Section 9795 of the Government Code.
(f) The California State University, Long Beach may consult with the Government Operations Agency, the California Privacy Protection Agency, or any state or local agency on the establishment of the working group.
(g) Meetings of the working group shall be subject to the Bagley-Keene Open Meeting Act (Article 9 (commencing with Section 11120) of Chapter 1 of Part 1 of Division 3 of Title 2 of the Government Code).

As introduced, February 15, 2024

CALIFORNIA LEGISLATURE 2023-2024 REGULAR SESSION

Senate Bill No. 1235

Introduced by Senator Gonzalez
February 15, 2024

An act to add Chapter 8 (commencing with Section 99500) to Part 65 of Division 14 of Title 3 of the Education Code, relating to public postsecondary education.

LEGISLATIVE COUNSEL'S DIGEST

SB 1235, as introduced, Gonzalez. Public postsecondary education: Artificial Intelligence and Deepfake Working Group.

Existing law requires the Secretary of Government Operations, upon appropriation by the Legislature, to evaluate, among other things, the impact the proliferation of deepfakes, as defined, has on state government, California-based businesses, and residents of the state, and the risks, including privacy risks, associated with the deployment of digital content forgery technologies and deepfakes on state and local government, California-based businesses, and residents of the state. Existing law establishes the California Community Colleges, the California State University, and the University of California as the public segments of postsecondary education in the state.

This bill would require an unspecified public institution of higher education to establish the Artificial Intelligence and Deepfake Working Group to evaluate and advise the Legislature and the public on the relevant issues and impacts of artificial intelligence and deepfakes, as provided. The bill would require the working group to consist of at least one representative of 9 specified interests, the Secretary of the Government Operations Agency, the Executive Director of the California Privacy Protection Agency, and the Secretary of State, or their designees. The bill would require the working group, on or before January 1, 2026, and annually thereafter, to submit a report to the Legislature on the working group's research and findings related to the relevant issues and impacts of artificial intelligence and deepfakes evaluated by the working group, as specified.

Digest Key
Vote: MAJORITY  Appropriation: NO  Fiscal Committee: YES  Local Program: NO

Bill Text

The people of the State of California do enact as follows:

SECTION 1. Chapter 8 (commencing with Section 99500) is added to Part 65 of Division 14 of Title 3 of the Education Code, to read:

CHAPTER 8. Artificial Intelligence and Deepfake Working Group

99500. As used in this article, the following definitions apply:
(a) Artificial intelligence means a machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments. Artificial intelligence systems use machine- and human-based inputs to do all of the following:
(1) Perceive real and virtual environments.
(2) Abstract those perceptions into models through analysis in an automated manner.
(3) Use model inferences to formulate options for information or action.
(b) Deepfake means audio or visual content that has been generated or manipulated by artificial intelligence that would falsely appear to be authentic or truthful and that features depictions of people appearing to say or do things they did not say or do without their consent.
(c) Digital content forgery means the use of technologies, including artificial intelligence and machine learning techniques, to fabricate or manipulate audio, visual, or text content with the intent to mislead.
(d) Digital content provenance means the verifiable chronology of the original piece of digital content, such as an image, video, audio recording, or electronic document.

99501. (a) A public institution of higher education shall establish the Artificial Intelligence and Deepfake Working Group.
(b) The working group shall evaluate and advise the Legislature and the public on the relevant issues and impacts of artificial intelligence and deepfakes, including, but not limited to, all of the following:
(1) The impact of the proliferation of artificial intelligence and deepfakes on state and local government, California-based businesses and the workforce, elementary, secondary, and postsecondary education, and residents of the state.
(2) The risks, including privacy risks, associated with artificial intelligence and the deployment of digital content forgery technologies and deepfakes on state and local government, California-based businesses, institutions of higher education, and residents of the state.
(3) The potential impact on the workforce and strategies to protect employees and to prevent potential job loss due to artificial intelligence proliferation.
(4) The impact of artificial intelligence, digital content forgery technologies, and deepfakes on civic engagement, including voting and elections.
(5) The legal implications and privacy impacts associated with the use of artificial intelligence, digital content forgery technologies, deepfakes, and technologies allowing public verification of digital content provenance.
(c) The working group shall solicit input from a broad range of stakeholders with a diverse range of interests affected by emerging artificial intelligence and deepfake technologies. The diverse range of interests shall include, but not be limited to, stakeholders representing privacy, business, consumer protection, courts, the legal community, academia, organized labor, the workforce, education, and state government.
(d) (1) The working group shall, at minimum, consist of at least one representative from all of the following:
(A) Those in the workforce impacted by potential job loss due to artificial intelligence adoption.
(B) Organized labor unions.
(C) Nontechnology-related industries.
(D) The legal community who can advise on the legal implications of artificial intelligence.
(E) Privacy rights organizations.
(F) Consumer protection organizations.
(G) The technology industry, with a technical focus that includes digital content, media manipulation, or related subjects.
(H) Elementary, secondary, and postsecondary education, including staff, teachers, faculty, and students.
(I) Experts in elections and election safety.
(2) The working group shall also include all of the following:
(A) The Secretary of Government Operations or their designee.
(B) The Executive Director of the California Privacy Protection Agency or their designee.
(C) The Secretary of State or their designee.
(e) (1) Notwithstanding Section 10231.5 of the Government Code, on or before January 1, 2026, and annually thereafter, the working group shall submit a report to the Legislature on the working group's research and findings related to the issues and impacts evaluated pursuant to subdivision (b). The working group shall make the report publicly available.
(2) A report submitted pursuant to paragraph (1) shall be submitted in compliance with Section 9795 of the Government Code.
(f) The public institution of higher education that establishes the working group, as provided in subdivision (a), shall consult with the Government Operations Agency and the California Privacy Protection Agency on the establishment of the working group.
(g) Meetings of the working group shall be subject to the Bagley-Keene Open Meeting Act (Article 9 (commencing with Section 11120) of Chapter 1 of Part 1 of Division 3 of Title 2 of the Government Code).
22
3- Amended IN Senate April 15, 2024 CALIFORNIA LEGISLATURE 20232024 REGULAR SESSION Senate Bill No. 1235Introduced by Senator GonzalezFebruary 15, 2024 An act to add Chapter 8 (commencing with Section 99500) to Part 65 of Division 14 of Title 3 of the Education Code, relating to public postsecondary education. LEGISLATIVE COUNSEL'S DIGESTSB 1235, as amended, Gonzalez. Public postsecondary education: Artificial Intelligence and Deepfake Working Group.Existing law requires the Secretary of Government Operations, upon appropriation by the Legislature, to evaluate, among other things, the impact the proliferation of deepfakes, as defined, has on state government, California-based businesses, and residents of the state, and the risks, including privacy risks, associated with the deployment of digital content forgery technologies and deepfakes on state and local government, California-based businesses, and residents of the state. Existing law establishes the California Community Colleges, the California State University, and the University of California as the public segments of postsecondary education in the state. This bill would require an unspecified California State University, Long Beach, in consultation with other public institution institutions of higher education education, to establish the Artificial Intelligence and Deepfake Working Group to evaluate and advise the Legislature and the public on the relevant issues and impacts of artificial intelligence and deepfakes, as provided. Group, and authorize California State University, Long Beach to develop a scoping plan in the first year to establish the topics that may be evaluated by, and the stakeholders that may be included in, the working group. The bill would require authorize California State University, Long Beach to include, as part of the working group to consist of group, at least one representative of 9 10 specified interests, the Secretary of the Government Operations Agency, the Executive Director of the California Privacy Protection Agency, and the Secretary of State, or their designees. The bill would require the working group, on or before January 1, 2026, and annually thereafter, to submit a report to the Legislature on the working groups research and findings related to the relevant issues and impacts of artificial intelligence and deepfakes evaluated by the working group, as specified.Digest Key Vote: MAJORITY Appropriation: NO Fiscal Committee: YES Local Program: NO
3+ CALIFORNIA LEGISLATURE 20232024 REGULAR SESSION Senate Bill No. 1235Introduced by Senator GonzalezFebruary 15, 2024 An act to add Chapter 8 (commencing with Section 99500) to Part 65 of Division 14 of Title 3 of the Education Code, relating to public postsecondary education. LEGISLATIVE COUNSEL'S DIGESTSB 1235, as introduced, Gonzalez. Public postsecondary education: Artificial Intelligence and Deepfake Working Group.Existing law requires the Secretary of Government Operations, upon appropriation by the Legislature, to evaluate, among other things, the impact the proliferation of deepfakes, as defined, has on state government, California-based businesses, and residents of the state, and the risks, including privacy risks, associated with the deployment of digital content forgery technologies and deepfakes on state and local government, California-based businesses, and residents of the state. Existing law establishes the California Community Colleges, the California State University, and the University of California as the public segments of postsecondary education in the state. This bill would require an unspecified public institution of higher education to establish the Artificial Intelligence and Deepfake Working Group to evaluate and advise the Legislature and the public on the relevant issues and impacts of artificial intelligence and deepfakes, as provided. The bill would require the working group to consist of at least one representative of 9 specified interests, the Secretary of the Government Operations Agency, the Executive Director of the California Privacy Protection Agency, and the Secretary of State, or their designees. The bill would require the working group, on or before January 1, 2026, and annually thereafter, to submit a report to the Legislature on the working groups research and findings related to the relevant issues and impacts of artificial intelligence and deepfakes evaluated by the working group, as specified.Digest Key Vote: MAJORITY Appropriation: NO Fiscal Committee: YES Local Program: NO
44
5- Amended IN Senate April 15, 2024
65
7-Amended IN Senate April 15, 2024
6+
7+
88
99 CALIFORNIA LEGISLATURE 20232024 REGULAR SESSION
1010
1111 Senate Bill
1212
1313 No. 1235
1414
1515 Introduced by Senator GonzalezFebruary 15, 2024
1616
1717 Introduced by Senator Gonzalez
1818 February 15, 2024
1919
2020 An act to add Chapter 8 (commencing with Section 99500) to Part 65 of Division 14 of Title 3 of the Education Code, relating to public postsecondary education.
2121
2222 LEGISLATIVE COUNSEL'S DIGEST
2323
2424 ## LEGISLATIVE COUNSEL'S DIGEST
2525
26-SB 1235, as amended, Gonzalez. Public postsecondary education: Artificial Intelligence and Deepfake Working Group.
26+SB 1235, as introduced, Gonzalez. Public postsecondary education: Artificial Intelligence and Deepfake Working Group.
2727
28-Existing law requires the Secretary of Government Operations, upon appropriation by the Legislature, to evaluate, among other things, the impact the proliferation of deepfakes, as defined, has on state government, California-based businesses, and residents of the state, and the risks, including privacy risks, associated with the deployment of digital content forgery technologies and deepfakes on state and local government, California-based businesses, and residents of the state. Existing law establishes the California Community Colleges, the California State University, and the University of California as the public segments of postsecondary education in the state. This bill would require an unspecified California State University, Long Beach, in consultation with other public institution institutions of higher education education, to establish the Artificial Intelligence and Deepfake Working Group to evaluate and advise the Legislature and the public on the relevant issues and impacts of artificial intelligence and deepfakes, as provided. Group, and authorize California State University, Long Beach to develop a scoping plan in the first year to establish the topics that may be evaluated by, and the stakeholders that may be included in, the working group. The bill would require authorize California State University, Long Beach to include, as part of the working group to consist of group, at least one representative of 9 10 specified interests, the Secretary of the Government Operations Agency, the Executive Director of the California Privacy Protection Agency, and the Secretary of State, or their designees. The bill would require the working group, on or before January 1, 2026, and annually thereafter, to submit a report to the Legislature on the working groups research and findings related to the relevant issues and impacts of artificial intelligence and deepfakes evaluated by the working group, as specified.
28+Existing law requires the Secretary of Government Operations, upon appropriation by the Legislature, to evaluate, among other things, the impact the proliferation of deepfakes, as defined, has on state government, California-based businesses, and residents of the state, and the risks, including privacy risks, associated with the deployment of digital content forgery technologies and deepfakes on state and local government, California-based businesses, and residents of the state. Existing law establishes the California Community Colleges, the California State University, and the University of California as the public segments of postsecondary education in the state. This bill would require an unspecified public institution of higher education to establish the Artificial Intelligence and Deepfake Working Group to evaluate and advise the Legislature and the public on the relevant issues and impacts of artificial intelligence and deepfakes, as provided. The bill would require the working group to consist of at least one representative of 9 specified interests, the Secretary of the Government Operations Agency, the Executive Director of the California Privacy Protection Agency, and the Secretary of State, or their designees. The bill would require the working group, on or before January 1, 2026, and annually thereafter, to submit a report to the Legislature on the working groups research and findings related to the relevant issues and impacts of artificial intelligence and deepfakes evaluated by the working group, as specified.
2929
3030 Existing law requires the Secretary of Government Operations, upon appropriation by the Legislature, to evaluate, among other things, the impact the proliferation of deepfakes, as defined, has on state government, California-based businesses, and residents of the state, and the risks, including privacy risks, associated with the deployment of digital content forgery technologies and deepfakes on state and local government, California-based businesses, and residents of the state.
3131
3232 Existing law establishes the California Community Colleges, the California State University, and the University of California as the public segments of postsecondary education in the state.
3333
34-This bill would require an unspecified California State University, Long Beach, in consultation with other public institution institutions of higher education education, to establish the Artificial Intelligence and Deepfake Working Group to evaluate and advise the Legislature and the public on the relevant issues and impacts of artificial intelligence and deepfakes, as provided. Group, and authorize California State University, Long Beach to develop a scoping plan in the first year to establish the topics that may be evaluated by, and the stakeholders that may be included in, the working group. The bill would require authorize California State University, Long Beach to include, as part of the working group to consist of group, at least one representative of 9 10 specified interests, the Secretary of the Government Operations Agency, the Executive Director of the California Privacy Protection Agency, and the Secretary of State, or their designees. The bill would require the working group, on or before January 1, 2026, and annually thereafter, to submit a report to the Legislature on the working groups research and findings related to the relevant issues and impacts of artificial intelligence and deepfakes evaluated by the working group, as specified.
34+This bill would require an unspecified public institution of higher education to establish the Artificial Intelligence and Deepfake Working Group to evaluate and advise the Legislature and the public on the relevant issues and impacts of artificial intelligence and deepfakes, as provided. The bill would require the working group to consist of at least one representative of 9 specified interests, the Secretary of the Government Operations Agency, the Executive Director of the California Privacy Protection Agency, and the Secretary of State, or their designees. The bill would require the working group, on or before January 1, 2026, and annually thereafter, to submit a report to the Legislature on the working groups research and findings related to the relevant issues and impacts of artificial intelligence and deepfakes evaluated by the working group, as specified.
3535
3636 ## Digest Key
3737
3838 ## Bill Text
3939
40-The people of the State of California do enact as follows:SECTION 1. Chapter 8 (commencing with Section 99500) is added to Part 65 of Division 14 of Title 3 of the Education Code, to read: CHAPTER 8. Artificial Intelligence and Deepfake Working Group99500. As used in this article, the following definitions apply:(a) Artificial intelligence means a machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments. Artificial intelligence systems use machine- and human-based inputs to do all of the following: (1) Perceive real and virtual environments. (2) Abstract those perceptions into models through analysis in an automated manner.(3) Use model inferences to formulate options for information or action. (b) Deepfake means audio or visual content that has been generated or manipulated by artificial intelligence that would falsely appear to be authentic or truthful and that features depictions of people appearing to say or do things they did not say or do without their consent. (c) Digital content forgery means the use of technologies, including artificial intelligence and machine learning techniques, to fabricate or manipulate audio, visual, or text content with the intent to mislead. (d) Digital content provenance means the verifiable chronology of the original piece of digital content, such as an image, video, audio recording, or electronic document. 99501. (a) A The California State University, Long Beach, in consultation with other public institution institutions of higher education education, shall establish the Artificial Intelligence and Deepfake Working Group. (b)The working group shall evaluate and advise the Legislature and the public on the relevant issues and impacts of artificial intelligence and deepfakes, including, but not limited to, all of the following: (b) The California State University, Long Beach may develop a scoping plan in the first year to establish the topics that may be evaluated by, and the stakeholders that may be included in, the working group. The scoping plan may address the following topics:(1) The impact of the proliferation of artificial intelligence and deepfakes on state and local government, California-based businesses and the workforce, elementary, secondary, and postsecondary education, and residents of the state. (2) The risks, including privacy risks, associated with artificial intelligence and the deployment of digital content forgery technologies and deepfakes on state and local government, California-based businesses, institutions of higher education, and residents of the state. (3) The potential impact on the workforce and strategies to protect employees and to prevent potential job loss due to artificial intelligence proliferation. (4) The impact of artificial intelligence, digital content forgery technologies, and deepfakes on civic engagement, including voting and elections. (5) The legal implications and privacy impacts associated with the use of artificial intelligence, digital content forgery technologies, deepfakes, and technologies allowing public verification of digital content provenance. (c) The working group shall California State University, Long Beach shall, if developing the scoping plan described in subdivision (b), solicit input from a broad range of stakeholders with a diverse range of interests affected by emerging artificial intelligence and deepfake technologies. 
The diverse range of interests shall include, but not be limited to, stakeholders representing privacy, business, consumer protection, courts, the legal community, academia, organized labor, the workforce, education, and state government.(d) (1) The California State University, Long Beach may include, as part of the working group shall, at minimum, consist of group, at least one representative from all of the following: (A) Those in the workforce impacted by potential job loss due to artificial intelligence adoption. (B) Organized labor unions. (C) Nontechnology-related industries. (D) The legal community who can advise on the legal implications of artificial intelligence. (E) Privacy rights organizations. (F) Consumer protection organizations. (G) The technology industry, with a technical focus that includes digital content, media manipulation, or related subjects. (H) Elementary, secondary, and postsecondary education, including staff, teachers, faculty, and students. (I) Experts in elections and election safety. (J) Ethicists or specialists in ethics.(2) The California State University, Long Beach may also include, as part of the working group shall also include group, all of the following: (A) The Secretary of Government Operations or their designee.(B) The Executive Director of the California Privacy Protection Agency or their designee. (C) The Secretary of State or their designee. (e) (1) Notwithstanding Section 10231.5 of the Government Code, on or before January 1, 2026, and annually thereafter, the working group shall submit a report to the Legislature on the working groups research and findings related to findings, including, but not limited to, research and findings related to the issues and impacts evaluated identified pursuant to subdivision (b). The working group shall make the report publically available. (2) A report submitted pursuant to paragraph (1) shall be submitted in compliance with Section 9795 of the Government Code. (f) The public institution of higher education that establishes the working group, as provided in subdivision (a), shall California State University, Long Beach may consult with the Government Operations Agency and Agency, the California Privacy Protection Agency Agency, or any state or local agency on the establishment of the working group. (g) Meetings of the working group shall be subject to the Bagley-Keene Open Meeting Act (Article 9 (commencing with Section 11120) of Chapter 1 of Part 1 of Division 3 of Title 2 of the Government Code).
40+The people of the State of California do enact as follows:SECTION 1. Chapter 8 (commencing with Section 99500) is added to Part 65 of Division 14 of Title 3 of the Education Code, to read: CHAPTER 8. Artificial Intelligence and Deepfake Working Group99500. As used in this article, the following definitions apply:(a) Artificial intelligence means a machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments. Artificial intelligence systems use machine- and human-based inputs to do all of the following: (1) Perceive real and virtual environments. (2) Abstract those perceptions into models through analysis in an automated manner.(3) Use model inferences to formulate options for information or action. (b) Deepfake means audio or visual content that has been generated or manipulated by artificial intelligence that would falsely appear to be authentic or truthful and that features depictions of people appearing to say or do things they did not say or do without their consent. (c) Digital content forgery means the use of technologies, including artificial intelligence and machine learning techniques, to fabricate or manipulate audio, visual, or text content with the intent to mislead. (d) Digital content provenance means the verifiable chronology of the original piece of digital content, such as an image, video, audio recording, or electronic document. 99501. (a) A public institution of higher education shall establish the Artificial Intelligence and Deepfake Working Group. (b) The working group shall evaluate and advise the Legislature and the public on the relevant issues and impacts of artificial intelligence and deepfakes, including, but not limited to, all of the following: (1) The impact of the proliferation of artificial intelligence and deepfakes on state and local government, California-based businesses and the workforce, elementary, secondary, and postsecondary education, and residents of the state. (2) The risks, including privacy risks, associated with artificial intelligence and the deployment of digital content forgery technologies and deepfakes on state and local government, California-based businesses, institutions of higher education, and residents of the state. (3) The potential impact on the workforce and strategies to protect employees and to prevent potential job loss due to artificial intelligence proliferation. (4) The impact of artificial intelligence, digital content forgery technologies, and deepfakes on civic engagement, including voting and elections. (5) The legal implications and privacy impacts associated with the use of artificial intelligence, digital content forgery technologies, deepfakes, and technologies allowing public verification of digital content provenance. (c) The working group shall solicit input from a broad range of stakeholders with a diverse range of interests affected by emerging artificial intelligence and deepfake technologies. The diverse range of interests shall include, but not be limited to, stakeholders representing privacy, business, consumer protection, courts, the legal community, academia, organized labor, the workforce, education, and state government. (d) (1) The working group shall, at minimum, consist of at least one representative from all of the following: (A) Those in the workforce impacted by potential job loss due to artificial intelligence adoption. (B) Organized labor unions. (C) Nontechnology-related industries. 
(D) The legal community who can advise on the legal implications of artificial intelligence. (E) Privacy rights organizations. (F) Consumer protection organizations. (G) The technology industry, with a technical focus that includes digital content, media manipulation, or related subjects. (H) Elementary, secondary, and postsecondary education, including staff, teachers, faculty, and students. (I) Experts in elections and election safety. (2) The working group shall also include all of the following: (A) The Secretary of Government Operations or their designee.(B) The Executive Director of the California Privacy Protection Agency or their designee. (C) The Secretary of State or their designee. (e) (1) Notwithstanding Section 10231.5 of the Government Code, on or before January 1, 2026, and annually thereafter, the working group shall submit a report to the Legislature on the working groups research and findings related to the issues and impacts evaluated pursuant to subdivision (b). The working group shall make the report publically available. (2) A report submitted pursuant to paragraph (1) shall be submitted in compliance with Section 9795 of the Government Code. (f) The public institution of higher education that establishes the working group, as provided in subdivision (a), shall consult with the Government Operations Agency and the California Privacy Protection Agency on the establishment of the working group. (g) Meetings of the working group shall be subject to the Bagley-Keene Open Meeting Act (Article 9 (commencing with Section 11120) of Chapter 1 of Part 1 of Division 3 of Title 2 of the Government Code).
4141
4242 The people of the State of California do enact as follows:
4343
4444 ## The people of the State of California do enact as follows:
4545
46-SECTION 1. Chapter 8 (commencing with Section 99500) is added to Part 65 of Division 14 of Title 3 of the Education Code, to read: CHAPTER 8. Artificial Intelligence and Deepfake Working Group99500. As used in this article, the following definitions apply:(a) Artificial intelligence means a machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments. Artificial intelligence systems use machine- and human-based inputs to do all of the following: (1) Perceive real and virtual environments. (2) Abstract those perceptions into models through analysis in an automated manner.(3) Use model inferences to formulate options for information or action. (b) Deepfake means audio or visual content that has been generated or manipulated by artificial intelligence that would falsely appear to be authentic or truthful and that features depictions of people appearing to say or do things they did not say or do without their consent. (c) Digital content forgery means the use of technologies, including artificial intelligence and machine learning techniques, to fabricate or manipulate audio, visual, or text content with the intent to mislead. (d) Digital content provenance means the verifiable chronology of the original piece of digital content, such as an image, video, audio recording, or electronic document. 99501. (a) A The California State University, Long Beach, in consultation with other public institution institutions of higher education education, shall establish the Artificial Intelligence and Deepfake Working Group. (b)The working group shall evaluate and advise the Legislature and the public on the relevant issues and impacts of artificial intelligence and deepfakes, including, but not limited to, all of the following: (b) The California State University, Long Beach may develop a scoping plan in the first year to establish the topics that may be evaluated by, and the stakeholders that may be included in, the working group. The scoping plan may address the following topics:(1) The impact of the proliferation of artificial intelligence and deepfakes on state and local government, California-based businesses and the workforce, elementary, secondary, and postsecondary education, and residents of the state. (2) The risks, including privacy risks, associated with artificial intelligence and the deployment of digital content forgery technologies and deepfakes on state and local government, California-based businesses, institutions of higher education, and residents of the state. (3) The potential impact on the workforce and strategies to protect employees and to prevent potential job loss due to artificial intelligence proliferation. (4) The impact of artificial intelligence, digital content forgery technologies, and deepfakes on civic engagement, including voting and elections. (5) The legal implications and privacy impacts associated with the use of artificial intelligence, digital content forgery technologies, deepfakes, and technologies allowing public verification of digital content provenance. (c) The working group shall California State University, Long Beach shall, if developing the scoping plan described in subdivision (b), solicit input from a broad range of stakeholders with a diverse range of interests affected by emerging artificial intelligence and deepfake technologies. 
The diverse range of interests shall include, but not be limited to, stakeholders representing privacy, business, consumer protection, courts, the legal community, academia, organized labor, the workforce, education, and state government.(d) (1) The California State University, Long Beach may include, as part of the working group shall, at minimum, consist of group, at least one representative from all of the following: (A) Those in the workforce impacted by potential job loss due to artificial intelligence adoption. (B) Organized labor unions. (C) Nontechnology-related industries. (D) The legal community who can advise on the legal implications of artificial intelligence. (E) Privacy rights organizations. (F) Consumer protection organizations. (G) The technology industry, with a technical focus that includes digital content, media manipulation, or related subjects. (H) Elementary, secondary, and postsecondary education, including staff, teachers, faculty, and students. (I) Experts in elections and election safety. (J) Ethicists or specialists in ethics.(2) The California State University, Long Beach may also include, as part of the working group shall also include group, all of the following: (A) The Secretary of Government Operations or their designee.(B) The Executive Director of the California Privacy Protection Agency or their designee. (C) The Secretary of State or their designee. (e) (1) Notwithstanding Section 10231.5 of the Government Code, on or before January 1, 2026, and annually thereafter, the working group shall submit a report to the Legislature on the working groups research and findings related to findings, including, but not limited to, research and findings related to the issues and impacts evaluated identified pursuant to subdivision (b). The working group shall make the report publically available. (2) A report submitted pursuant to paragraph (1) shall be submitted in compliance with Section 9795 of the Government Code. (f) The public institution of higher education that establishes the working group, as provided in subdivision (a), shall California State University, Long Beach may consult with the Government Operations Agency and Agency, the California Privacy Protection Agency Agency, or any state or local agency on the establishment of the working group. (g) Meetings of the working group shall be subject to the Bagley-Keene Open Meeting Act (Article 9 (commencing with Section 11120) of Chapter 1 of Part 1 of Division 3 of Title 2 of the Government Code).
46+SECTION 1. Chapter 8 (commencing with Section 99500) is added to Part 65 of Division 14 of Title 3 of the Education Code, to read: CHAPTER 8. Artificial Intelligence and Deepfake Working Group99500. As used in this article, the following definitions apply:(a) Artificial intelligence means a machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments. Artificial intelligence systems use machine- and human-based inputs to do all of the following: (1) Perceive real and virtual environments. (2) Abstract those perceptions into models through analysis in an automated manner.(3) Use model inferences to formulate options for information or action. (b) Deepfake means audio or visual content that has been generated or manipulated by artificial intelligence that would falsely appear to be authentic or truthful and that features depictions of people appearing to say or do things they did not say or do without their consent. (c) Digital content forgery means the use of technologies, including artificial intelligence and machine learning techniques, to fabricate or manipulate audio, visual, or text content with the intent to mislead. (d) Digital content provenance means the verifiable chronology of the original piece of digital content, such as an image, video, audio recording, or electronic document. 99501. (a) A public institution of higher education shall establish the Artificial Intelligence and Deepfake Working Group. (b) The working group shall evaluate and advise the Legislature and the public on the relevant issues and impacts of artificial intelligence and deepfakes, including, but not limited to, all of the following: (1) The impact of the proliferation of artificial intelligence and deepfakes on state and local government, California-based businesses and the workforce, elementary, secondary, and postsecondary education, and residents of the state. (2) The risks, including privacy risks, associated with artificial intelligence and the deployment of digital content forgery technologies and deepfakes on state and local government, California-based businesses, institutions of higher education, and residents of the state. (3) The potential impact on the workforce and strategies to protect employees and to prevent potential job loss due to artificial intelligence proliferation. (4) The impact of artificial intelligence, digital content forgery technologies, and deepfakes on civic engagement, including voting and elections. (5) The legal implications and privacy impacts associated with the use of artificial intelligence, digital content forgery technologies, deepfakes, and technologies allowing public verification of digital content provenance. (c) The working group shall solicit input from a broad range of stakeholders with a diverse range of interests affected by emerging artificial intelligence and deepfake technologies. The diverse range of interests shall include, but not be limited to, stakeholders representing privacy, business, consumer protection, courts, the legal community, academia, organized labor, the workforce, education, and state government. (d) (1) The working group shall, at minimum, consist of at least one representative from all of the following: (A) Those in the workforce impacted by potential job loss due to artificial intelligence adoption. (B) Organized labor unions. (C) Nontechnology-related industries. 
(D) The legal community who can advise on the legal implications of artificial intelligence. (E) Privacy rights organizations. (F) Consumer protection organizations. (G) The technology industry, with a technical focus that includes digital content, media manipulation, or related subjects. (H) Elementary, secondary, and postsecondary education, including staff, teachers, faculty, and students. (I) Experts in elections and election safety. (2) The working group shall also include all of the following: (A) The Secretary of Government Operations or their designee.(B) The Executive Director of the California Privacy Protection Agency or their designee. (C) The Secretary of State or their designee. (e) (1) Notwithstanding Section 10231.5 of the Government Code, on or before January 1, 2026, and annually thereafter, the working group shall submit a report to the Legislature on the working groups research and findings related to the issues and impacts evaluated pursuant to subdivision (b). The working group shall make the report publically available. (2) A report submitted pursuant to paragraph (1) shall be submitted in compliance with Section 9795 of the Government Code. (f) The public institution of higher education that establishes the working group, as provided in subdivision (a), shall consult with the Government Operations Agency and the California Privacy Protection Agency on the establishment of the working group. (g) Meetings of the working group shall be subject to the Bagley-Keene Open Meeting Act (Article 9 (commencing with Section 11120) of Chapter 1 of Part 1 of Division 3 of Title 2 of the Government Code).
4747
4848 SECTION 1. Chapter 8 (commencing with Section 99500) is added to Part 65 of Division 14 of Title 3 of the Education Code, to read:
4949
5050 ### SECTION 1.
5151
52- CHAPTER 8. Artificial Intelligence and Deepfake Working Group99500. As used in this article, the following definitions apply:(a) Artificial intelligence means a machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments. Artificial intelligence systems use machine- and human-based inputs to do all of the following: (1) Perceive real and virtual environments. (2) Abstract those perceptions into models through analysis in an automated manner.(3) Use model inferences to formulate options for information or action. (b) Deepfake means audio or visual content that has been generated or manipulated by artificial intelligence that would falsely appear to be authentic or truthful and that features depictions of people appearing to say or do things they did not say or do without their consent. (c) Digital content forgery means the use of technologies, including artificial intelligence and machine learning techniques, to fabricate or manipulate audio, visual, or text content with the intent to mislead. (d) Digital content provenance means the verifiable chronology of the original piece of digital content, such as an image, video, audio recording, or electronic document. 99501. (a) A The California State University, Long Beach, in consultation with other public institution institutions of higher education education, shall establish the Artificial Intelligence and Deepfake Working Group. (b)The working group shall evaluate and advise the Legislature and the public on the relevant issues and impacts of artificial intelligence and deepfakes, including, but not limited to, all of the following: (b) The California State University, Long Beach may develop a scoping plan in the first year to establish the topics that may be evaluated by, and the stakeholders that may be included in, the working group. The scoping plan may address the following topics:(1) The impact of the proliferation of artificial intelligence and deepfakes on state and local government, California-based businesses and the workforce, elementary, secondary, and postsecondary education, and residents of the state. (2) The risks, including privacy risks, associated with artificial intelligence and the deployment of digital content forgery technologies and deepfakes on state and local government, California-based businesses, institutions of higher education, and residents of the state. (3) The potential impact on the workforce and strategies to protect employees and to prevent potential job loss due to artificial intelligence proliferation. (4) The impact of artificial intelligence, digital content forgery technologies, and deepfakes on civic engagement, including voting and elections. (5) The legal implications and privacy impacts associated with the use of artificial intelligence, digital content forgery technologies, deepfakes, and technologies allowing public verification of digital content provenance. (c) The working group shall California State University, Long Beach shall, if developing the scoping plan described in subdivision (b), solicit input from a broad range of stakeholders with a diverse range of interests affected by emerging artificial intelligence and deepfake technologies. 
The diverse range of interests shall include, but not be limited to, stakeholders representing privacy, business, consumer protection, courts, the legal community, academia, organized labor, the workforce, education, and state government.(d) (1) The California State University, Long Beach may include, as part of the working group shall, at minimum, consist of group, at least one representative from all of the following: (A) Those in the workforce impacted by potential job loss due to artificial intelligence adoption. (B) Organized labor unions. (C) Nontechnology-related industries. (D) The legal community who can advise on the legal implications of artificial intelligence. (E) Privacy rights organizations. (F) Consumer protection organizations. (G) The technology industry, with a technical focus that includes digital content, media manipulation, or related subjects. (H) Elementary, secondary, and postsecondary education, including staff, teachers, faculty, and students. (I) Experts in elections and election safety. (J) Ethicists or specialists in ethics.(2) The California State University, Long Beach may also include, as part of the working group shall also include group, all of the following: (A) The Secretary of Government Operations or their designee.(B) The Executive Director of the California Privacy Protection Agency or their designee. (C) The Secretary of State or their designee. (e) (1) Notwithstanding Section 10231.5 of the Government Code, on or before January 1, 2026, and annually thereafter, the working group shall submit a report to the Legislature on the working groups research and findings related to findings, including, but not limited to, research and findings related to the issues and impacts evaluated identified pursuant to subdivision (b). The working group shall make the report publically available. (2) A report submitted pursuant to paragraph (1) shall be submitted in compliance with Section 9795 of the Government Code. (f) The public institution of higher education that establishes the working group, as provided in subdivision (a), shall California State University, Long Beach may consult with the Government Operations Agency and Agency, the California Privacy Protection Agency Agency, or any state or local agency on the establishment of the working group. (g) Meetings of the working group shall be subject to the Bagley-Keene Open Meeting Act (Article 9 (commencing with Section 11120) of Chapter 1 of Part 1 of Division 3 of Title 2 of the Government Code).
5656 CHAPTER 8. Artificial Intelligence and Deepfake Working Group
6464 99500. As used in this article, the following definitions apply:
6666 (a) Artificial intelligence means a machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments. Artificial intelligence systems use machine- and human-based inputs to do all of the following:
6868 (1) Perceive real and virtual environments.
7070 (2) Abstract those perceptions into models through analysis in an automated manner.
7272 (3) Use model inferences to formulate options for information or action.
7474 (b) Deepfake means audio or visual content that has been generated or manipulated by artificial intelligence that would falsely appear to be authentic or truthful and that features depictions of people appearing to say or do things they did not say or do without their consent.
7676 (c) Digital content forgery means the use of technologies, including artificial intelligence and machine learning techniques, to fabricate or manipulate audio, visual, or text content with the intent to mislead.
7878 (d) Digital content provenance means the verifiable chronology of the original piece of digital content, such as an image, video, audio recording, or electronic document.
84-99501. (a) The California State University, Long Beach, in consultation with other public institutions of higher education, shall establish the Artificial Intelligence and Deepfake Working Group.
84+99501. (a) A public institution of higher education shall establish the Artificial Intelligence and Deepfake Working Group.
8686 (b) The working group shall evaluate and advise the Legislature and the public on the relevant issues and impacts of artificial intelligence and deepfakes, including, but not limited to, all of the following:
90-(b) The California State University, Long Beach may develop a scoping plan in the first year to establish the topics that may be evaluated by, and the stakeholders that may be included in, the working group. The scoping plan may address the following topics:
9288 (1) The impact of the proliferation of artificial intelligence and deepfakes on state and local government, California-based businesses and the workforce, elementary, secondary, and postsecondary education, and residents of the state.
9490 (2) The risks, including privacy risks, associated with artificial intelligence and the deployment of digital content forgery technologies and deepfakes on state and local government, California-based businesses, institutions of higher education, and residents of the state.
9692 (3) The potential impact on the workforce and strategies to protect employees and to prevent potential job loss due to artificial intelligence proliferation.
9894 (4) The impact of artificial intelligence, digital content forgery technologies, and deepfakes on civic engagement, including voting and elections.
10096 (5) The legal implications and privacy impacts associated with the use of artificial intelligence, digital content forgery technologies, deepfakes, and technologies allowing public verification of digital content provenance.
102-(c) The California State University, Long Beach shall, if developing the scoping plan described in subdivision (b), solicit input from a broad range of stakeholders with a diverse range of interests affected by emerging artificial intelligence and deepfake technologies. The diverse range of interests shall include, but not be limited to, stakeholders representing privacy, business, consumer protection, courts, the legal community, academia, organized labor, the workforce, education, and state government.
98+(c) The working group shall solicit input from a broad range of stakeholders with a diverse range of interests affected by emerging artificial intelligence and deepfake technologies. The diverse range of interests shall include, but not be limited to, stakeholders representing privacy, business, consumer protection, courts, the legal community, academia, organized labor, the workforce, education, and state government.
104-(d) (1) The California State University, Long Beach may include, as part of the working group, at least one representative from all of the following:
100+(d) (1) The working group shall, at minimum, consist of at least one representative from all of the following:
106102 (A) Those in the workforce impacted by potential job loss due to artificial intelligence adoption.
108104 (B) Organized labor unions.
110106 (C) Nontechnology-related industries.
112108 (D) The legal community who can advise on the legal implications of artificial intelligence.
114110 (E) Privacy rights organizations.
116112 (F) Consumer protection organizations.
118114 (G) The technology industry, with a technical focus that includes digital content, media manipulation, or related subjects.
120116 (H) Elementary, secondary, and postsecondary education, including staff, teachers, faculty, and students.
122118 (I) Experts in elections and election safety.
124-(J) Ethicists or specialists in ethics.
126-(2) The California State University, Long Beach may also include, as part of the working group, all of the following:
120+(2) The working group shall also include all of the following:
128122 (A) The Secretary of Government Operations or their designee.
130124 (B) The Executive Director of the California Privacy Protection Agency or their designee.
132126 (C) The Secretary of State or their designee.
134-(e) (1) Notwithstanding Section 10231.5 of the Government Code, on or before January 1, 2026, and annually thereafter, the working group shall submit a report to the Legislature on the working group's findings, including, but not limited to, research and findings related to the issues and impacts identified pursuant to subdivision (b). The working group shall make the report publicly available.
128+(e) (1) Notwithstanding Section 10231.5 of the Government Code, on or before January 1, 2026, and annually thereafter, the working group shall submit a report to the Legislature on the working group's research and findings related to the issues and impacts evaluated pursuant to subdivision (b). The working group shall make the report publicly available.
136130 (2) A report submitted pursuant to paragraph (1) shall be submitted in compliance with Section 9795 of the Government Code.
138-(f) The California State University, Long Beach may consult with the Government Operations Agency, the California Privacy Protection Agency, or any state or local agency on the establishment of the working group.
132+(f) The public institution of higher education that establishes the working group, as provided in subdivision (a), shall consult with the Government Operations Agency and the California Privacy Protection Agency on the establishment of the working group.
140134 (g) Meetings of the working group shall be subject to the Bagley-Keene Open Meeting Act (Article 9 (commencing with Section 11120) of Chapter 1 of Part 1 of Division 3 of Title 2 of the Government Code).