CALIFORNIA LEGISLATURE 2023–2024 REGULAR SESSION

Senate Bill No. 1235

Introduced by Senator Gonzalez
February 15, 2024

An act to add Chapter 8 (commencing with Section 99500) to Part 65 of Division 14 of Title 3 of the Education Code, relating to public postsecondary education.

## LEGISLATIVE COUNSEL'S DIGEST

SB 1235, as introduced, Gonzalez. Public postsecondary education: Artificial Intelligence and Deepfake Working Group.

Existing law requires the Secretary of Government Operations, upon appropriation by the Legislature, to evaluate, among other things, the impact the proliferation of deepfakes, as defined, has on state government, California-based businesses, and residents of the state, and the risks, including privacy risks, associated with the deployment of digital content forgery technologies and deepfakes on state and local government, California-based businesses, and residents of the state.

Existing law establishes the California Community Colleges, the California State University, and the University of California as the public segments of postsecondary education in the state.

This bill would require an unspecified public institution of higher education to establish the Artificial Intelligence and Deepfake Working Group to evaluate and advise the Legislature and the public on the relevant issues and impacts of artificial intelligence and deepfakes, as provided. The bill would require the working group to consist of at least one representative of 9 specified interests, the Secretary of the Government Operations Agency, the Executive Director of the California Privacy Protection Agency, and the Secretary of State, or their designees. The bill would require the working group, on or before January 1, 2026, and annually thereafter, to submit a report to the Legislature on the working group's research and findings related to the relevant issues and impacts of artificial intelligence and deepfakes evaluated by the working group, as specified.

## Digest Key

Vote: MAJORITY   Appropriation: NO   Fiscal Committee: YES   Local Program: NO

## Bill Text

## The people of the State of California do enact as follows:

### SECTION 1.

SECTION 1. Chapter 8 (commencing with Section 99500) is added to Part 65 of Division 14 of Title 3 of the Education Code, to read:

CHAPTER 8. Artificial Intelligence and Deepfake Working Group

99500. As used in this article, the following definitions apply:

(a) Artificial intelligence means a machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments. Artificial intelligence systems use machine- and human-based inputs to do all of the following:

(1) Perceive real and virtual environments.

(2) Abstract those perceptions into models through analysis in an automated manner.

(3) Use model inferences to formulate options for information or action.

(b) Deepfake means audio or visual content that has been generated or manipulated by artificial intelligence that would falsely appear to be authentic or truthful and that features depictions of people appearing to say or do things they did not say or do without their consent.

(c) Digital content forgery means the use of technologies, including artificial intelligence and machine learning techniques, to fabricate or manipulate audio, visual, or text content with the intent to mislead.

(d) Digital content provenance means the verifiable chronology of the original piece of digital content, such as an image, video, audio recording, or electronic document.

99501. (a) A public institution of higher education shall establish the Artificial Intelligence and Deepfake Working Group.

(b) The working group shall evaluate and advise the Legislature and the public on the relevant issues and impacts of artificial intelligence and deepfakes, including, but not limited to, all of the following:

(1) The impact of the proliferation of artificial intelligence and deepfakes on state and local government, California-based businesses and the workforce, elementary, secondary, and postsecondary education, and residents of the state.

(2) The risks, including privacy risks, associated with artificial intelligence and the deployment of digital content forgery technologies and deepfakes on state and local government, California-based businesses, institutions of higher education, and residents of the state.

(3) The potential impact on the workforce and strategies to protect employees and to prevent potential job loss due to artificial intelligence proliferation.

(4) The impact of artificial intelligence, digital content forgery technologies, and deepfakes on civic engagement, including voting and elections.

(5) The legal implications and privacy impacts associated with the use of artificial intelligence, digital content forgery technologies, deepfakes, and technologies allowing public verification of digital content provenance.

(c) The working group shall solicit input from a broad range of stakeholders with a diverse range of interests affected by emerging artificial intelligence and deepfake technologies. The diverse range of interests shall include, but not be limited to, stakeholders representing privacy, business, consumer protection, courts, the legal community, academia, organized labor, the workforce, education, and state government.

(d) (1) The working group shall, at minimum, consist of at least one representative from all of the following:

(A) Those in the workforce impacted by potential job loss due to artificial intelligence adoption.

(B) Organized labor unions.

(C) Nontechnology-related industries.

(D) The legal community who can advise on the legal implications of artificial intelligence.

(E) Privacy rights organizations.

(F) Consumer protection organizations.

(G) The technology industry, with a technical focus that includes digital content, media manipulation, or related subjects.

(H) Elementary, secondary, and postsecondary education, including staff, teachers, faculty, and students.

(I) Experts in elections and election safety.

(2) The working group shall also include all of the following:

(A) The Secretary of Government Operations or their designee.

(B) The Executive Director of the California Privacy Protection Agency or their designee.

(C) The Secretary of State or their designee.

(e) (1) Notwithstanding Section 10231.5 of the Government Code, on or before January 1, 2026, and annually thereafter, the working group shall submit a report to the Legislature on the working group's research and findings related to the issues and impacts evaluated pursuant to subdivision (b). The working group shall make the report publicly available.

(2) A report submitted pursuant to paragraph (1) shall be submitted in compliance with Section 9795 of the Government Code.

(f) The public institution of higher education that establishes the working group, as provided in subdivision (a), shall consult with the Government Operations Agency and the California Privacy Protection Agency on the establishment of the working group.

(g) Meetings of the working group shall be subject to the Bagley-Keene Open Meeting Act (Article 9 (commencing with Section 11120) of Chapter 1 of Part 1 of Division 3 of Title 2 of the Government Code).