This document does not reflect the intent or official position of the bill sponsor or House of Representatives.
STORAGE NAME: h1459c.JDC
DATE: 2/19/2024

HOUSE OF REPRESENTATIVES STAFF ANALYSIS

BILL #: CS/CS/HB 1459     Advanced Technology
SPONSOR(S): Appropriations Committee, Commerce Committee, McFarland
TIED BILLS: CS/HB 1461     IDEN./SIM. BILLS:

REFERENCE                      ACTION              ANALYST    STAFF DIRECTOR or BUDGET/POLICY CHIEF
1) Commerce Committee          20 Y, 0 N, As CS    Wright     Hamon
2) Appropriations Committee    25 Y, 0 N, As CS    Mullins    Pridgeon
3) Judiciary Committee                             Leshko     Kramer

SUMMARY ANALYSIS

Artificial intelligence (AI) encompasses a large field of existing and emerging technologies, methodologies, and application areas. AI is generally thought of as computerized systems that work and react in ways commonly thought to require intelligence. The application of AI extends to areas such as natural language processing, facial recognition, and robotics. As the use of AI technologies has grown, so too have discussions of whether and how to regulate them. Potential regulatory options include a broad regulation of AI technologies that could be used across sectors, or a more targeted approach, regulating its use in particular sectors.

CS/CS/HB 1459 creates s. 282.802, F.S., to establish an advisory council called the Government Technology Modernization Council to study and monitor the development and deployment of new technologies and provide an annual report, including recommendations on procuring and regulating such systems, to the Governor and the Legislature.

The bill also creates s. 501.174, F.S., to:
• Require an entity or person who produces or offers for use or interaction AI content or technology for a commercial purpose, and makes such content or technology available to the Florida public, to create safety and transparency standards that:
  o Alert consumers that such content or technology is generated by AI.
  o Allow such content or technology to be recognizable as generated by AI to other AI.
• Require an entity or a person to provide a clear and conspicuous notice on its Internet homepage or landing page if it provides an AI mechanism to communicate or interact with Florida consumers for a commercial purpose.
• Prohibit any entity or person from knowingly producing, generating, incorporating, or synthesizing child pornography through AI using an image of an identifiable child.
• Require any state agency that uses AI to disclose if a person is interacting with AI when interacting with the agency and to ensure that any confidential information accessible to an AI system remains confidential.

Under the bill, any violation of the AI transparency requirements by a person or entity is considered an unfair and deceptive trade practice actionable under the Florida Deceptive and Unfair Trade Practices Act solely by the Department of Legal Affairs. The bill does not establish a private cause of action.

The bill amends ss. 775.0847 and 827.071, F.S., to expand the definition of child pornography to include “any image or presentation produced, generated, incorporated, or synthesized through artificial intelligence that uses an image of an identifiable minor to depict or portray a minor engaged in sexual conduct,” thereby prohibiting the production, possession, control, intentional viewing, promotion, or transmission of such an image as a criminal offense.

The bill may have an indeterminate fiscal impact on state government and the private sector. See Fiscal Comments.

The bill provides an effective date of July 1, 2024.
STORAGE NAME: h1459c.JDC PAGE: 2 DATE: 2/19/2024 FULL ANALYSIS I. SUBSTANTIVE ANALYSIS A. EFFECT OF PROPOSED CHANGES: Current Situation Artificial Intelligence In the 1950s, a generation of scientists, mathematicians, and philosophers, including Alan Turing, conceptualized the possibility of artificial intelligence (AI). In his 1950 paper Computing Machinery and Intelligence, Turing discussed “how to build intelligent machines and how to test their intelligence.” 1 The term “artificial intelligence” itself was coined at the Dartmouth Summer Research Project on Artificial Intelligence, a conference held in 1956. Since 2010, there have been many advancements in AI research which have been attributed to the “availability of large datasets, improved machine learning approaches and algorithms, and more powerful computers.” 2 AI encompasses a large field of existing and emerging technologies, methodologies, and application areas. AI is “generally thought of as computerized systems that work and react in ways commonly thought to require intelligence.” 3 The application of AI extends to areas such as “natural language processing, facial recognition, and robotics.” 4 Generative Artificial Intelligence Generative AI (GenAI), which refers to “machine learning models developed through training on large volumes of data” for the purpose of generating new content, has undergone rapid advancement over the past few years. 5 GenAI uses advanced machine learning models 6 such as large language models and generative adversarial networks (GANs) to generate text, images, video, and computer code responses with “human-like quality” based on user prompts. 7 Recent technological advances combined with the open availability of these tools to the public has led to widespread use. 8 Specifically, GANs synthesize content by pitting two neural networks 9 —a generator and discriminator— against each other. “To synthesize an image of a fictional person, the generator starts with a random array of pixels and iteratively learns to synthesize a realistic face. On each iteration, the discriminator learns to distinguish the synthesized face from a corpus of real faces; if the synthesized face is distinguishable from the real faces, then the discriminator penalizes the generator. Over multiple 1 Rockwell Anyoha, Can Machines Think?, Harvard University, Aug. 28, 2017, https://sitn.hms.harvard.edu/flash/2017/history-artificial-intelligence/ (last visited Feb. 16, 2024). 2 Congressional Research Service (CRS), Artificial Intelligence: Overview, Recent Advances, and Considerations for the 118 th Congress, https://crsreports.congress.gov/product/pdf/R/R47644 (last visited Feb. 16, 2024). 3 Id. 4 Id. 5 Id.; See also CRS, Generative Artificial Intelligence: Overview, Issues, and Questions for Congress, https://crsreports.congress.gov/product/pdf/IF/IF12426 (last visited Feb. 16, 2024). 6 Advanced machine learning models are designed to understand, interpret, generate, and respond to human language in a way that is as close to human-like communication as possible. Yana Ihnatchyck, Introduction to GenAI: What are LLM Models, and How Are They Used in GenAI?, Data Floq (Oct. 27, 2023), https://datafloq.com/read/introduction-gen-ai-llm-models/ (last visited Feb. 16, 2024). 7 CRS, supra note 2.; Scribble Data, GenAI vs. LLMs vs. NLP: A Complete Guide, https://www.scribbledata.io/blog/genai-vs-llms-vs-nlp-a-complete- guide/ (last visited Feb. 16, 2024). 8 CRS, supra note 2. 
9 Neural networks, a subset of machine learning, are computational models that mimic the complex functions of the human brain. The neural networks consist of interconnected nodes or neurons that process and learn from data, enabling tasks such as pattern recognition and decision making in machine learning. Geeks for Geeks, What is a neural network?, https://www.geeksforgeeks.org/neural-networks-a-beginners-guide/ (last visited Feb. 16, 2024); see IBM, What is a neural network?, https://www.ibm.com/topics/neural-networks (last visited Feb. 16, 2024). STORAGE NAME: h1459c.JDC PAGE: 3 DATE: 2/19/2024 iterations, the generator learns to synthesize increasingly more realistic faces until the discriminator is unable to distinguish it from real faces.” 10 Potential Benefits and Risks of Artificial Intelligence It has been estimated that “AI technologies could increase global GDP by $15.7 trillion, a full 14 [percent], by 2030,” with health, retail, and financial services experiencing the most growth. 11 The use of AI and algorithms may benefit various sectors and services by: Financial sector 12 o Making decision-making relating to investing, portfolio management, loan applications, mortgages, and retirement planning more efficient, less emotional, and more analytic. o Preventing fraud and detecting financial anomalies in large institutions. Health Sector o Helping diagnose and predict disease or illness. o Helping predict potential challenges and allocating resources to patient education, sensing, and proactive interventions to keep patients out of the hospital. o Creating a multifaceted and highly personalized picture of a person’s well-being. Transportation Sector o Developing vehicle guidance, braking, and lane-changing systems for cars, trucks, buses, and drone delivery systems. o Developing systems to prevent collisions with the use of cameras and sensors. o Providing real-time information analysis and safety measures for the development of autonomous vehicles. Government Sector o Helping to create smart cities and e-governance. Examples of e-governance include: The George AI chatbot, a customer service virtual assistant created by the Georgia Department of Labor. AI monitoring of live footage from cameras in forests and mountains for signs of smoke by western states including California, Nevada, and Oregon. o Helping metropolitan areas adopt systems for citizen service delivery, urban and environmental planning, energy use, and crime prevention. Customer Service 13 o Providing customer service to consumers through the use of chatbots and other customer service-oriented tools to increase customer engagement, resulting in increased sales opportunities with reduced costs to the business. However, developments in AI also raise important policy, regulatory, and ethical issues. Potential risks are associated with removing humans from the decision-making process, as may be the case when AI technology becomes more advanced over time. Some potential risks include: Bias o Because AI algorithms are based on training data input by humans, and because the initial data collection and actual data itself is based on human choices, responses, or decisions, there is a risk that such algorithms can contain inaccuracies and bias, which may take many forms including historical, racial, or other discrimination. Additionally, ethical considerations and value choices may be embedded into algorithms. 
Workforce Replacement
o Integrating AI into the workforce brings uncertainty and challenge to the labor market, e.g., concerns regarding the extent to which AI will replace jobs. Business leaders and governments may need to make significant investments in retraining and reskilling the workforce.
Legal Liability
o There are questions concerning who is legally liable when AI systems harm or discriminate against people, especially as new and emerging uses for AI platforms are developed and integrated.
Security Risks 14
o AI systems present cybersecurity and national security risks, due to:
   AI companies collecting large amounts of personal data for AI training and use.
   The potential for bad actors to use AI to develop advanced cyberattacks, bypass security measures, and exploit vulnerabilities in various private and public systems.
o Traditional cybersecurity risk assessment tools are generally inadequate for addressing risks associated with AI.

Efforts to Regulate Artificial Intelligence

As the use of AI technologies has grown, so too have discussions of whether and how to regulate them. Potential regulatory options include a broad regulation of AI technologies that could be used across sectors, or a more targeted approach, regulating the use of AI technologies in particular sectors. 15

In 2023, 31 states introduced at least 191 bills concerning AI, with 14 bills becoming laws. 16 As reported by the National Conference of State Legislatures: 17
• Connecticut required the state’s Department of Administrative Services to conduct an inventory of all systems employing AI that are in use by any state agency and, beginning February 1, 2024, to perform ongoing assessments of such systems to ensure that the use of any such system does not result in unlawful discrimination or disparate impact.
• Louisiana adopted a resolution requesting its Joint Legislative Committee on Technology and Cybersecurity to study the impact of AI in operations, procurement, and policy.
• Maryland established the Industry 4.0 Technology Grant Program to assist certain small and medium-sized manufacturing enterprises with implementing new “industry 4.0” technology or related infrastructure. The definition of industry 4.0 includes AI.
• Texas, North Dakota, Puerto Rico, and West Virginia created AI advisory councils to study and monitor AI systems developed, employed, or procured by state agencies.

10 Sophie Nightingale and Hany Farid, AI-synthesized faces are indistinguishable from real faces and more trustworthy, Proceedings of the National Academy of Sciences of the United States of America (Feb. 14, 2022), https://www.pnas.org/doi/epdf/10.1073/pnas.2120481119 (last visited Feb. 16, 2024). 11 National Conference of State Legislatures (NCSL), Approaches to Regulating Artificial Intelligence: A Primer, Aug. 10, 2023, https://www.ncsl.org/technology-and-communication/approaches-to-regulating-artificial-intelligence-a-primer (last visited Feb. 16, 2024). 12 Id.; Darrell West and John Allen, How artificial intelligence is transforming the world, Brookings Institute, Apr. 24, 2018, https://www.brookings.edu/articles/how-artificial-intelligence-is-transforming-the-world/ (last visited Feb. 16, 2024). 13 NCSL, supra note 11.
Additionally, the following laws were passed in previous years: California prohibits any person from using a bot to communicate or interact with another person online with the intent to mislead the other person about its artificial identity in order to incentivize a purchase or sale of goods or services in a commercial transaction or to influence a vote in an election. 18 Illinois requires an employer that asks applicants to record video interviews and uses an AI analysis of applicant-submitted videos to: 19 o Notify each applicant in writing before the interview that AI may be used to analyze the applicant's facial expressions and consider the applicant's fitness for the position; o Provide each applicant with an information sheet before the interview explaining how the AI works and what characteristics it uses to evaluate applicants; and o Obtain written consent from the applicant to be evaluated by the AI program. 14 Id; Bernard Marr, The 15 Biggest Risks Of Artificial Intelligence, Forbes, Jun. 2, 2023, https://www.forbes.com/sites/bernardmarr/2023/06/02/the- 15-biggest-risks-of-artificial-intelligence/?sh=603d66292706 (last visited Feb. 16, 2024). 15 CRS, supra note 2. 16 NCSL, State of Play | An Inside Look at Artificial Intelligence Policy and State Actions, Jan. 9, 2024, https://www.ncsl.org/state-legislatures- news/details/state-of-play-an-inside-look-at-artificial-intelligence-policy-and-state-actions (last visited Feb. 16, 2024). 17 NCSL, Artificial Intelligence 2023 Legislation, Jan. 12, 2024, https://www.ncsl.org/technology-and-communication/artificial-intelligence-2023- legislation (last visited Feb. 16, 2024). 18 Cal. B&P Code §§ 17940-17943. 19 2019 IL Public Act 101-0260. STORAGE NAME: h1459c.JDC PAGE: 5 DATE: 2/19/2024 While there is no broad framework for AI regulation in the United States, federal laws on AI have been enacted over the past few years to guide actions within the federal government. For example, the National Artificial Intelligence Initiative Act of 2020 establishes the American AI Initiative and provides directions for AI research, development, and evaluation activities at federal science agencies. 20 Globally, the European Union has proposed the Artificial Intelligence Act (AIA), which would create broad regulatory oversight for the development and use of a wide range of AI applications, with requirements varying by risk category, from banning systems with unacceptable risk to allowing free use of those with minimal or no risk. 21 In an effort to begin implementation of the AIA, a related new rule was agreed to in December 2023, which includes requiring human oversight in creating and deploying the systems and banning indiscriminate scraping of images from the internet to create a facial recognition database. 22 Artificial Intelligence Used to Create Child Pornography Recently, there has been an increase in AI production of child pornography. Offenders may use downloadable open source GenAI and GAN models, which can produce images quickly, to devastating effects. 23 Hidden inside the foundation of some popular AI image-generators are thousands of images of child sexual abuse, which have made it easier for offenders and AI systems to produce realistic and explicit imagery of fake children as well as transform social media photos of fully clothed children into child sexual abuse material (CSAM). 
24 In September 2023, analysts at the Internet Watch Foundation (IWF) 25 found in one dark web CSAM forum, a total of 20,254 AI-generated photos posted within the prior month. The analysts spent 87.5 hours assessing 11,108 of these images. Half of the images were found to be indecent, with 564 photos showing the most severe types of abuse. 26 Additionally, the Stanford Internet Observatory recently found more than 3,200 images of suspected child sexual abuse in the giant AI database LAION, an index of online images and captions that’s been used to train leading AI generators. 27 Nishant Vishwamitra, an assistant professor at the University of Texas at San Antonio who is working on the detection of deepfakes and AI-generated CSAM images online, stated that “the scale at which such images can be created is worrisome.” 28 Child Pornography Laws Federal Law Generally, the First Amendment does not protect child pornography. In New York v. Ferber, 29 the United States Supreme Court (Supreme Court) recognized that states have a compelling interest in 20 CRS, supra note 2. 21 Id; European Commission, Regulatory Framework Proposal on Artificial Intelligence, https://digital-strategy.ec.europa.eu/en/policies/regulatory- framework-ai (last visited Feb. 16, 2024). 22 Adam Satariano, E.U. Agrees on Landmark Artificial Intelligence Rules, NY Times, Dec. 8, 2023, https://www.nytimes.com/2023/12/08/technology/eu-ai-act-regulation.html (last visited Feb. 16, 2024). 23 Matt Burgess, The AI-Generated Child Abuse Nightmare Is Here, Wired, Oct. 24, 2023, https://www.wired.com/story/generative-ai-images-child- sexual-abuse/ (last visited Feb. 16, 2024). 24 Matt O’Brien and Haleluya Hadero, Study shows AI image-generators are being trained on explicit photos of children, PBS NewsHour, Dec. 20, 2023, https://www.pbs.org/newshour/science/study-shows-ai-image-generators-are-being-trained-on-explicit-photos-of-children (last visited Feb. 16, 2024). 25 A nonprofit organization based in the UK that scours and removes abuse content from the web. Supra, note 23. 26 Id. 27 O’Brien and Hadero, supra note 24. 28 Id. 29 458 U.S. 747 (1982). STORAGE NAME: h1459c.JDC PAGE: 6 DATE: 2/19/2024 safeguarding the physical and psychological well-being of minors and in preventing their sexual exploitation and abuse. The Supreme Court noted that it was “unlikely that visual depictions of children . . . lewdly exhibiting their genitals would often constitute an important and necessary part of a literary performance or scientific or educational work.” 30 Under these principles, states have criminalized possessing, distributing, and other acts involving child pornography. Additionally, many federal courts have held that morphed child pornography, which is created when an innocent image of a child is combined with a separate, sexually explicit image, usually of an adult, is not protected expressive speech under the First Amendment. 31 For instance, in United States v. Bach, the defendant was convicted of possessing morphed child pornography. The image at issue showed a young nude boy sitting in a tree, grinning, with his pelvis tilted upward, his legs opened wide, and a full erection. 
The photograph of a well-known child entertainer’s head had been “skillfully inserted onto the photograph of the nude boy so that the resulting image appeared to be a nude picture of the child entertainer sitting in the tree.” The defendant appealed, arguing that his conviction was invalid because the definition of morphed child pornography violated the First Amendment. The court disagreed, holding that morphed child pornography “implicate[s] the interests of real children” and creates a lasting record of an identifiable minor child seemingly engaged in sexually explicit activity. 32

In 2014, in United States v. Anderson, 33 the defendant was charged with distribution of morphed child pornography relating to an image in which the face of a minor female was superimposed over the face of an adult female engaging in sex with an adult male. The defendant moved to dismiss the charge, arguing that the definition of morphed child pornography was unconstitutionally overbroad. The court noted that in the image at issue “no minor was sexually abused.” 34 However, the court held that because such images falsely portray identifiable children engaging in sexual activity, they implicate the compelling governmental interest in protecting minors. 35 Using this reasoning, the court applied a strict scrutiny balancing test and held that the definition of morphed child pornography was constitutional as applied to the facts of Anderson. To date, the federal statutes relating to morphed child pornography have been upheld. 36

Child Pornography Prevention Act of 1996

Prior to 1996, federal law criminalized a variety of acts relating to child pornography. 37 At that time, federal statutes described images of a minor actually engaging in sexually explicit conduct. 38 In 1996, Congress passed the Child Pornography Prevention Act of 1996 (CPPA), 39 creating a definition of “child pornography” that for the first time criminalized acts relating to virtual child pornography.

In 2002, the Supreme Court decided Ashcroft v. Free Speech Coalition, 40 a case in which a California trade association for the adult entertainment industry challenged the CPPA as unconstitutionally overbroad. One provision of the CPPA prohibited “any visual depiction, including any photograph, film, video, picture, or computer or computer-generated image or picture, that is or appears to be, of a minor engaging in sexually explicit conduct.” This prohibition did not depend at all on how the depiction was produced and did not require the use of an image of a real child to create the depiction. The provision captured a range of depictions, referred to as “virtual child pornography,” which may include wholly

30 Id. at 762-63. 31 U.S. v. Hotaling, 634 F.3d 725, 728 (2d Cir. 2011). 32 400 F.3d 622, 632 (8th Cir. 2005). 33 759 F.3d 891 (8th Cir. 2014). 34 Id. at 895. 35 Id. at 896. 36 United States v. Ramos, 685 F.3d 120, 134 (2d Cir. 2012), cert. denied, 133 S.Ct. 567 (2012); see also Doe v. Boland, 630 F.3d 491, 497 (6th Cir. 2011); see also United States v. Hotaling, 634 F.3d 725 (2d Cir. 2008), cert. denied, 132 S.Ct. 843 (2011) (citing Bach, the Court held that “child pornography created by digitally altering sexually explicit photographs of adults to display the face of a child is not protected expressive speech under the First Amendment.”). 37 See, e.g., 18 USC §2252 (1994 ed.). 38 U.S. v. Hotaling, 599 F.Supp.2d 306, 309 (N.D.N.Y. 2008); see also 18 USC §§ 2252 and 2256 (1994 ed.). 39 Pub. L. No. 104-208. 40 535 U.S. 234 (2002).
computer-generated images, as well as images produced by more traditional means. 41 The Supreme Court held that the speech criminalized in the challenged provision of the CPPA violated the First Amendment since it extended the federal prohibition against child pornography to sexually explicit images that “appeared to” depict minors but were “produced without using any real children.” 42 The Supreme Court decided that “by prohibiting child pornography that did not depict an actual child,” the CPPA “abridged the freedom to engage in a substantial amount of lawful speech” and was therefore overbroad and unconstitutional. 43

Congress attempted to remedy the constitutional issues raised in Ashcroft by passing the Prosecutorial Remedies and Other Tools to end the Exploitation of Children Today Act (Protect Act) in 2003. 44 The Protect Act narrowed the definition of virtual child pornography in the CPPA to prohibit a visual depiction that is a digital image, computer image, or computer-generated image that is, or is indistinguishable from, that of a minor engaging in sexually explicit conduct. Additionally, the Act defined “indistinguishable” to mean, when used with respect to a depiction, virtually indistinguishable, in that the depiction is such that an ordinary person viewing the depiction would conclude that the depiction is of an actual minor engaged in sexually explicit conduct. This definition does not apply to depictions that are drawings, cartoons, sculptures, or paintings depicting minors or adults. 45

Florida Law

Section 827.071(1), F.S., defines:
“Child pornography” as:
o Any image depicting a minor engaged in sexual conduct; or
o Any image that has been created, altered, adapted, or modified by electronic, mechanical, or other means, to portray an identifiable minor engaged in sexual conduct. 46
“Sexual conduct” as:
o Actual or simulated 47 sexual intercourse, deviate sexual intercourse, sexual bestiality, 48 masturbation, or sadomasochistic abuse; 49
o Actual or simulated lewd exhibition of the genitals;
o Actual physical contact with a person’s clothed or unclothed genitals, pubic area, buttocks, or, if such person is a female, breast, with the intent to arouse or gratify the sexual desire of either party; or
o Any act or conduct which constitutes sexual battery 50 or simulates that sexual battery is being or will be committed. 51, 52
“Identifiable minor” as a person:
o Who was a minor at the time the image was created, altered, adapted, or modified, or whose image as a minor was used in the creating, altering, adapting, or modifying of the image; and

41 Ashcroft, 535 U.S. at 241. 42 Id. at 239. 43 Id. at 256. 44 Pub. L. No. 108-21. 45 18 USC §2256(8)(B) and (11). 46 S. 827.071(1)(b), F.S. 47 “Simulated” means the explicit depiction of sexual conduct which creates the appearance of such conduct and which exhibits any uncovered portion of the breasts, genitals, or buttocks. S. 827.071(1)(n), F.S. 48 “Sexual bestiality” means any sexual act between a person and an animal involving the sex organ of the one and the mouth, anus, or female genitals of the other. S. 827.071(1)(k), F.S. 49 “Sadomasochistic abuse” means flagellation or torture by or upon a person, or the condition of being fettered, bound, or otherwise physically restrained, for the purpose of deriving sexual satisfaction from inflicting harm on another or receiving such harm oneself. S. 827.071(1)(i), F.S.
50 “Sexual battery” means oral, anal, or female genital penetration by, or union with, the sexual organ of another or the anal or female genital penetration of another by any other object; however, “sexual battery” does not include an act done for a bona fide medical purpose. S. 827.071(1)(j), F.S. 51 S. 827.071(1)(l), F.S. 52 A mother’s breastfeeding of her baby does not under any circumstance constitute “sexual conduct.” Id. STORAGE NAME: h1459c.JDC PAGE: 8 DATE: 2/19/2024 o Who is recognizable as an actual person by the person’s face, likeness, or other distinguishing characteristic, such as a unique birthmark, or other recognizable feature. 53, 54 Florida law contains a variety of provisions prohibiting acts relating to child pornography, including: Section 827.071(4), F.S., which prohibits a person from possessing with the intent to promote any photograph, motion picture, exhibition, show, representation, or other presentation which, in whole or in part, includes child pornography, as a second-degree felony. Possession of three or more copies of such photographs, etc., is prima facie evidence of a person’s intent to promote. Section 827.071(5), F.S., which prohibits a person from knowingly possessing, controlling, or intentionally viewing 55 a photograph, motion picture, or other image that, in whole or in part, he or she knows includes any child pornography, as a third-degree felony. 56 Section 847.0137, F.S., which prohibits a person from knowingly, or under circumstances when he or she reasonably should have known, transmitting child pornography to another person, as a third-degree felony. While the definition of “child pornography” in Florida law currently captures morphed child pornography, it does not capture virtual child pornography. As such, Florida law does not currently prohibit the production, possession, control, intentional viewing, promotion, or transmission of an image that, although not containing or being derived from an image of a real minor, is indistinguishable from an image of a real minor engaging in sexual conduct. Advisory Councils Section 20.03, F.S., defines an “advisory council” as an advisory body created by specific statutory enactment and appointed to function on a continuing basis. Generally, an advisory council is enacted to study the problems arising in a specified functional or program area of state government and to provide recommendations and policy alternatives. 57 The Code of Ethics for Public Officers and Employees 58 establishes ethical standards for public officials, which includes any person elected or appointed to hold office in any agency and any person serving on an advisory council. 59 The code is intended to ensure that public officials conduct themselves independently and impartially, and do not use their offices for private gain other than compensation provided by law. The code pertains to various ethical issues, such as ethics trainings, voting conflicts, full and public disclosure of financial interests, and standards of conduct. 60 Florida Cybersecurity Advisory Council The Department of Management Services (DMS) oversees information technology (IT) 61 governance and security for the executive branch in Florida. 62 The Florida Digital Service (FLDS) is housed within 53 S. 827.071(1)(e), F.S. 54 The term may not be construed to require proof of the actual identity of the identifiable minor. Id. 55 “Intentionally view” means to deliberately, purposefully, and voluntarily view. 
Proof of intentional viewing requires establishing more than a single image, motion picture, exhibition, show, image, data, computer depiction, representation, or other presentation was viewed over any period of time. S. 827.071(1)(b), F.S. 56 The statute also specifies that the possession, control, or intentional viewing of each such photograph, or other image, is a separate offense. If such photograph or other image includes child pornography depicting more than one child, then each child in each photograph or image that is knowingly possessed, controlled, or intentionally viewed is a separate offense. 57 S. 20.03(7), F.S.; See also s. 20.052, F.S. 58 See Part III, Chapter 112, F.S. 59 S. 112.313(1), F.S. 60 See Part III, Chapter 112, F.S. 61 The term “information technology” means equipment, hardware, software, firmware, programs, systems, networks, infrastructure, media, and related material used to automatically, electronically, and wirelessly collect, receive, access, transmit, display, store, record, retrieve, analyze, evaluate, process, classify, manipulate, manage, assimilate, control, communicate, exchange, convert, converge, interface, switch, or disseminate information of any kind or form. S. 282.0041(19), F.S. 62 See s. 20.22, F.S. STORAGE NAME: h1459c.JDC PAGE: 9 DATE: 2/19/2024 DMS and was established in 2020 to replace the Division of State Technology. 63 FLDS works under DMS to implement policies for IT and cybersecurity for state agencies. 64 The Florida Cybersecurity Advisory Council (CAC) is an advisory council within DMS. 65 The purpose of the CAC is to assist state agencies in protecting IT resources from cybersecurity threats and incidents and advise counties and municipalities on cybersecurity. 66 The CAC must assist FLDS in implementing best cybersecurity practices. 67 The CAC meets at least quarterly to: Review existing state agency cybersecurity policies; Assess ongoing risks to state agency IT; Recommend a reporting and information sharing system to notify state agencies of new risks; Recommend data breach simulation exercises; Assist FLDS in developing cybersecurity best practice recommendations; Examine inconsistencies between state and federal law regarding cybersecurity; Review information relating to cybersecurity incidents and ransomware incidents to determine commonalities and develop best practice recommendations for state agencies, counties, and municipalities; and Recommend any additional information that a county or municipality should report to the FLDS as part of its cybersecurity incident or ransomware incident notification under s. 282.3185, F.S. 68 The CAC must work with the National Institute of Standards and Technology 69 and other federal agencies, private sector businesses, and private cybersecurity experts to identify which local infrastructure sectors, not covered by federal law, are at the greatest risk of cyber-attacks and need the most enhanced cybersecurity measures and to identify categories of critical infrastructure as critical cyber infrastructure if cyber damage to the infrastructure could result in catastrophic consequences. 
70 The CAC must also prepare and submit a comprehensive report to the Governor, the President of the Senate, and the Speaker of the House of Representatives that includes data, trends, analysis, findings, and recommendations for state and local action regarding ransomware incidents, including: Descriptive statistics including the amount of ransom requested, the duration of the ransomware incident, and the overall monetary cost to taxpayers of the ransomware incident. A detailed statistical analysis of the circumstances that led to the ransomware incident which does not include the name of the state agency, county, or municipality; network information; or system identifying information. A detailed statistical analysis of the level of cybersecurity employee training and frequency of data backup for the state agency, county, or municipality that reported the ransomware incident. Specific issues identified with current policies, procedures, rules, or statutes and recommendations to address such issues. Any other recommendations to prevent ransomware incidents. Florida Deceptive and Unfair Trade Practices Act (FDUTPA) 63 Ch. 2020-161, L.O.F. 64 See s. 20.22(2)(b), F.S. 65 S. 282.319(1), F.S. 66 S. 282.319(2), F.S. 67 S. 282.319(3), F.S. 68 S. 282.319(9), F.S. 69 The National Institute of Standards and Technology (NIST) is a non-regulatory federal agency housed within the United States Department of Commerce. NIST’s role is to facilitate and support the development of cybersecurity risk frameworks. NIST is charged with providing a prioritized, flexible, repeatable, performance-based, and cost-effective approach, including information security measures and controls that may be voluntarily adopted by owners and operators of critical infrastructure to help them identify, assess, and manage cyber risks. NIST, NIST General Information, https://www.nist.gov/director/pao/nist-general-information (last visited Feb. 16, 2024); NIST, Framework for Improving Critical Infrastructure Cybersecurity, p. 1, https://nvlpubs.nist.gov/nistpubs/CSWP/NIST.CSWP.04162018.pdf (last visited Feb. 16, 2024). 70 S. 282.319(10), F.S. STORAGE NAME: h1459c.JDC PAGE: 10 DATE: 2/19/2024 FDUTPA is a consumer and business protection measure that prohibits unfair methods of competition and unconscionable, deceptive, or unfair acts or practices in the conduct of trade or commerce. 71 FDUTPA was modeled after the Federal Trade Commission Act. 72 The Department of Legal Affairs (DLA) or an Office of the State Attorney (SAO) may bring actions on behalf of consumers or governmental entities when it serves the public interest. 73 The SAO may enforce violations of FDUTPA if the violations take place within its jurisdiction. The DLA has enforcement authority when the violation is multi-jurisdictional, the state attorney defers to the DLA in writing, or the state attorney fails to act within 90 days after a written complaint is filed. 74 In certain circumstances, consumers may also file suit through private actions. 75 The DLA and the SAO have powers to investigate FDUTPA claims, which include: 76 Administering oaths and affirmations; Subpoenaing witnesses or matter; and Collecting evidence. The DLA and the State Attorney, as enforcing authorities, may seek the following remedies: Declaratory judgments; Injunctive relief; Actual damages on behalf of consumers and businesses; Cease and desist orders; and Civil penalties of up to $10,000 per willful violation. 
77 FDUTPA may not be applied to certain entities in certain circumstances, including: 78 • Any person or activity regulated under laws administered by the Office of Insurance Regulation or the Department of Financial Services; or • Banks, credit unions, and savings and loan associations regulated by the Office of Financial Regulation or federal agencies. Effect of Proposed Changes Government Technology Modernization Council CS/CS/HB 1459 creates s. 282.802, F.S., to establish the Government Technology Modernization Council (council) to serve as an advisory council within DMS. The bill provides that the purpose of the council is to study and monitor the development and deployment of new technologies and provide reports on recommendations for procurement and regulation of such systems to the Governor and the Legislature. The bill requires the council to meet at least quarterly to: 71 Ch. 73-124, L.O.F.; s. 501.202, F.S. 72 D. Matthew Allen, et. al., The Federal Character of Florida’s Deceptive and Unfair Trade Practices Act, 65 U. MIAMI L. REV. 1083 (Summer 2011). 73 S. 501.207(1)(c) and (2), F.S.; see s. 501.203(2), F.S. (defining “enforcing authority” and referring to the office of the state attorney if a violation occurs in or affects the judicial circuit under the office’s jurisdiction; or the Department of Legal Affairs if the violation occurs in more than one circuit; or if the office of the state attorney defers to the department in writing; or fails to act within a specified period); see also David J. Federbush, FDUTPA for Civil Antitrust: Additional Conduct, Party, and Geographic Coverage; State Actions for Consumer Restitution, 76 Florida Bar Journal 52, Dec. 2002 (analyzing the merits of FDUPTA and the potential for deterrence of anticompetitive conduct in Florida), http://www.floridabar.org/divcom/jn/jnjournal01.nsf/c0d731e03de9828d852574580042ae7a/99aa165b7d8ac8a485256c8300791ec1!OpenDocument &Highlight=0,business,Division* (last visited on Feb. 16, 2024). 74 S. 501.203(2), F.S. 75 S. 501.211, F.S. 76 S. 501.206(1), F.S. 77 Ss. 501.207(1), 501.208, and 501.2075, F.S. Civil Penalties are deposited into the General Revenue Fund. Enforcing authorities may also request attorney fees and costs of investigation or litigation. S. 501.2105, F.S. 78 S. 501.212(4), F.S. STORAGE NAME: h1459c.JDC PAGE: 11 DATE: 2/19/2024 Recommend legislative and administrative actions that the Legislature and state agencies may take to promote the development of data modernization in Florida. Assess and provide guidance on necessary legislative reforms and the creation of a state code of ethics for AI systems in state government. Assess the effect of automated decision systems or identity management on constitutional and other legal rights, duties, and privileges of residents of this state. Evaluate common standards for AI safety and security measures, including the benefits of requiring disclosure of the digital provenance for all images and audio created using generative AI as a means of revealing the origin and edit of the image or audio, as well as the best methods for such disclosure. Assess how governmental entities and the private sector are using AI with a focus on opportunity areas for deployments in systems across this state. Determine how AI is being exploited by bad actors, including foreign countries of concern. 79 Evaluate the need for curriculum to prepare school-age audiences with the digital media and visual literacy skills needed to navigate the digital information landscape. 
The bill requires the council to annually submit any legislative recommendations it considers necessary to modernize government technology to the Governor, the President of the Senate, and the Speaker of the House of Representatives beginning June 30, 2024. The bill requires such recommendations to include any information the council considers relevant, including policies necessary to: Accelerate adoption of technologies that will increase productivity of state enterprise information technology systems, improve customer service levels of government, and reduce administrative or operating costs. Promote the development and deployment of AI systems, financial technology, education technology, or other enterprise management software in Florida. Protect Floridians from bad actors who use AI. The bill requires the council to be comprised of the following members: The Lieutenant Governor. The state chief information officer. The Secretary of Commerce. The Secretary of Health Care Administration. The Commissioner of Education. Seven representatives with senior level experience or expertise in AI, cloud computing, identity management, data science, machine learning, government procurement, financial technology, educational technology, and constitutional law, with five appointed by the Governor, one appointed by the President of the Senate, and one appointed by the Speaker of the House of Representatives. One member of the Senate, appointed by the President of the Senate, or his or her designee. One member of the House of Representatives, appointed by the Speaker of the House of Representatives, or his or her designee. The Secretary of DMS, or his or her designee, who shall serve as the ex officio, nonvoting executive director of the council. The bill provides that council members shall serve for terms of four years, except that sitting members of the Senate and the House of Representatives shall serve terms that correspond with their terms of office. For the purpose of providing staggered terms, the initial appointments of members made by the Governor are for terms of two years. Under the bill, a vacancy is filled for the remainder of the unexpired term in the same manner as the initial appointment. All members of the council are eligible for reappointment. 79 Section 287.138(1), F.S., lists the following countries as foreign countries of concern: the People’s Republic of China, the Russian Federation, the Islamic Republic of Iran, the Democratic People’s Republic of Korea, the Republic of Cuba, the Venezuelan regime of Nicolás Maduro, and the Syrian Arab Republic. STORAGE NAME: h1459c.JDC PAGE: 12 DATE: 2/19/2024 The bill provides that members of the council shall serve without compensation, but are entitled to receive reimbursement for per diem and travel expenses. 80 The bill requires members of the council to maintain the confidential and exempt status of information received in the performance of their duties and responsibilities. A current or former member of the council must follow the Code of Ethics for Public Officers and Employees, and may not disclose or use information not available to the general public and gained by reason of his or her official position, except for information relating exclusively to governmental practices, for his or her personal gain or benefit or for the personal gain or benefit of any other person or business entity. Members of the council must sign an agreement acknowledging such requirements. Artificial Intelligence Transparency The bill creates s. 
501.174, F.S., to establish certain requirements related to AI transparency. The bill defines "artificial intelligence" as software that is developed with machine-learning, logic and knowledge-based, or statistical approaches and can, for a given set of human-defined objectives, generate or synthesize outputs such as content, predictions, recommendations, or decisions influencing certain environments. The bill requires an entity or person who produces or offers for use or interaction AI content or technology for a commercial purpose, and makes such content or technology available to the Florida public, to create safety and transparency standards that: Alert consumers that such content or technology is generated by AI. Allow such content or technology to be recognizable as generated by AI to other AI. If a natural person in Florida is able to communicate or interact with an entity or person for commercial purposes through an AI mechanism, the bill requires such entity or person to provide a clear and conspicuous statement on its Internet homepage or landing page indicating that such mechanism is generated by AI. The bill prohibits any entity or person from knowingly producing, generating, incorporating, or synthesizing child pornography through AI using an image of an identifiable child. The bill requires any state agency 81 that uses AI to disclose if a person is interacting with AI when interacting with the agency and to ensure that any confidential information accessible to an AI system remains confidential. Under the bill, any violation of the AI transparency requirements by a person or entity is considered an unfair and deceptive trade practice actionable under FDUTPA solely by DLA . 82 In addition to other FDUTPA remedies, the bill authorizes DLA to collect a civil penalty of up to $50,000 per violation. The bill authorizes DLA to adopt rules to implement the bill. The bill does not establish a private cause of action. For purposes of being subject to the jurisdiction of the courts in this state related to an action for a violation of AI transparency standards, the bill specifies that any entity or person who produces or uses AI that is distributed to or viewable by the public in this state is considered to be both engaged in substantial and not isolated activities within this state and operating, conducting, engaging in, or carrying on a business, and doing business in this state. 80 As allowed under s. 112.061, F.S. 81 As defined in s. 282.318(2), which is any official, officer, commission, board, authority, council, committee, or department of the executive branch of state government; the Justice Administrative Commission; and the Public Service Commission. The term does not include university boards of trustees or state universities. 82 Unlike under general FDUTPA actions, DLA is not prohibited from bringing an action against a social media platform that is also a: Person or activity regulated under laws administered by OIR or DFS; and Bank, credit union, and savings and loan association regulated by OFR or federal agencies. STORAGE NAME: h1459c.JDC PAGE: 13 DATE: 2/19/2024 Child Pornography The bill amends ss. 
775.0847 and 827.071, F.S., to expand the definition of “child pornography” to include any image or presentation produced, generated, incorporated, or synthesized through artificial intelligence that uses an image of an identifiable minor to depict or portray a minor engaged in sexual conduct, thereby prohibiting the production, possession, control, intentional viewing, promoting, or transmitting of such an image as a criminal offense. The bill provides an effective date of July 1, 2024. B. SECTION DIRECTORY: Section 1: Creates s. 282.802, F.S., relating to the Government Technology Modernization Council. Section 2: Creates s. 501.174, F.S., relating to artificial intelligence transparency. Section 3: Amends s. 775.0847, F.S., relating to possession or promotion of certain images of child pornography; reclassification. Section 4: Amends s. 827.071, F.S., relating to sexual performance by a child; child pornography; penalties. Section 5: Provides an effective date of July 1, 2024. II. FISCAL ANALYSIS & ECONOMIC IMPACT STATEMENT A. FISCAL IMPACT ON STATE GOVERNMENT: 1. Revenues: The bill may have an indeterminate positive impact on DLA due to an increase in civil penalties collected for violations of the AI transparency requirements. 2. Expenditures: The bill may have an indeterminate negative impact on DMS to the extent that it requires new additional expenditures by DMS to create and run the Government Technology Modernization Council. Additionally, the bill may have an indeterminate negative impact on DLA due to expenditures required to enforce the AI transparency requirements. See Fiscal Comments. B. FISCAL IMPACT ON LOCAL GOVERNMENTS: 1. Revenues: None. 2. Expenditures: See Fiscal Comments. C. DIRECT ECONOMIC IMPACT ON PRIVATE SECTOR: The bill may have an indeterminate negative impact on the private sector as it requires entities that use AI in certain circumstances to provide certain disclaimers which may require additional expenditures to develop and employ. D. FISCAL COMMENTS: Based on the provisions of the bill, DMS will likely incur the following recurring costs that can be absorbed by existing resources: 1. Administrative support staff. 2. AI subject matter experts. STORAGE NAME: h1459c.JDC PAGE: 14 DATE: 2/19/2024 3. Travel expenses for council members and administration staff. 4. Policy analyst staff for drafting annual legislative recommendations. III. COMMENTS A. CONSTITUTIONAL ISSUES: 1. Applicability of Municipality/County Mandates Provision: Not applicable. This bill does not appear to affect county or municipal governments. 2. Other: The First Amendment to the U.S. Constitution guarantees that “Congress shall make no law ... abridging the freedom of speech.” 83 Generally, “government has no power to restrict expression because of its message, its ideas, its subject matter, or its content.” 84 The rights guaranteed by the First Amendment apply with equal force to state governments through the due process clause of the Fourteenth Amendment. 85 As a general rule, pornography can only be banned if obscene, however, in New York v. Ferber, 86 the Supreme Court held that pornography showing minors can be proscribed whether or not the images are obscene under the definition set forth in Miller. 
87 The Supreme Court held that the Miller standard does not reflect a state’s particular and more compelling interest in prosecuting those who promote the sexual exploitation of children, and that where the images are themselves the product of child sexual abuse a state has an interest in stamping it out without regard to any judgment about its content. 88 Additionally, while the Supreme Court has not resolved whether the First Amendment protects morphed pornography, it has noted that using photos of identifiable minors to make it appear they are engaged in sexual acts implicates the interests of real children and in that sense are closer to real child pornography. 89 To date, the federal statutes relating to morphed child pornography have been upheld. 90 However, the Supreme Court has held that virtual pornography (i.e. sexually explicit conduct created by using advanced computer imaging techniques to create realistic images of children who do not exist) is not “intrinsically related” to the sexual abuse of children. And, unlike real child pornography, which results in injury to the child’s reputation and emotional well-being, no child is involved in the creation of virtual pornography. 91 While the Supreme Court has struck down as unconstitutional prohibitions on a visual depiction or computer-generated image or picture that appears to be of a minor engaging in sexually explicit conduct, it has not yet determined whether a more narrow prohibition on a visual depiction that is a digital image, computer image, or computer-generated image that is indistinguishable from that of a minor engaging in sexually explicit conduct is constitutional under the First Amendment. As such, the bill’s expansion of the definition of “child pornography” to include any image or 83 U.S. Const., amend. I. 84 Police Dept. of City of Chicago v. Mosley, 408 U.S. 92, 95 (1972). 85 U.S. Const. amend. XIV. See also Art. I, Fla. Const. 86 458 U.S. 747 (1982). 87 Miller v. California, 413 U.S. 15 (1973)(The Miller test considers whether the average person, applying contemporary community standards, would find that the work, taken as a whole, appeals to the prurient interests and that the work depicts or describes, in a patently offensive way, sexual conduct specifically defined by applicable state law; and whether the work, taken as a whole, lacks serious literary, artistic, political, or scientific value.). 88 Ashcroft, 535 U.S. at 240 and 249. 89 United States v. Mecham, 950 F.3d 257, 263 (5th Cir. 2020). 90 United States v. Ramos, 685 F.3d 120, 134 (2d Cir. 2012), cert. denied, 133 S.Ct. 567 (2012); see also Doe v. Boland, 630 F.3d 491, 497 (6th Cir. 2011); see also United States v. Hotaling, 634 F.3d 725 (2d Cir. 2008), cert. denied, 132 S.Ct. 843 (2011) (citing Bach, the Court held that “child pornography created by digitally altering sexually explicit photographs of adults to display the face of a child is not protected expressive speech under the First Amendment.). 91 Mecham, 950 F.3d at 263. STORAGE NAME: h1459c.JDC PAGE: 15 DATE: 2/19/2024 presentation produced, generated, incorporated, or synthesized through artificial intelligence that uses an image of an identifiable minor to depict or portray a minor, regardless of whether the minor is identifiable or not, engaged in sexual conduct may implicate the First Amendment. B. RULE-MAKING AUTHORITY: The bill authorizes DLA to adopt rules related to enforcing provisions related to AI transparency. C. 
DRAFTING ISSUES OR OTHER COMMENTS: On Line 116, the bill provides for submission of legislative recommendations by June 30, 2024, and each June 30 thereafter. The effective date of the bill is July 1, 2024. IV. AMENDMENTS/COMMITTEE SUBSTITUTE CHANGES On January 23, 2024, the Commerce Committee adopted a proposed committee substitute and reported the bill favorably as a committee substitute. The committee substitute changed the bill in the following ways: Removed provisions requiring certain permissions or disclosures for political advertisements produced and image and likeness used by AI, and conformed related provisions. Expanded the criminal definition of “child pornography” to include AI creations. Clarified language. Changed the enacting clause from “An act related to artificial intelligence transparency” to “An act relating to advanced technology”. On January 31, 2024, the Appropriations Committee adopted an amendment and reported the bill favorably as a committee substitute. The committee substitute changed the bill in the following ways: Revised the membership of the advisory council. Revised the responsibilities of the advisory council. Removed the provision requiring a comprehensive annual ransomware report. Added a provision for the advisory council to hold at least one joint quarterly meeting with the Cybersecurity Advisory Council. Added policy recommendations to include in the annual council submission of recommendations to the Governor, the President of the Senate, and the Speaker of the House of Representatives. This analysis is drafted to the committee substitute as passed by the Appropriations Committee.