Prohibition of certain social media algorithms that target children
If passed, SF2101 would amend the Minnesota Statutes to place stringent restrictions on how social media platforms operate within the state. The bill aims to safeguard children from potentially harmful content and to give guardians visibility into their online activity. It requires a minor to obtain permission from a legal guardian before creating a social media account, increasing parental control and accountability, and violations could carry significant penalties, reinforcing compliance among platform operators.
SF2101 seeks to enhance consumer protection by restricting the use of social media algorithms that target content at individuals under the age of 18. The bill prohibits social media platforms with more than one million users in Minnesota from using algorithms to tailor user-generated content for minors, and it imposes liability on operators that knowingly allow minors to receive tailored content without parental consent, giving guardians greater oversight of their children's online interactions.
Notably, the bill has generated debate about the balance between consumer protection and the operational freedom of social media companies. Supporters emphasize the need to shield children from harmful influences on social media and the importance of child safety online. Critics argue that such restrictions could stifle innovation in the tech industry and limit minors' access to beneficial content, and some experts question the practicality of enforcement, particularly how platforms would verify guardianship and judge the age-appropriateness of content.