Utah escalates war on social media, calling Snapchat a playground for predators

The state’s fourth lawsuit against Big Tech claims Snap’s addictive design, hidden data harvesting and AI chatbot put children in predators’ crosshairs.

Utah has entered another legal battle against social media, this time going after Snap Inc., owner of the social media platform Snapchat, which is especially popular among younger generations.

“This, of all the cases, this one really matters,” Utah Attorney General Derek Brown told the Deseret News, “because this is where kids are.”

This lawsuit is the fourth brought by the Utah Attorney General's office and the Utah Department of Commerce, with the support of Utah Gov. Spencer Cox, in their efforts to safeguard children from online predators and social media addiction.

The state leaders are bringing three specific allegations against the photo/video platform, per the press release:

  • The app’s platform is designed to be addictive. It has harmful features embedded into its platform to “exploit children’s psychological vulnerabilities for financial gain, constituting an unconscionable business practice under state law.”
  • It’s marketed to parents and children as a secure alternative to other social media apps, deceiving users with claims of protection it does not deliver.
  • The app violates “the Utah Consumer Privacy Act by not informing consumers about its data collection and processing practices and failing to provide users or their parents with an opportunity to opt out of sharing sensitive data, such as biometric and geolocation information.”

Snap’s platform is unique among major social media platforms in the way content is shared. Since 2011, users have shared timed photos and videos “designed to delete by default,” according to Snapchat.

“This, along with other addictive and experimental features, induce Utah children to compulsively check the app,” the lawsuit claims. “Snapchat’s vanishing design feature has made it a favored tool for drug dealers and sexual predators targeting children” and gives “teens a false sense of security, leading them to believe their photos and messages disappear forever after being viewed, which encourages them to share riskier content” that could then be potentially exploited.

Brown said his office’s priority is holding these companies accountable, a joint effort among state leaders.

“We will do everything we can using the legal system to incentivize and encourage companies to take steps to protect kids,” he said. “And parents need to be very mindful of what’s taking place on social media, because a lot of the drug dealing, the extortion, the sexting, and a lot of the really problematic things that are taking place right now with our kids is focused not just on social media, but on Snapchat.”

Margaret Busse, the executive director of Utah’s Department of Commerce, told the Deseret News that Utah has seen its own share of cases in which adults prey on children via Snapchat.

“In 2021, a 27-year-old man from Salt Lake City groomed three young girls between the ages of 12 to 14 on Snapchat. He ultimately kidnapped them and sexually assaulted them,” Busse said. “In March 2023, a South Jordan man used a teen Snapchat account to lure a 13-year-old to his car, where he sexually assaulted her. In October 2024, a Riverton man was accused of sexually assaulting multiple victims, including minors that he found on Snapchat throughout Salt Lake City.”

And in 2023, Snapchat introduced the “My AI” feature, which has only heightened the safety concern, Busse said.

During the state’s investigation, per the lawsuit, tests showed that the AI chatbot, which is powered by ChatGPT, gave a 15-year-old advice on how to hide the appearance of alcohol and marijuana from parents and even gave a 13-year-old recommendations on how to “set the mood” for a romantic night with a 31-year-old.

The My AI feature can only be removed from the app with a paid subscription, so parents need to be aware that children face potential dangers from both real-world and digital predators.

“If I’m the head of this company, and I understand how much my product is harming kids and how unsafe it is, why would I keep doing this?” Busse said.

“This is a choice companies make. It is not inevitable,” she said. “They could design a product with a very different business model, with very different features, that doesn’t have to be exploitative of our kids.”

Source: Utah News