Tech group: Michigan will be sued over efforts to restrain social media
A package of bills aiming to rein in how young Michiganders can interact with social media is drawing resistance from NetChoice, a trade group representing some of the sector’s biggest companies. (Photo courtesy of Bridge Michigan)
LANSING — Michigan lawmakers, seeking to rein in the influence of social media and artificial intelligence among children, may also draw the state into a nationwide legal battle with major technology companies.
Michigan bill sponsors and advocates for regulating social media companies cast the measures as a necessary public health intervention to protect children, while trade groups representing the tech sector and civil liberties organizations suggest the legislation could infringe on free speech and invade privacy.
At a state Senate committee hearing March 4, bereaved parents, advocates and teens pushed lawmakers to act on a package of bills, citing the alleged harm unfettered access to social media had wrought on youth mental health.
But NetChoice, an influential industry association that counts Amazon, Google, Meta and OpenAI among its members, also appeared at the hearing with a warning for lawmakers.
A bill aimed at blocking certain algorithms, late-night notifications, autoplaying videos and engagement-related features for children is “unconstitutional and will expose Michigan taxpayers to costly litigation,” Bartlett Cleland, NetChoice’s general counsel, said in committee testimony.
Nancy Costello, a First Amendment expert and professor at the Michigan State University College of Law, challenged Cleland’s assertions.
“They are product designs aimed at maximizing business revenue for social media companies,” Costello said in her testimony. “This is not about speech. … This is about product liability law.”
Cleland, for his part, found Costello’s argument “shocking” and argued “it doesn’t hold water under the First Amendment.”
Costello said she “wouldn’t be surprised” if NetChoice sues in response because “they sue everybody else.”
For minors, the package aims to:
– Ban the use of “addictive algorithms” and design elements like infinite scroll, which sponsors say are tailored to keep users on the site.
– Require sites to place minors in the most stringent privacy settings by default.
– Ban minors from accessing chatbots driven by large language models, often called AI, if the chatbots encourage in conversation any of a litany of damaging behaviors, such as self-harm or suicide.
– Require age verification or estimation for users to access AI chatbots that could produce adult content.
Children could still access some of those features, provided they get parental permission.
While it’s still early days for the legislative package, which would have to pass the Democratic-controlled state Senate and win approval in the GOP-majority House before Gov. Gretchen Whitmer could sign it, many facets of the package have been the subject of lawsuits after passing in other states. That indicates Michigan could become the latest front in a multistate battle over lawmakers’ ability to regulate aspects of the internet and social media they deem harmful to children.
Matt Hall, the Republican House Speaker, hasn’t signaled he’d support the package, but Rep. Mark Tisdale, a Rochester Republican who introduced his own social media age verification bill last year, said he would likely support the legislation and believes his colleagues would, too.
Lawsuits from NetChoice in multiple states have had mixed results in court. Cleland noted that laws with what he called “identical constitutional defects” in California, Ohio and Arkansas have seen aspects at least temporarily blocked by federal courts, and he told lawmakers that “Michigan’s bill will face the same fate.”
Lawmakers behind the bills, however, remain undeterred by the threat of litigation.
“It’s a common tactic used by organizations, especially those that have a lot of money and resources, anytime there’s regulation that they don’t want to have to deal with,” said Sen. Kevin Hertel, a St. Clair Shores Democrat who’s a sponsor of the package.
Hertel likened NetChoice’s aggressive approach to that of cigarette companies that, decades ago, resisted policies seeking to discourage smoking.
“I think it’s just another example of big tech companies willing to take any avenue so they don’t have to have any regulations or guardrails,” he added.
Costello noted “the courts are just catching up to this” and “there’s not a plethora of caselaw,” but cited two examples in which social media algorithms aimed at maximizing engagement have seen legal scrutiny.
A provision in the California law restricting minors’ ability to access personalized feeds has survived a lawsuit from NetChoice, Costello said, because a federal appeals court chose to draw a distinction between different sorts of algorithms.
“An algorithm that responds solely to how users act online, merely giving them the content they appear to want, probably is not” speech protected by the First Amendment, Costello said.
Provisions in California’s law blocking adults from interacting with minors on some social media sites have thus far survived a lawsuit from NetChoice. The Ninth Circuit federal appellate court ruled shielding children from viewing “likes” or comment totals was “likely unconstitutional,” however, blocking that provision.
Social media companies are also largely protected from liability for the content users post on their platforms, thanks to a provision known as Section 230 in the 1996 federal Communications Decency Act. But Costello argued that doesn’t always shield companies from regulations over third-party content they feed to users.
In 2024, after a 10-year-old girl died from asphyxiation after coming across a “blackout challenge” through TikTok’s recommendation algorithm, a federal appellate panel ruled Section 230 couldn’t shield the company from litigation.
TikTok “didn’t just host it. They chose to use the algorithm to feed it, and that was a first-party choice,” Costello said.
Age-verification requirements in particular have appeared vulnerable to legal challenges. In December 2025, a federal judge blocked a Louisiana law requiring platforms to verify the age of users, granting a permanent injunction against it on constitutional grounds.
“It’s a very, very difficult needle to thread to protect our First Amendment rights with regards to trying to protect kids to access things,” Kyle Zawacki, the legislative director for Michigan’s American Civil Liberties Union chapter, previously told Bridge Michigan.
The US Supreme Court hasn’t directly taken up the issues yet, but Costello suggested that, with enough states enacting reforms, justices could be persuaded to review the issues.
“I think it’s a good idea to (pass the legislation), because that’s what creates law,” Costello added. “That’s what creates regulation.”