School boards' lawyer suing social media platforms hopes trial reveals inner workings of algorithms

Studies link social media use among youth to mental health issues


One of the lawyers representing Ontario’s four largest school boards in a lawsuit against three social media companies says he hopes the documentary discovery process will reveal how the platforms’ algorithms were coded, how they were designed to work, and the intentions behind them, in order to better mitigate their psychological impact.

In the lawsuit recently filed against the companies behind TikTok, Snapchat, and Instagram, the school boards allege that the companies’ addictive, mental-health-damaging product designs have caused disruptive and unsafe changes in student behaviour, which teachers are left to manage.

The Toronto District School Board (TDSB), Peel District School Board (PDSB), Toronto Catholic District School Board (TCDSB), and Ottawa-Carleton District School Board (OCDSB) are suing Meta Platforms Inc., Snap Inc., and ByteDance Ltd. The school boards are seeking $4 billion in damages.

They argue that the negligent design of the major social media platforms has rewired student psychology and changed the way children think, behave, and learn, leaving teachers to manage the fallout.

“The algorithmic designs underlying the social media products are causing significant disruption and harm to the education system and to the student population,” says Duncan Embury, lawyer for the school boards. “That takes all kinds of forms.”

This includes reducing the attention and focus required for learning and increasing security-related incidents caused by sexting, cyberbullying, and emotional dysregulation, says Embury, who practises at Neinstein Personal Injury Lawyers. “Things that the schools are encountering on a day-to-day, hour-to-hour, minute-to-minute basis that are correlated, we say, to the algorithmic designs underlying social media products.”

While the algorithms are proprietary, he hopes that through the document and oral discovery process, further information will be forthcoming on how the algorithms were coded, how they were designed to work, and the intention behind them.

A spokesperson for TikTok told Law Times that the company has “industry-leading safeguards such as parental controls, an automatic 60-minute screen time limit for users under 18, age restrictions on features like push notifications, and more.” TikTok’s team of safety professionals constantly evaluates emerging practices and insights to support user well-being, they said.

“Snapchat was intentionally designed to be different from traditional social media, with a focus on helping Snapchatters communicate with their close friends,” said a spokesperson from Snapchat. “Snapchat opens directly to a camera – rather than a feed of content – and has no traditional public likes or comments.

“While we will always have more work to do, we feel good about the role Snapchat plays in helping close friends feel connected, happy and prepared as they face the many challenges of adolescence," the spokesperson said.

Mark Zuckerberg, the CEO of Meta – which owns Facebook and Instagram – recently told a congressional committee that scientific evidence has yet to show a causal link between poor mental health and social media use among young people.

According to Sujata Gupta of Science News, experts have tracked a link between youth social media use and increased depression and anxiety, and a decrease in well-being.

Last October, a coalition of 32 state attorneys general in the US filed a federal lawsuit against Meta, arguing the social media company is contributing to a youth mental health crisis. The lawsuit alleges that the company knowingly designed products intended to addict children and teens.