Local expert weighs in on landmark social media decisions
SOUTH BEND, Ind. -- Two landmark trials this week and two losses for social media giants.
In the Los Angeles case, a jury found Meta and YouTube, which is owned by Google, negligent for designing apps that harmed kids and failed to warn them of potential dangers, ordering the companies to pay the plaintiff a total of $6 million in damages.
In New Mexico, a jury found Meta knowingly concealed information about child sexual exploitation on its platforms and misled users about the safety of Facebook, Instagram, and WhatsApp, violating state consumer protection law. The jury imposed a $375 million penalty on Meta.
Lisa Schirch, Ph.D., is the Richard G. Starmann, Sr. Professor of the Practice of Peace Studies in the Keough School of Global Affairs at the University of Notre Dame. Schirch is an expert in social media design.
She says in the early years of social media, big companies like Meta had small teams looking at things like scams and fraud.
“But when they monetize the platforms, that means, when they started putting ads on the platform and really trying to optimize how much time people were spending on the platforms, they sort of really generated a lot more negative content,” said Schirch.
Schirch says companies have resisted changes to their algorithms, which are designed to pull people in.
“Before social media, we were not constantly getting social feedback like this. So this is like, one of the main design features that we don't like, is this notification and the liking, the public liking, so it sets up this like competition, and then people, girls and boys, who don't find that they're getting that kind of social feedback, they feel badly and they get depressed. And this is where the anxious generation, this sort of research on, like, how this entire generation of kids are depressed and anxious, this is where this is coming from,” said Schirch.
Schirch says she wants to see more competition in the social media space, and she believes regulators will have to address design principles.
“There's better, safer ways to design platforms so kids can have that social experience online without all this social competition, without the negatives of what their algorithm is giving them in terms of harmful content,” said Schirch.
Schirch says we’ll see more cases like these in the future, pressing large social media companies.
“These cases are coming before the courts, but we also know in the court of public opinion, both Republicans and Democrats, there's bipartisan support for safety for kids on social media,” said Schirch.
Yesterday's decision in LA could impact the outcome of more than a thousand similar lawsuits.
These trials brought up a longtime legal protection, Section 230 of the 1996 Communications Decency Act.
It broadly protects internet platforms from legal liability over content posted by users.
According to Reuters, plaintiffs were able to get around that protection by arguing the companies harmed users through the design of the platforms, not the content itself. In both cases, Meta claimed it was not liable under Section 230, as did Google in the Los Angeles case, when urging the judges to dismiss the lawsuits. The judges rejected the argument and allowed the cases to move to trial.
“Section 230 protects freedom of speech, and it says that companies cannot be held liable for what is on their platform. What this is really doing is looking at the design of those platforms and saying that the companies are making design choices that are not neutral. They're making design choices that amplify certain content, and so some content, and especially the harmful content, gets amplified and rises to the top of people's news feed. And so this case is unique, because it's really focused on the design of those platforms, saying this is not neutral and this is not protected speech,” said Schirch.