360: Are social media companies legally responsible if their algorithms serve harmful content to minors?


DENVER — Life for the Schotts changed forever on Nov. 15, 2020, when the Logan County family lost their daughter and sister Annalee, 18, to suicide.

“She was our beautiful, kind, softhearted daughter — [and] sadly, she took her life,” said mother Lori Schott. “You know, her brothers and her dad and I have to face it every day. What happened? What went wrong? How did we miss it?”

As Schott told Denver7 in an interview at the family farm, she stumbled upon clues Annalee left behind. In journal entries, Annalee had described the intense inner pain of feeling she didn't measure up to her friends on social media. Screenshots from her TikTok account suggest her feed began serving her a deluge of content about self-harm, self-hate, and suicide.

Just as Lori Schott was making these discoveries, testimony before Congress stopped her in her tracks. Frances Haugen, a former data scientist for Facebook, had testified before a Congressional committee and pulled back the curtain on Facebook and Instagram (both owned by parent company Meta), revealing that the company knew it was causing harm to many users and wasn't taking the necessary steps to fix the problems.

"I just couldn’t believe what I was reading," Schott said.


Haugen said the platforms were profiting off self-harm and self-hate, particularly among teenage girls. She said the company's products "harm children, stoke division, and weaken our democracy," adding that the company's leadership "knows how to make Facebook and Instagram safer, but won’t make the necessary changes because they have put their astronomical profits before people."

Hearing that knocked the wind out of Lori Schott, who was still processing what had led her daughter to such a dark place at such a young age.

But it wasn’t just parents who were blown away by Haugen’s testimony.

It caught the attention of many lawmakers, too.

A rare bipartisan mission

In an era of intense fighting in Washington, the mission of online safety — particularly for kids — has been bipartisan.

The Kids Online Safety Act (KOSA) is one of the proposals to come out of this bipartisan mission. The bill would create a legal “duty of care” for social media platforms, holding them responsible if their algorithms boost material deemed harmful to minors.

Examples of harmful material in the bill include the promotion of self-harm and suicide, eating disorders, substance abuse, and child sexual abuse material. The bill would also require companies to ensure their platforms do not foster addictive use among minors.


The current version of KOSA before Congress this year has dozens of cosponsors, including Colorado Sen. John Hickenlooper. But even though it has bipartisan support, it does not have unanimous support.

In fact, a version of KOSA introduced last year failed to gain enough support to pass. It was reintroduced in May 2023.

Concerns about censorship

The chief concern raised repeatedly by opponents of KOSA has to do with the “duty of care” requirement and what would ultimately be deemed harmful material if the bill becomes law.

Aaron Mackey, free speech and transparency director for the Electronic Frontier Foundation, argued the bill would lead to censorship of information for kids and adults alike, turning it into a political tool rather than a safety measure.

“Trying to access information about just basic sexual health, contraception, and birth control, abortion access — all of those things, right? If you got 10 parents in a room, they might disagree about whether that information is harmful,” Mackey said.

The Electronic Frontier Foundation is just one of many groups that have raised this concern, which, in part, led Congress to abandon the previous version of KOSA. While the bill's current version includes amendments made in response to these concerns, Mackey said his fundamental objection remains.

He argued for a set of different online protections, aimed instead at broader privacy concerns.

“We can definitely talk about things like passing a comprehensive consumer data privacy law that ensures that the information we give to platforms we have control over, and that we don’t give them information without actually agreeing to first give them information,” Mackey said. “And, that will actually help, right? A lot of these downstream harms.… The platforms will not be able to have this well of personal information about us that they can then use to target ads and target content at us as well.”

Fixes aim to address concerns

Alongside the bipartisan coalition of politicians backing KOSA are advocacy groups pushing for it to become law.

Among them is Fairplay, an organization that advocates for stronger protections for kids online. Haley Hinkle, its policy counsel, said the bill would hold tech companies accountable for the content they serve to young people, and that recent amendments address the main objections that have been raised.

“The duty of care in KOSA is a list of, you know, specific, enumerated harms," she said. "It’s not just a sort of free-for-all concept of what’s harmful to kids. It’s really targeting, you know, features and functions that exacerbate mental health issues, anxiety, depression, eating disorders [and] patterns of use that encourage problematic addiction-like behavior.”

Hinkle pointed out that KOSA would not prevent users — of any age — from seeking out content deemed harmful on their own. Rather, it would require social media companies to make sure their algorithms were not feeding said content to kids.

“This isn’t about the existence of content on the internet,” she said. “It’s really about the design choices that are pushing things to kids in their feeds.”

For Annalee, Lori Schott is backing KOSA

As for Lori Schott, the discoveries she has made since losing her daughter to suicide have motivated her to back KOSA and to share her story anywhere she can in hopes of convincing others.

“[The platforms] can tell us — all they want — that they’re guarding our children. They’re not,” Schott said, becoming emotional. “We have to hold them accountable. We cannot see that losing another child is OK.”

The current version of the Kids Online Safety Act was introduced in the U.S. Senate in May 2023, and is currently sitting in the Committee on Commerce, Science, and Transportation.


Editor's Note: Denver7 360 | In-Depth explores multiple sides of the topics that matter most to Coloradans, bringing in different perspectives so you can make up your own mind about the issues. To comment on this or other 360 In-Depth stories, email us at 360@Denver7.com or use this form. See more 360 | In-Depth stories here.