More Evidence That Meta is Not Protecting Children

Evidence continues to mount that Meta's (formerly Facebook) social media platforms are not safe for kids. Under co-founder and CEO Mark Zuckerberg's leadership, the company has repeatedly avoided making the necessary reforms to protect underage users from the negative effects of its platforms.

While Meta has promised to curate a more age-appropriate experience for adolescent users, recent tests conducted by a Wall Street Journal reporter and an academic researcher found that the company's Instagram platform recommends sexual content to underage users:

Instagram served a mix of videos that, from the start, included moderately racy content such as women dancing seductively or posing in positions that emphasized their breasts. When the accounts skipped past other clips but watched those racy videos to completion, Reels recommended edgier content.


Adult sex-content creators began appearing in the feeds in as little as three minutes. After less than 20 minutes watching Reels, the test accounts’ feeds were dominated by promotions for such creators, some offering to send nude photos to users who engaged with their posts.


Similar tests on the short-video products of Snapchat and TikTok didn’t produce the same sexualized content for underage users.

Further, as reported by the Journal, young female influencers often amass large followings of adult men. Even parental supervision is often not enough to protect children from dangerous exposure.

Earlier this week, the US Surgeon General called for warning labels to be placed on social media platforms. Per the Journal:

Warning labels, similar to those on alcohol and tobacco products, should accompany platforms to “regularly remind parents that social media has not been proved safe,” Dr. Vivek Murthy said in an op-ed for the New York Times Monday.


Murthy cited research showing that social media was an important contributor to a growing mental-health crisis among young people.


“Adolescents who spend more than three hours a day on social media face double the risk of anxiety and depression symptoms, and the average daily use in this age group, as of the summer of 2023, was 4.8 hours,” Murthy said.

National Legal and Policy Center presented a shareholder proposal at Meta's annual meeting last month, asking the company to examine raising the minimum age to use its social media platforms and to put its findings to an advisory vote. NLPC argued that this wholesale change was necessary to protect children from the pervasive harms of social media. Meta's board of directors opposed the proposal. As these new developments show, the company still doesn't take child safety seriously.


Tags: Big Tech, child grooming, Facebook, Mark Zuckerberg, Meta, social media