NLPC Proposal: Meta Has Failed to Protect Children

National Legal and Policy Center presented a “Report and Advisory Vote on Minimum Age for Social Media” proposal at Meta Platforms, Inc.’s (formerly Facebook) 2024 annual meeting of shareholders today.

The proposal criticizes the company for failing to protect children from the numerous harms caused by its platforms, including addiction, mental health decline, and increased risk of grooming and exploitation. Data collected by a variety of groups, including Meta’s internal research team, show that adolescents are particularly vulnerable to these effects.

Unfortunately, the company’s executive team, led by co-founder Mark Zuckerberg, has failed to increase protections for underage users. The proposal urges the company to examine the effects of raising the minimum age to use its social media platforms, disclose its findings to shareholders, and hold an advisory vote on raising the minimum user age.

The company’s board of directors opposed our proposal, as explained on pages 98-100 of its 2024 proxy statement. NLPC’s response to the board’s opposition statement was filed with the Securities and Exchange Commission last month.

Presenting the proposal at the meeting was Luke Perlot, associate director of NLPC’s Corporate Integrity Project. His three-minute remarks can be heard here, and a transcript follows:

Good morning,


Led by Mark Zuckerberg, Meta, formerly known as Facebook, has grown into a global behemoth that dominates digital communication. Yet, this immense power comes with profound responsibilities, particularly towards younger users who are most vulnerable to the platform’s pitfalls. Despite growing evidence and public concern, there has been a disconcerting lack of decisive action from Mr. Zuckerberg and his team to address the severe risks these platforms pose to children.


In 2023, the U.S. Surgeon General issued a stark warning about the risks social media poses to the mental health of young people. These include addiction, exposure to inappropriate content, and a heightened risk of exploitation.


Meta’s platforms are scientifically optimized to maximize engagement through notifications, likes, and infinite scrolling, features that trigger the release of dopamine, a neurotransmitter linked to pleasure and addiction. According to the American Psychological Association, these features may not be suitable for adolescents, who are more susceptible to addiction.


Further, the Company’s internal researchers have been aware for several years that Instagram may negatively impact the mental health and wellbeing of young users, particularly females. Yet Meta has done nothing.


Young users are also at risk of cyberbullying. According to the Pew Research Center, about 46% of young people between the ages of 13 and 17 have been bullied online.


The FBI has also warned of the growing problem of sextortion, a form of blackmail in which a perpetrator threatens to release explicit images of the victim – who is often underage – unless the victim acquiesces to certain demands, often for more images, sexual favors, or money.


Moreover, the company has found itself at the center of numerous lawsuits, including allegations that it failed to prevent the spread of child sexual abuse material—with Meta platforms generating 95% of the 29 million CSAM reports received by the National Center for Missing & Exploited Children in just one year.


Meta, under Mr. Zuckerberg’s stewardship, has been notably slow in implementing effective safeguards. In 2021, he turned down a proposal brought by members of his senior leadership to expand the Company’s child safety and well-being team. The Company may try to hide behind its existing safeguards all it wants, but clearly they are not working.


Meta should consider another option – raising the minimum age to use its platforms. This is the simplest and most obvious way to protect children and adolescents from the aforementioned risks. Additionally, raising the minimum age would protect Meta from further legal and reputational damage of the kind it has already incurred by exposing children to such risks.


For these reasons, we encourage our fellow shareholders to vote FOR Proposal Twelve.

Read NLPC’s shareholder proposal for the Meta annual meeting here.

Listen to Luke Perlot’s presentation of the proposal at the meeting here.

Read NLPC’s response, filed with the SEC, to the company’s opposition to our shareholder proposal, here.



Tags: Big Tech, child grooming, Facebook, Mark Zuckerberg, Meta, social media