House to Debate New Bills Aimed at Protecting Children Online

The United States House of Representatives is set to take up a significant package of legislation designed to enhance online safety for children following the holiday recess. The House Energy and Commerce Committee has scheduled a hearing for the week of March 18, 2024, to consider 19 proposed bills, including a revised version of the Kids Online Safety Act (KOSA). The initiative aims to tackle the perceived dangers that social media and emerging artificial intelligence tools pose to young users.

Concerns regarding the safety of children online have prompted lawmakers to take action. In a joint statement, Committee Chair Brett Guthrie (R-Ky.) and Rep. Gus Bilirakis (R-Fla.), Chair of the Subcommittee on Commerce, Manufacturing and Trade, emphasized the necessity of implementing meaningful protections. They stated, “For too long, tech companies have failed to adequately protect children and teens from perils online.” The legislators hope that this legislative effort will be a crucial step toward safeguarding youth in a rapidly evolving digital landscape.

There is bipartisan interest in Congress in addressing the dangers children face online. However, reaching a consensus on the specifics of regulation has proven challenging, especially in light of concerns surrounding free speech. The rise of artificial intelligence and its growing significance to the U.S. economy has further complicated discussions, with lawmakers wary of imposing rules that could hinder technological advancement.

Among the bills under consideration are measures that would regulate app stores, impose limitations on chatbots, and raise the minimum age for social media usage. The revised KOSA is particularly noteworthy, as it has undergone modifications to address previous objections from Republican lawmakers and to navigate potential legal challenges. Notably, the updated version omits the “duty of care” language that would have held companies legally accountable for minimizing harm from their services. Instead, it requires platforms to implement “reasonable policies, practices, and procedures” to manage issues such as threats of violence, sexual exploitation, scams, and the distribution of illegal substances.

Another significant proposal included in the legislative package mandates that app stores verify the ages of users. While some states have enacted similar measures, these have faced legal challenges and raised concerns about implementation and data privacy. The tech industry is divided on the age verification issue, with app stores arguing that developers should take responsibility, while social media companies contend that app stores are better positioned to enforce such regulations.

Additional bills propose banning children under the age of 16 from holding social media accounts, limiting access to features that allow disappearing messages, and restricting companies’ abilities to collect data and conduct market research involving minors. The path forward for these bills remains uncertain, as lawmakers continue to explore various approaches to regulation. The removal of the “duty of care” provision may also create new hurdles for KOSA, even though the bill previously garnered enough support in the Senate to overcome a filibuster, support that could hold if legislators maintain their stance from the previous year.

The scrutiny faced by social media companies has intensified, particularly as research highlights the negative impact of these platforms on children’s mental health. Whistleblower testimonies and court documents have suggested that companies, including Meta, knowingly concealed the harmful effects of their platforms to retain younger users. Recently unsealed court filings alleged that executives at Meta compared Instagram to a drug and actively worked to obscure its detrimental consequences. Although Meta has denied these claims, asserting that they represent cherry-picked quotes, the company remains a focal point for criticism regarding its handling of online youth safety.

In response to mounting concerns, Meta has introduced several measures aimed at enhancing protections for underage users, including the creation of Teen Accounts. However, these initiatives have not significantly alleviated apprehensions among parent groups and lawmakers. Senator Marsha Blackburn (R-Tenn.), a prominent advocate for KOSA, expressed frustration over the situation, stating, “Meta’s exploitation of children and adults is by design. It’s time to pass the Kids Online Safety Act and stand up for the next generation. This is what happens when Congress fails to act.”

As the House prepares to deliberate on this critical legislative package, the outcome remains uncertain. The upcoming hearings will play a crucial role in determining which measures will advance and how they will shape the future of online safety for children in the United States.