WASHINGTON—Instagram’s top executive appeared Wednesday before a Senate panel investigating possible harm to young people using the photo-sharing app and what its parent company knew.

Instagram head Adam Mosseri faces questioning from the Senate Commerce Committee’s consumer-protection panel on internal company research showing the app can worsen body-image issues for some girls. Disclosure of the research in The Wall Street Journal’s Facebook Files series prompted several previous legislative hearings. Instagram is a unit of Meta Platforms Inc., which also owns Facebook.

In his opening statement, subcommittee Chairman Richard Blumenthal (D., Conn.) said the current mental-health crisis among young people has been exacerbated by big tech. Recent hearings have shown that “big tech actually fans those flames with addictive products and sophisticated algorithms that can exploit and profit from children’s insecurities and anxieties.”

Members of Congress have likened Facebook and Instagram’s tactics to those of the tobacco industry. WSJ’s Joanna Stern explores what cigarette regulation can tell us about what may be coming for Big Tech. Photo illustration: The Wall Street Journal

He said that “our mission now is to do something about it,” adding that the “time for self-policing and self-regulation is over.”

In his prepared testimony, Mr. Mosseri said online safety is “an area our company has been focused on for many years, and I’m proud of our work to help keep young people safe, to support young people who are struggling, and to empower parents with tools to help their teenagers develop healthy and safe online habits.”

Mr. Mosseri endorsed an “industry body” that would determine best practices on at least three crucial issues for social media that draw younger users: how to verify user age, how to design age-appropriate experiences, and how to add more parental controls.

He also expressed support for measures requiring tech companies to adhere to such industry standards in order to qualify for the current federal legal protections that social-media platforms enjoy.

Before the hearing, Instagram said it would implement new tools to protect teens who use the app. They include prompts to suggest users take breaks, controls for parents to curtail their children’s usage, limits on tagging or mentioning teen users, and the ability for users to bulk-delete their own photos, videos and other content.

Those measures, however, might not go far enough to satisfy lawmakers. Sen. Marsha Blackburn (R., Tenn.) said Tuesday the new Instagram tools were an attempt to shift attention from the company’s mistakes.

“Instagram’s repeated failures to protect children’s privacy have already been exposed before the U.S. Senate,” said Ms. Blackburn, the subcommittee’s top Republican. “Now, it is time for action. I look forward to discussing tangible solutions to improve safety and data security for our children and grandchildren.”

Some lawmakers, including Ms. Blackburn, want Instagram to abandon plans to roll out a version tailored to children, similar to YouTube Kids and other products. Mr. Mosseri announced a pause on those plans in September, but said he still believed in the idea as a way to protect preteens who today might use the app despite its minimum required age of 13.

Senators said they are working on legislation to address issues raised at the hearings, but so far talks haven’t yielded proposals with broad momentum.

Sen. Ed Markey (D., Mass.), who helped author a children’s privacy law in the late 1990s, has been meeting recently with Republican senators, including Roger Wicker of Mississippi, the senior GOP member of the Senate Commerce Committee, to discuss a ban on targeted ads directed at children, among other topics, an aide to Mr. Markey said.


On Thursday, a separate Senate subcommittee on communications policy is scheduled to hold a hearing on legislative solutions for “dangerous algorithms” that “manipulate user experiences.”

Wednesday’s hearing of the consumer-protection subcommittee is the latest in a series started in September after the Journal published the Facebook Files. Frances Haugen, a former Facebook employee turned whistleblower, appeared before the panel Oct. 5. The company has disputed her characterization of its culture and decision making, saying it works hard to keep consumers safe and many users benefit from its apps.

Lawmakers later questioned executives from ByteDance Ltd.’s TikTok, Snap Inc.’s Snapchat and Alphabet Inc.’s YouTube about children’s safety online.

Write to Ryan Tracy at ryan.tracy@wsj.com and John D. McKinnon at john.mckinnon@wsj.com

Copyright ©2021 Dow Jones & Company, Inc. All Rights Reserved.


