The testimony will mark Mosseri’s first appearance before Congress. It also makes him the highest-profile executive from Meta, the social media company formerly known as Facebook, to agree to testify since Facebook whistleblower Frances Haugen leaked hundreds of internal company documents. Some of those documents showed that the company’s own researchers had found Instagram can damage young users’ mental health and body image, and can exacerbate dangerous behaviors such as eating disorders.
“After bombshell reports about Instagram’s toxic impacts, we want to hear straight from the company’s leadership why it uses powerful algorithms that push poisonous content to children driving them down rabbit holes to dark places, and what it will do to make its platform safer,” Sen. Richard Blumenthal, who chairs the Subcommittee on Consumer Protection, Product Safety and Data Security, said in a statement to CNN Business. Blumenthal previously called on Mosseri or Meta CEO Mark Zuckerberg to testify about Instagram’s impact on kids.
Mosseri, a longtime Facebook exec who has headed Instagram since 2018, confirmed his plan to testify in a video posted to his Twitter account Wednesday. Mosseri said the company and lawmakers “have shared goals.”
“We all want young people to be safe when they’re online so I look forward to these conversations,” he said, “and you’re going to hear more from us about safety, not only at Instagram but at Meta more broadly.”
In a statement to CNN Business, Meta spokeswoman Dani Lever said: “We continue to work with the committee to find a date for Adam [Mosseri] to testify on the important steps Instagram is taking.”
The New York Times was first to report Mosseri had agreed to testify.
The announcement of the hearing comes amid regulatory pressure on Meta and Instagram. Last week, a bipartisan group of state attorneys general launched an investigation into the potential harms of Instagram for children and teens. (Meta has said allegations made by the attorneys general are false.) Ohio Attorney General Dave Yost also sued Meta for allegedly misleading the public about its algorithm and the harms its apps can cause to users, a suit the company says is without merit.
The Wall Street Journal first reported in September on what the company’s internal documents and research show about Instagram’s impact on young people. The report said Facebook knew Instagram was “toxic” for teen girls. Meta has pushed back on the Journal’s reporting, and said its apps do more good than harm.
In September, lawmakers grilled Facebook’s head of global safety, Antigone Davis, at a hearing on Instagram’s effects on kids. Although Davis said the company was “looking for ways to release more research” that she suggested might paint a different picture of the platform, she was criticized for not more firmly committing to release additional internal information about it.
The company announced it was pausing plans to develop a version of Instagram designed for kids in late September, amid the fallout from the Journal report.
Instagram has also pointed to its other efforts to develop features to protect young people, including a “Take a Break” reminder, which was announced in October amid intense scrutiny. In his Twitter video Wednesday, Mosseri also discussed tools such as “hidden words,” which gives users more control over what people can say in their direct messages and comments. He added that the company is also building controls for parents to limit how much time their kids spend on the app.
™ & © 2021 Cable News Network, Inc., a WarnerMedia Company. All rights reserved.