WASHINGTON—Frances Haugen, who gathered internal documents showing how Facebook Inc.’s Instagram app led to depression and anxiety in many teenage girls, is set to appear Tuesday before a Senate panel that is looking to toughen the law protecting children online.
The Instagram disclosures, along with other documents Ms. Haugen gathered while employed at Facebook, formed the foundation of The Wall Street Journal’s Facebook Files series.
Ms. Haugen “has energized and emboldened the effort to protect children and to hold Facebook accountable,” Sen. Richard Blumenthal (D., Conn.), the chairman of the Senate consumer protection subcommittee, said Monday. The hearing will be a “breakthrough day in the fight against Facebook’s destructive harm and concealment,” he said.
Facebook has disputed the characterization of the documents by the Journal and by Mr. Blumenthal and other members of his committee, who questioned Facebook executive Antigone Davis about the documents last week.
“It is not accurate that leaked internal research demonstrates Instagram is ‘toxic’ for teen girls,” Facebook said in a statement. “The research actually demonstrated that many teens we heard from feel that using Instagram helps them when they are struggling with the kinds of hard moments and issues teenagers have always faced.”
The Journal has defended the series, saying Facebook hasn’t identified any factual errors.
Ms. Haugen, who resigned from Facebook in April, was a product manager hired to help protect against election interference on Facebook. She said she acted because she was frustrated by what she viewed as Facebook’s lack of openness about the platforms’ potential for harm and its unwillingness to address its flaws.
Ms. Haugen has sought federal whistleblower protection at the Securities and Exchange Commission. She is also interested in cooperating with state attorneys general and European regulators.
The Instagram disclosures have built momentum to update the Children’s Online Privacy Protection Act, a 1998 law governing websites that gather data on children. The law, known as Coppa, has been widely criticized as inadequate in the age of social media.
“Updating Coppa will be essential,” Sen. Maria Cantwell (D., Wash.), who chairs the powerful Commerce Committee, said at last week’s hearing.
Critics say two provisions of the law as written create enforcement challenges for the Federal Trade Commission.
One is its requirement that a platform operator have “actual knowledge” that it is collecting personal information of children before the law’s toughest restrictions kick in. The other is its age cutoff—only children under 13 get its strongest protections.
Republicans and Democrats alike have supported updating the law.
“I have three daughters and when I read The Wall Street Journal story I was shocked but in some ways not surprised,” said Sen. Dan Sullivan (R., Alaska) at last week’s hearing. “I personally believe that we’re going to look back like 20 years from now and see the massive social mental health challenges that were created by this era…we’re going to go, ‘What in the hell were we thinking?’ ”
In addition to the Instagram research, Ms. Haugen released other internal documents that could come up for discussion at Tuesday’s hearing, including documents showing how the company’s moderation rules favor elites, how its algorithms foster discord, and how drug cartels and human traffickers openly use its services.
In addition, it is possible that lawmakers—particularly Democrats—will focus on what role Facebook may have played in the riot that engulfed the U.S. Capitol on Jan. 6.
In a statement this week, Mr. Blumenthal promised more hearings “documenting why Facebook and other tech companies must be held accountable—and how we plan to do that…We must consider stronger oversight, effective protections for children and tools for parents, among the needed reforms.”
In its statement, Facebook said that its “teams have to balance protecting the ability of billions of people to express themselves openly with the need to keep our platform a safe and positive place.”
“We continue to make significant improvements to tackle the spread of misinformation and harmful content,” the company said. “To suggest we encourage bad content and do nothing is just not true.”
Write to John D. McKinnon at john.mckinnon@wsj.com