Mark Zuckerberg is facing new challenges as a former employee turned whistleblower prepares to testify before Congress, potentially revealing more damning allegations about how the social network operates.

Facebook was already under fire after The Wall Street Journal published a series of articles alleging that the company was aware of how harmful Instagram, the photo-sharing app it owns, can be for teenagers.

Despite denying the allegations, Facebook announced in late September that it would postpone the launch of its planned “Instagram Kids” platform, which would have targeted children under 13. The allegations that Facebook knew Instagram could have a negative impact on children surfaced after a whistleblower leaked tens of thousands of internal documents to The Wall Street Journal and law enforcement.

In a 60 Minutes interview, the whistleblower identified herself as Frances Haugen, a 37-year-old data scientist who joined Facebook in 2019.

During the broadcast on Sunday night, Haugen accused Facebook of lying about the progress it is making in combating hate speech, violence, and the spread of misinformation on the platform, and of prioritizing company growth over public safety. She claimed the company made a number of decisions it knew could cause real-world harm in order to increase profits, such as changing its algorithm so that posts with high engagement are pushed into people’s feeds.

According to Haugen, Facebook made this decision in 2018, despite its own research showing that “hateful, divisive, and polarizing content” receives the most engagement.

Haugen said that while Facebook did take steps to control the spread of misinformation in the run-up to the 2020 election, those measures were reversed shortly after the results were announced in November, and the engagement-driven algorithm returned. As a result, she argued, Facebook played a role in the January 6 attack on the Capitol.

“Facebook has realized that if it changes the algorithm to make it safer, people will spend less time on the site,” she explained to 60 Minutes. “They’ll click on fewer ads and make less money.”

Haugen also stated that she “has a lot of empathy” for Zuckerberg because he “never set out to create a hateful platform.”

“But he has allowed choices to be made,” she continued, “and the side effects of those choices are that hateful, polarizing content gets more distribution and reach.” This week, Haugen is scheduled to testify before Congress, where she will argue that the federal government should step in to impose restrictions on Facebook.

Antigone Davis, Facebook’s global head of safety, was grilled by senators on September 30 during a Senate Commerce, Science, and Transportation subcommittee hearing about claims that the company was aware that Instagram could cause harm to children.

Ahead of the hearing, Facebook released annotated slideshows for two internal research reports, “Teen Mental Health Deep Dive,” from October 2019, and “Hard Life Moments,” from November 2019, arguing that “It is simply not accurate that this research demonstrates Instagram is ‘toxic’ for teen girls.”

The annotations even proposed headlines that Facebook considered more accurate for the internal documents than the ones cited by The Journal.

According to news reports, a number of people who worked on the reports were dissatisfied that the company, in an effort to save face, criticized The Journal for basing its reporting on what Facebook called the documents’ limited and imprecise findings.