A recent Wall Street Journal investigation revealed several internal findings that Facebook was aware of but either ignored or was less than transparent about, including the harmful effects that its products – most notably Instagram – can have on users' mental health.
Facebook seemingly responded by putting a pause on the release of "Instagram Kids," a version of the app in development that the company argued would create a safe experience for users under the age of 13.
In a post on Instagram’s company website, Adam Mosseri, the head of Instagram, explained the reasons for developing the app as well as for holding off on an imminent release.
"We firmly believe that it’s better for parents to have the option to give their children access to a version of Instagram that is designed for them – where parents can supervise and control their experience – than relying on an app’s ability to verify the age of kids who are too young to have an ID," the statement said.
"Our intention is not for this version to be the same as Instagram today. It was never meant for younger kids, but for tweens (aged 10-12)," the statement explained. "It will require parental permission to join, it won’t have ads, and it will have age-appropriate content and features."
"The list goes on."
During an appearance on "Good Morning America," Mosseri did not say whether the recent "Facebook Files" investigation influenced the company’s decision to pause the project. He maintains that there’s a version of Instagram that can help rather than harm teens, and he wants to do what he can to ensure that version eventually sees release.
"I still firmly believe it’s a good thing to build a version of Instagram that’s meant to be safe for tweens, but we want to take the time to talk to parents and safety experts and get to more consensus about how to move forward," Mosseri told GMA’s Craig Melvin.
Some of the features the company would look to implement include "nudge," which lets someone know they’ve fixated on certain topics; "take a break," which pauses activity for a time during a big life moment, such as a breakup or a move to a new school; and controls over who can comment on a user's content.
The central feature would involve parents in monitoring and supervising their child’s experience on the app.
"The idea would be that parents can see what their kids are doing, they can manage how much time their kids spend on the app, and they can possibly approve who the kids message and who they can follow," Mosseri explained.
Mosseri noted that the app was still "a few months from launching," which makes a thorough review of the app and its data all the more critical. He argued that Instagram has regularly responded to criticism and introduced new features, but admitted the process has not been perfect.
The "Facebook Files" investigation has made a splash with consumers and lawmakers alike: Congress has asked both Facebook leadership and a whistleblower, who released some of the internal data to the Journal, to testify before the Senate Commerce Committee's consumer protection panel.
A source familiar with the matter confirmed to FOX Business that the whistleblower behind the leaked documents in the report has agreed to work with Congress, and Facebook also confirmed that the company will send its global head of safety, Antigone Davis, to testify.
Facebook leadership has already testified before Congress multiple times – most recently earlier this year, alongside Google and Twitter, on the spread of misinformation across social media platforms.