Facebook has said it will be introducing several features including prompting teenagers to take a break when using its photo-sharing app Instagram, and "nudging" teens if they are repeatedly looking at content considered not conducive to their wellbeing.
News of the controls came in the aftermath of recent damning testimony that the company’s platforms harm children.
California-based Facebook is also planning to introduce new optional controls so that parents or guardians of teenagers can supervise what their youngsters are doing online.
These initiatives come after Facebook announced late last month that it was pausing work on its Instagram For Kids project.
But critics say the plan lacks details and they are sceptical that the new features will be effective.
The new controls were outlined on Sunday by Nick Clegg, Facebook's vice president for global affairs and former British deputy prime minister, who made the rounds on various Sunday news programmes in America where he was grilled about the firm's use of algorithms as well as its role in spreading harmful misinformation ahead of the January 6th Capitol riots in Washington DC.
"We are constantly iterating in order to improve our products," Mr Clegg told Dana Bash on the State Of The Union show.
“We cannot, with a wave of the wand, make everyone’s life perfect. What we can do is improve our products, so that our products are as safe and as enjoyable to use as possible.”
Mr Clegg said that Facebook had invested $13 billion (€11.22 billion) over the past few years to keep the platform safe and that the company had 40,000 people working on these issues.
Whistleblower
The flurry of interviews came after whistleblower Frances Haugen, a former data scientist with Facebook, appeared before Congress this week to accuse the social media platform of failing to make changes to Instagram after internal research showed apparent harm to some teenagers and of being dishonest in its public fight against hate and misinformation.
Ms Haugen’s accusations were supported by tens of thousands of pages of internal research documents she secretly copied before leaving her job in the company’s civic integrity unit.
Josh Golin, executive director of Fairplay, a watchdog group focused on children and the media marketing industry, said he did not think introducing controls to help parents supervise teenagers would be effective, since many youngsters set up secret accounts anyway.
He was also dubious about how effective it would be to nudge teenagers to take a break or to steer them away from harmful content.
He said Facebook needed to show exactly how it would implement the changes and offer research demonstrating that these tools were effective.
“There is tremendous reason to be sceptical,” he said.
He added that regulators needed to restrict what Facebook did with its algorithms.
Mr Golin said he also believed Facebook should cancel its Instagram project for youngsters.
When Mr Clegg was grilled in interviews about the use of algorithms in amplifying misinformation ahead of the January 6th riots, he said that if Facebook removed its algorithms, people would see more hate speech and more misinformation, not less.
Mr Clegg said the algorithms served as “giant spam filters”.
Speaking on a news show on Sunday, Democratic senator Amy Klobuchar said it was time to update children's privacy laws and offer more transparency in the use of algorithms.
“I appreciate that he is willing to talk about things, but I believe the time for conversation is done,” said Ms Klobuchar, referring to Mr Clegg’s plan. “The time for action is now.” – AP