Facebook Wants to Make Instagram Less Harmful to Teenagers
Following the revelations about Instagram’s harmfulness, Facebook promises it will steer young people toward different content if they watch the same things for too long. The question is whether Facebook wants to protect young people, or mainly itself.
After it became known last week that Facebook is well aware of how damaging Instagram is to the self-esteem of young users, the company is making a series of promises to improve the platform. In US media, Vice President of Global Affairs Nick Clegg says changes are coming.
Algorithms “must be held accountable, if necessary by regulation, so that people can verify that what our systems say they do is actually happening,” Clegg said on the CNN program State of the Union.
In addition, he promises that young users will be better protected. “We are going to introduce something that will make a significant difference. If the system sees that a teen is watching the same content over and over, and it is not good for their well-being, then we push them toward different content,” Clegg told CNN. Teens will also be prompted to take a break from Instagram.
Promising to improve its algorithms after a scandal has been Facebook’s modus operandi for several years now. In 2020, it promised to investigate whether its algorithms were racist. In 2018, one of those algorithms decorated messages of support after an earthquake with balloons and confetti.
Whether it will keep this latest promise is unclear. After the failed coup in the US, for example, it promised to stop recommending politically oriented groups to users. A few weeks later, that turned out not to be the case.
Facebook has also firmly resisted independent research into how its platform and algorithms work. Researchers who attempt such research, for example, are resolutely banned from the platform.
What Facebook usually aims for with such concessions is to prevent legislatures from passing laws that hit the company harder than it would like. In 2019, for example, Zuckerberg promised a greater focus on privacy, yet this year the company gave external contractors access to everyone’s Messenger messages.
The pattern repeats itself: after each new scandal, rules are tightened, more openness is promised, apologies are offered along with assurances that it won’t happen again, and then it happens again.