From The Verge: In a blog post published today, Meta says it is expanding and updating its child safety features, even as reports pile up about how its platforms recommend content sexualizing children.
Over the course of several months, The Wall Street Journal has detailed how Instagram and Facebook serve up inappropriate and sexual child-related content to users. In June, a report detailed how Instagram connects a network of accounts buying and selling child sexual abuse material (CSAM), guiding them to each other via its recommendations algorithm. A follow-up investigation published today shows how the problem extends to Facebook Groups, where there’s an ecosystem of pedophile accounts and groups, some with as many as 800,000 members.
In both cases, Meta's recommendation systems enabled abusive accounts to find each other through features like Facebook's "Groups You Should Join" and Instagram's hashtag autofill. Meta said today that it will limit how "suspicious" adult accounts can interact with each other: on Instagram, they won't be able to follow one another, won't be recommended to each other, and their comments won't be visible to other "suspicious" accounts.