Instagram’s new child safety features aim to protect minors from predatory behavior online
Concerns about online child exploitation have led Instagram to revamp its recommendation algorithm, particularly for adult-managed accounts that prominently feature children. After mounting criticism and lawsuits, the platform now avoids suggesting these accounts to adults flagged as suspicious, a change intended to strengthen protections for children and teenagers using the app. The algorithm update is part of Meta’s broader push to improve child safety across its social platforms.
Instagram algorithm update limits exposure to suspicious adults
Instagram will no longer recommend adult-managed accounts that prominently feature children to adults who have been flagged as “potentially suspicious,” such as users who have previously been blocked by teenagers or who have shown other questionable online behavior. These adults also won’t be able to leave public comments on posts from such accounts, and will find it harder to discover them through Instagram Search. By proactively reducing visibility between suspected predators and accounts showing minors, Instagram is responding to years of criticism that its recommendation tools inadvertently helped child exploitation networks thrive.
Expanded child safety tools for Instagram users under 18
Teen accounts now benefit from tighter safety controls as Instagram rolls out default protections. These include automatic restrictions on direct messages from unknown users, filters that screen out offensive comments, and new safety notices that show when an account was created. The platform has also added a combined “report and block” button in DMs. These measures are designed to deter potential grooming and help teens assess who they are engaging with, especially when approached by unfamiliar accounts.
Meta responds to criticism over past algorithm misuse
Meta’s safety overhaul follows a 2023 lawsuit accusing Facebook and Instagram of creating a “marketplace for predators.” Reporting by The Wall Street Journal also revealed that Instagram’s algorithm had inadvertently promoted pedophile networks. Although Meta maintains that most adult-run accounts featuring kids, such as those run by parents or talent managers, are harmless, it has acknowledged the need for stricter controls. Restrictions such as disabling subscriptions and digital gifts for these accounts were already in place, and they are now being reinforced with visibility limits and tighter messaging safeguards.