Instagram and its parent company Facebook are beginning to crack down on underage users. Under a new “operational” change to the social media empire’s policies, both Facebook and Instagram will be more proactive in locking the accounts of users suspected of being under the minimum age of 13. Previously, Facebook would only look into accounts specifically reported as being run by younger individuals. Now, the social network will begin locking the accounts of any underage users its moderators find, regardless of why those accounts were initially reported.
Accounts believed to be operated by individuals younger than 13 will be required to provide proof of age, likely by way of a government-issued ID, in order to regain control of the account. While the company does not require users to provide any such identification upon signing up, it’s clear that the honor system in that regard isn’t working out so well.
While Facebook has gradually begun to cater more toward older demographics, this is certainly not the case for Instagram. Anecdotally speaking, at the very least, it seems that today’s youth are eager to begin their social media lives on Instagram, and it’s likely that these younger users make up a significant proportion of the app’s user base. Cracking down on these accounts, then, could have an effect on both user numbers and ad revenue.
The change was precipitated by a documentary put forth by the U.K.’s Channel 4 and Firecrest Films, in which an undercover journalist became a Facebook content reviewer via a third-party firm in Dublin, Ireland. One of the reviewers claimed that their instructions were to ignore users who seemed to be underage, noting, “We have to have an admission that the person is underage. If not, we just like pretend that we are blind and that we don’t know what underage looks like.” The documentary also suggested that right-wing political pages were held to different standards when it came to deletion or suspension.
Facebook responded to these allegations, recently publishing a blog post that said that high-profile Pages and registered political groups are often subjected to another layer of review from actual Facebook employees. However, in the days since, Facebook has added, “Since the program, we have been working to update the guidance for reviewers to put a hold on any account they encounter if they have a strong indication it is underage, even if the report was for something else.”
While this doesn’t mean that all underage accounts will suddenly be shut down, it does mean that if a reviewer sees something, he or she is now far more likely to say something. The 13-year-old minimum is mandated by the U.S. Children’s Online Privacy Protection Act, which requires digital companies to obtain parental consent before collecting data about children.
So why now? The likely answer is that Facebook is facing more scrutiny than ever in the wake of its high-profile Cambridge Analytica scandal (and now Crimson Hexagon), as well as a number of other controversies, including Russian interference in the 2016 election, addiction to the social network, and failure to protect users from fake news and other problematic content. Consequently, Facebook has outlined a new moderation policy, telling TechCrunch, “There are certain forms of misinformation that have contributed to physical harm, and we are making a policy change which will enable us to take that type of content down. We will begin implementing the policy during the coming months.”
Of course, Facebook will have its work cut out for it as it walks the line between protecting free speech and creating a safe and inclusive online environment. We’ll be sure to keep you abreast of its progress.