Facebook Fixing Messenger Kids App Flaw
Social Media Firm Sending Out Notifications to 'Thousands' of Parents
Facebook is fixing a design flaw in its Messenger Kids app that allowed children under the age of 13 to enter into group chats with adults without their parents' permission. The social media company has notified "thousands" of parents this week that it's working on correcting the issue, a spokesperson says.
The Facebook Messenger Kids app was launched in 2017 with the purpose of ensuring child safety. It allows children between the ages of 6 and 12 to chat only with users who have been approved by their parents.
A bug in the app's group chat feature, however, bypassed the privacy controls applied to one-on-one chats, allowing children to join group chats that included members their parents had not approved, according to The Verge, which first reported on the issue Monday.
The Messenger Kids app, which is downloadable from all the major online app stores, is available only in the U.S., Canada, Mexico, Peru and Thailand. It has been installed more than 1 million times from the Google Play Store alone since it launched in December 2017. While the app's total number of installations across all platforms is not clear, Facebook sent out notifications to several thousand parents over the last several days.
A Facebook spokesperson tells Information Security Media Group that the issue arose from a technical error.
"We recently notified some parents of Messenger Kids account users about a technical error that we detected affecting a small number of group chats," the Facebook spokesperson says. "We turned off the affected chats and provided parents with additional resources on Messenger Kids and online safety."
Privacy at Stake
This latest controversy could land Facebook in more trouble with government regulators, especially in the U.S. The flaw could mean that the company violated the Children's Online Privacy Protection Act, a 1998 law that gives parents control over what information is collected online from their young children, according to the Federal Trade Commission, which enforces the law.
In 2018, a coalition of 17 advocacy groups filed a complaint with the FTC demanding action against Facebook, claiming that the company was flouting the Children's Online Privacy Protection Act by collecting and storing information on children as young as 5, according to the Parent Coalition for Student Privacy, an advocacy group.
"Any adult user can approve any Messenger Kids account, and testing confirmed that even a fictional 'parent' holding a brand-new Facebook account could immediately approve a child's account without proof of identity," according to a blog from the Parent Coalition for Student Privacy.
On Wednesday, the U.S. Justice Department and the FTC announced a $5 billion privacy settlement with Facebook.
Issues With Instagram
Facebook Messenger Kids is not the only application the company has created for children that's being scrutinized by privacy advocates.
In June, Instagram, a subsidiary of Facebook, courted similar controversy when data scientist David Stier pointed out that the app exposes the contact details and email addresses of minors who hold so-called business accounts (see: Instagram Shows Kids' Contact Details in Plain Sight).
The issue has already set off alarm bells in the European Union, which has been stringent in enforcing the General Data Protection Regulation. Based on Stier's report, Ireland's Data Protection Commission announced Monday that it is assessing the situation (see: Ireland Assessing Minors' Profiles on Instagram).
(Managing Editor Scott Ferguson contributed to this report.)