Apple Slams Facebook for Monitoring App Given to Minors
Facebook's Internal iOS Apps Break After Apple Revokes Developer Certificate
Apple has blocked a Facebook app that used root certificate access to collect sensitive mobile phone activity. The move comes after reports that Facebook paid individuals, including some minors, up to $20 per month in exchange for the ability to watch everything they did on their devices.
The data collection was handled by a Facebook-developed app called Facebook Research, TechCrunch first reported. The app is designed to collect granular insight into how those between 13 and 35 years old use their phones, aiding Facebook's business development.
But the app wasn't distributed in Apple's App Store. Instead, it was distributed outside the App Store and installed on devices via an Apple enterprise digital certificate. That allowed the app to be "side-loaded," or installed without being vetted via the normal approval process that Apple uses for its App Store.
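For illustration, the distribution type of an iOS app can be inferred from the provisioning profile bundled inside it: enterprise ("in-house") profiles carry a ProvisionsAllDevices flag that App Store builds lack, which is what lets such an app install on any device without App Store review. A minimal sketch of checking that flag, using a simplified, hypothetical plist payload (real embedded.mobileprovision files are CMS-signed binaries that would first need to be unwrapped):

```python
import plistlib

# Hypothetical, simplified provisioning-profile payload for illustration only.
# Real profiles are CMS-signed; this is just the inner XML plist structure.
SAMPLE_PROFILE = b"""<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
 "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Name</key><string>Example In-House Profile</string>
    <key>ProvisionsAllDevices</key><true/>
</dict>
</plist>
"""

def is_enterprise_profile(raw: bytes) -> bool:
    """Return True if the profile is an in-house (enterprise) distribution
    profile: those set ProvisionsAllDevices, allowing installation on any
    device without App Store review."""
    profile = plistlib.loads(raw)
    return bool(profile.get("ProvisionsAllDevices", False))

print(is_enterprise_profile(SAMPLE_PROFILE))  # prints: True
```

On a jailbroken device or an unpacked .ipa, the same check could be run against the app's embedded.mobileprovision after stripping the CMS signature; the key name shown is the one Apple's profiles actually use, but the surrounding payload here is invented for the sketch.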
Apple says that move violated its developer rules and it has revoked Facebook's enterprise digital certificate.
Experts say the Facebook Research app appears to vary little from another Facebook data-snooping VPN app called Onavo Protect. Facebook stopped distributing that app last year after Apple banned apps that collect data that's not pertinent to the app's direct purpose.
Apple's enterprise certificates are only supposed to be used to deploy apps within an organization, and companies such as Facebook rely on them to distribute internal, employee-only apps. As a result, Apple's revocation of Facebook's enterprise certificate has also broken the social network's internal employee apps.
Apple states: "Facebook has been using their membership to distribute a data-collecting app to consumers, which is a clear breach of their agreement with Apple. Any developer using their enterprise certificates to distribute apps to consumers will have their certificates revoked, which is what we did in this case to protect our users and their data."
As a result of Apple revoking its enterprise certificate for Facebook, the social network can no longer use other employee-only internal iOS apps. Facebook says it is working with Apple to resolve the situation.
Facebook also disputes aspects of TechCrunch's report, although it says it has now disabled the iOS version of Facebook Research. The social network says it didn't believe it was violating Apple's rules for enterprise certificate usage.
Google Disables Screenwise Meter
Meanwhile, the controversy has broadened to encompass Google, which now says it has disabled an app called Screenwise Meter, which collects an individual's internet usage data. Screenwise Meter also relied on an Apple enterprise certificate to get installed on a consumer's device.
"The Screenwise Meter iOS app should not have operated under Apple's developer enterprise program - this was a mistake, and we apologize," a Google spokesman said on Thursday. "We have disabled this app on iOS devices."
Market Researchers Distributed Facebook App
The Facebook Research app was distributed by market research firms under contract with Facebook, which maintains that the firms obtained consent from all users before the app was installed.
Less than 5 percent of the users in the market research program are teens, and all signed parental consent forms, Facebook says.
"Key facts about this market research program are being ignored," the company says. "Despite early reports, there was nothing 'secret' about this; it was literally called the Facebook Research App. It wasn't 'spying' as all of the people who signed up to participate went through a clear on-boarding process asking for their permission and were paid to participate."
Still, one of the key issues that has dogged Facebook is whether users were fully cognizant of the amount and type of data they consented to have collected.
Facebook Research is a VPN app that routes all of a user's traffic via Facebook's servers. Because users must also install the app's root certificate on their device, Facebook gains visibility into traffic that would normally be protected by TLS/SSL. TechCrunch reports that the app would have allowed Facebook to continuously collect private messages within social media apps, search queries, email activity, app activity, location data and web browsing activity.
The iOS version of Facebook Research was distributed outside of Apple's App Store by three companies that beta test mobile apps: Applause, BetaBound and uTest, TechCrunch reports.
That has raised questions over whether Facebook intended to circumvent Apple's ban on Onavo Protect. But Facebook maintains that the iOS version of Facebook Research launched in 2016, long before Apple changed its app rules.
Same App, Different Skin
Will Strafach, a mobile app expert who is founder and CEO of Guardian Mobile Firewall, wrote on Twitter that Facebook Research represented "the most defiant behavior I have ever seen by an App Store developer ... I still don't know how to best articulate how absolutely floored I am by Facebook thinking they can get away with this."
In a series of tweets, Strafach posted his technical analysis of the iOS version of Facebook Research. He says it appears to be the same as Onavo Protect, with the very same function and selector names but a different user interface.
"they didn't even bother to change the function names, the selector names, or even the 'ONV' class prefix. it's literally all just Onavo code with a different UI. pic.twitter.com/ruqH69pUfq" — Will Strafach (@chronic), Jan. 29, 2019
Revelations about the Facebook Research app are likely to intensify the privacy scrutiny of Facebook. The company is still facing probes over the failures that led to Cambridge Analytica, the now-defunct voter-profiling firm, improperly obtaining personal information for 87 million Facebook users worldwide (see: Facebook Sued in U.S. Over Cambridge Analytica).
Last October, the U.K.'s Information Commissioner's Office imposed the maximum possible privacy fine on Facebook for failures that facilitated the Cambridge Analytica scandal.
In the U.S., the Federal Trade Commission is reportedly close to finishing its investigation. Facebook has been under the FTC's close scrutiny since 2011, when the social network agreed to a strict monitoring regime after the regulator found that it had shared individuals' personal data without consent (see: Report: Federal Trade Commission Weighs Facebook Fine).
Facebook has also faced tough questions over its privacy controls and whether users are fully informed and understand what the company is doing with their data. That's a key requirement of the EU's General Data Protection Regulation, which went into full effect on May 25, 2018. The tough new privacy law has been reshaping expectations as well as legal requirements in multiple countries surrounding how companies manage and protect personal data. It also empowers Europeans to alert regulators if they feel that their personal data has been misused.
On the day GDPR went into effect, the privacy rights group None of Your Business filed complaints against Facebook, Instagram, WhatsApp and Google. The organization alleges that Facebook violates GDPR by forcing users to consent to its privacy policies or else be blocked from using the service.
Facebook is not the only firm that's under increasing scrutiny for its data collection practices. Earlier this month, France's data protection regulator, CNIL, fined Google $50 million, which is the largest fine handed out so far under GDPR. The regulator contended that Google does not transparently communicate the scope of data processing used for targeted advertisements and leaves consumers uninformed about how their personal data gets used (see: Google Faces GDPR Complaints Over Web, Location Tracking).
(Executive Editor Mathew Schwartz also contributed to this story.)