
Google Forced to Reveal Exposure of Private Data

Consumer Google+ Set for Shutdown; Google Hid the Data-Exposing Bug
Google's corporate headquarters in Mountain View, California (Photo: Google)

Google says a bug in an API for its Google+ social networking service exposed personal details for about 500,000 accounts, but it believes the data wasn't misused.

Google patched the bug in March but chose not to publicly disclose the problem, based on a recommendation from its privacy and data protection office, writes Ben Smith, a Google fellow and vice president of engineering, in a blog post.

But the company was forced to acknowledge the incident after The Wall Street Journal on Monday reported on the data exposure. Citing anonymous sources and internal documents, the publication reported that Google feared it would be subjected to regulatory scrutiny and reputational damage if the details of the bug became known.

Google's decision not to disclose the data leak is likely to raise eyebrows because technology companies have faced increasing pressure and regulatory scrutiny over their data handling and privacy practices.

In its blog post about the data exposure, Google didn't specify where the affected users may have been based, or if it had alerted any countries' privacy regulators to the data exposure.

A Google spokeswoman declined to comment on the location of the users whose data was potentially exposed. "Every year, we send millions of notifications to users about privacy and security bugs and issues," she tells Information Security Media Group. "Whenever user data may have been affected, we go beyond our legal requirements and apply several criteria focused on our users in determining whether to provide notice."

In the case of this data exposure, she says that Google's internal review found that the company would not be able to identify which users to inform. The review also found no "evidence of misuse" and no actions that developers or users could take in response, so the company chose not to issue a notification.

"The review did highlight the significant challenges in creating and maintaining a successful Google+ that meets consumers' expectations," she says. "Given these challenges and the very low usage of the consumer version of Google+, we decided to sunset the consumer version of Google+."

Consumer Google+ to Shut

As part of its data exposure announcement on Monday, Google announced that within 10 months it will close the consumer version of Google+, which was designed as a competitor to Facebook but never gained traction.

Google cited low usage as one reason for the closure, as well as the intensive maintenance needed to keep it running. But the company does plan to continue to offer an enterprise version, because "we have many enterprise customers who are finding great value in using Google+ within their companies," Smith says in his blog post.

"Our review showed that Google+ is better suited as an enterprise product where co-workers can engage in internal discussions on a secure corporate social network," he says.

Buggy API

Explaining the data exposure, Smith says that the Google+ flaw was contained in a "People" API. Via this API, Google+ users could grant access to their profile data and to their friends' public information.

But if a user granted access to their profile data, the bug also exposed data that wasn't meant to be public. Exposed data may have included their name, email address, occupation, gender and age, Smith says. It did not include Google+ posts, messages, account data, phone numbers or content from G Suite - Google's cloud computing productivity software, which includes Docs, Drive, Calendar and other tools.
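The real Google+ People API code is not public, but the class of flaw Smith describes can be sketched in a few lines: an endpoint that is supposed to honor per-field visibility settings but instead returns non-public fields as well. All names and data below are hypothetical, for illustration only.

```python
# Hypothetical model of the bug Google described: a profile endpoint that
# should return only fields the user marked public, but the buggy version
# ignores the visibility flag and leaks non-public fields too.

PROFILE = {
    "name":       {"value": "Ada Lovelace",     "public": True},
    "email":      {"value": "ada@example.com",  "public": False},
    "occupation": {"value": "Mathematician",    "public": False},
    "gender":     {"value": "female",           "public": False},
    "age_range":  {"value": "25-34",            "public": False},
}

def get_profile_buggy(profile: dict) -> dict:
    """Buggy handler: returns every field, ignoring visibility settings."""
    return {field: attrs["value"] for field, attrs in profile.items()}

def get_profile_fixed(profile: dict) -> dict:
    """Patched handler: returns only fields the user marked public."""
    return {field: attrs["value"]
            for field, attrs in profile.items()
            if attrs["public"]}
```

In this sketch, the buggy handler hands a caller the user's email address, occupation, gender and age range even though only the name was marked public; the fix is a one-line visibility check, which matches Google's description of a quick patch once the bug was found.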

"We discovered and immediately patched this bug in March 2018. We believe it occurred after launch as a result of the API's interaction with a subsequent Google+ code change," Smith says.

Google says that for privacy reasons, it only kept log data for the People API for two weeks. Unfortunately, that means the company was unable to determine which users may have been affected by the bug, Smith says. But Google did run a two-week analysis, which showed that about 500,000 users during that period could have had their data exposed.

In addition, "our analysis showed that up to 438 applications may have used this API," Smith says.

Review: Third-Party Apps

Google found the flaw as part of Project Strobe, which it launched earlier this year to review third-party apps and developers' access to personal data.

Such access poses increasing concern. In their earlier days, Google, Facebook and Twitter attracted huge numbers of developers by offering easy access to user data, which can be used for targeted advertising and app personalization.

Over the past few years, however, there's been increasing concern that the companies' policies - and spotty governance - of how user data gets collected, used and shared may not be in users' best interests.

Facebook and Twitter, for example, have ongoing reviews of third-party apps and how they use personal data. Both companies have also tightened their rules around what kinds of personal data apps are allowed to collect.

Cambridge Analytica Scandal

The movement to tighten and monitor usage policies has been further driven by revelations that a U.K.-based voter-profiling firm, Cambridge Analytica, ended up with as many as 87 million Facebook profiles. The firm acquired the data from a Cambridge University professor who had launched a personality quiz on Facebook around 2014 (see Facebook: 87M Accounts May Have Been Sent To Cambridge Analytica).

That app not only collected personal data on those who took the quiz, but also from their friends, which was allowed at the time under Facebook's rules and did not require the friends' permission. Around 2015, however, Facebook disallowed the collection of personal data from a user's friends.

Efforts to restrict developers' and apps' access to user data are ongoing. Google, for example, is still updating its policies for its Gmail API, which will "limit the apps that may seek permission to access your consumer Gmail data," Smith says.

"Only apps directly enhancing email functionality - such as email clients, email backup services and productivity services (e.g., CRM and mail merge services) - will be authorized to access this data," he says. "Moreover, these apps will need to agree to new rules on handling Gmail data and will be subject to security assessments."

Google has also announced new data-handling changes for Android. "We are limiting apps' ability to receive Call Log and SMS permissions on Android devices, and are no longer making contact interaction data available via the Android Contacts API," Smith says.

Google says it will unbundle requests for access to data by Android apps, giving users finer control. (Source: Google)

Google is also planning to make permissions more granular. So rather than seeing just one window to grant an app a handful of permissions, each permission will be detailed in a separate pop-up.

Google Seeks Low Profile

As Google seeks to restrict access to user data to help ensure greater data security and privacy, however, its own data handling processes are sure to be called into question after the revelation that company officials chose not to disclose the March data exposure incident.

Smith explained the decision in terms of whether the data leak could have harmed anyone. "Our privacy and data protection office reviewed this issue, looking at the type of data involved, whether we could accurately identify the users to inform, whether there was any evidence of misuse, and whether there were any actions a developer or user could take in response," Smith says. "None of these thresholds were met in this instance."

Google's decision, however, is not likely to engender trust among its users. This is an era of heightened sensitivity over data leaks, with increasing questions about whether massive technology firms that gather, store and sell personal data are being both proactive and transparent in how they handle and safeguard it.

As U.S. regulators and lawmakers have become intensely interested in data protection, privacy and election-related security, Google has attempted to maintain a low profile. But in its approach, Google executives' actions have sometimes had the opposite effect.

In early September, Google declined to send its CEO, Sundar Pichai, or one of its founders, Larry Page, to appear before the Senate Intelligence Committee, which was investigating election cybersecurity concerns. In response, the committee left an empty chair with a Google name card, in a symbolic gesture to highlight the company's absence.

Executive Editor Mathew Schwartz also contributed to this story.

About the Author

Jeremy Kirk

Executive Editor, Security and Technology, ISMG

Kirk was executive editor for security and technology for Information Security Media Group. Reporting from Sydney, Australia, he created "The Ransomware Files" podcast, which tells the harrowing stories of IT pros who have fought back against ransomware.
