Fitness Dystopia in the Age of Self-Surveillance
Big Brother, Meet Wearable Fitness Devices
Orwell got it wrong: We are less likely to surrender our privacy to a totalitarian state than to the lure of sharing holiday snaps, cat videos or the route and time of our latest cycling, jogging or kiteboarding outing, as captured by a wearable device and fitness app.
Unfortunately, such data - published in aggregated, heat map form by Strava, a social network and app for tracking and sharing workouts - has revealed the internal layouts of secret government bases and may pose a risk to groups of users, such as humanitarian workers and members of the military (see Feel the Heat: Strava 'Big Data' Maps Sensitive Locations).
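Heat maps like Strava's are built by aggregating many users' GPS traces into a spatial grid and counting activity per cell. A minimal sketch of that binning step, assuming a simple degree-based grid (the function names, grid resolution and coordinates are illustrative, not Strava's actual pipeline):

```python
from collections import Counter

def heatmap_bins(tracks, cell_size=0.001):
    """Aggregate GPS tracks into a grid of (lat, lon) cells.

    tracks: iterable of tracks, each a list of (lat, lon) points.
    cell_size: grid resolution in degrees (~100 m at the equator).
    Returns a Counter mapping grid cells to point counts.
    """
    bins = Counter()
    for track in tracks:
        for lat, lon in track:
            cell = (round(lat / cell_size), round(lon / cell_size))
            bins[cell] += 1
    return bins

# Aggregation drops names but not geography: a handful of jogging
# loops around an otherwise dark, remote site still produces a hot
# cluster of cells exactly where the site is.
tracks = [[(34.0001, 69.0001), (34.0002, 69.0002)],
          [(34.0001, 69.0001)]]
print(heatmap_bins(tracks).most_common(1))  # → [((34000, 69000), 3)]
```

This is why "aggregated and anonymized" is not the same as safe: the hot cells themselves are the leak.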
"You mean the world can see this?"
Of course, we're adults. Arguably, privacy and "too much information" tradeoffs are our choice to make.
But as the Strava saga shows, are we making informed choices? In particular, might a service's privacy settings - "privacy zones," "enhanced privacy zones" and other such nomenclature - be so complicated that we don't know which choices are best for us? Might a data collector such as Strava fail to anticipate the societal impact of the collected data becoming public? And might our choices, in aggregate, put some of us at greater risk?
'Who Hurt You?'
Strava, of course, wasn't the first product maker or service provider to turn big data gathered from customers toward marketing aims.
Last December, Netflix earned condemnation from some - and light applause from others - after it issued this Sunday evening tweet: "To the 53 people who've watched A Christmas Prince every day for the past 18 days: Who hurt you?"
If you haven't heard of "A Christmas Prince," it's a romantic comedy about an aspiring young female journalist who goes undercover as a tutor to investigate a playboy prince. Apparently, hijinks ensue.
Market to Me, Baby
Even Netflix, however, didn't invent the "we'll use your big data in ways to gently rib you while earning a profit from you" shtick.
That marketing award goes to Spotify, which has been gently roasting subscribers' listening habits since 2016.
Again, some see ironic social commentary. Others, however, see an internet-industrial complex that threatens the foundations of our privacy and freedom.
What Big Data Hath Wrought
One upside, perhaps, is that these marketing campaigns demonstrate the types of data that businesses - whether we pay them or not - are amassing on us.
"This gives the public a kind of view into the ways that the major content companies are gathering and using our data," Jeffrey Chester, head of the nonprofit Center for Digital Democracy, which advocates for consumer privacy rights, told the New York Times last December. "Behind the ease of being able to access video and audio content are very sophisticated customer surveillance and analytics applications, and there's nothing funny about that."
"Love it when my record player and VCR use their constant surveillance to insult me," Parker Higgins (@xor) tweeted in response, with an accompanying image: pic.twitter.com/5T7wgOxvF2
We're living in an age of self-surveillance.
In George Orwell's "1984," people were watched by their television. Now, many individuals carry one or more devices that track their location, offer audio and video capabilities, and readily broadcast their personal details across one or more sites. Many of those devices can also track their heart rate while they go about their workout or pursue more private activities.
But heat maps published by Strava and its ilk don't tell intelligence agencies anything they didn't already know, says Nick Feamster, a computer science professor at Princeton University and marathoner who extols the benefits such data can provide.
"The map is a public good that allows runners to plan safe routes, discover unfamiliar areas," he tweets.
Gathering and publishing such data, however, may have other, unintended consequences.
"Members of the public should take care when using apps such as Strava to ensure they do not inadvertently give away private information and locations," Sergeant Rob Danby of England's Humberside Police warned several years ago, saying he'd seen an increase in thefts of bicycles from sheds.
Such reports were not isolated. "Me and a few mates have been targeted and the bikes have been stolen two days ago as a result of tracking our GPS to our homes, if you look at one of your old rides and use satellite image, it will take you [to] your door," read a 2014 bicycling forum post with "Strava theft" as its subject line.
Many sites allow users to set "exclusion zones" in which their activity will not be reported. But others might still give it away. "Don't let all your mates ride to yours and set off from there as people will see their tracks converging on your house as it'll be outside their exclusion zones," read a response to the "Strava theft" post.
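Exclusion zones of this kind typically work by suppressing any track points that fall within a set radius of a protected location before the activity is published. A minimal sketch, assuming a plain haversine-distance filter (the function names, radius and coordinates are illustrative, not any particular app's implementation):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(p, q):
    """Great-circle distance in meters between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (*p, *q))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6_371_000 * asin(sqrt(a))

def apply_privacy_zone(track, home, radius_m=500):
    """Drop points within radius_m of home before publishing a track."""
    return [p for p in track if haversine_m(p, home) > radius_m]

home = (53.74, -0.33)  # illustrative coordinates
track = [(53.74, -0.33),    # at the front door: suppressed
         (53.7405, -0.33),  # ~55 m away: still inside the zone
         (53.80, -0.33)]    # ~6.7 km away: published
print(apply_privacy_zone(track, home))  # → [(53.8, -0.33)]
```

Note the weakness the forum post describes: the filter clips only your own track against your own zone. Friends riding to your house carry their own unclipped tracks, which converge visibly on the one spot yours avoids.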
Our Data, Ourselves
Stephen Cobb, a senior security researcher at cybersecurity firm ESET, tells me that the Strava heat map debacle is like a flashback to the early days of the World Wide Web, when exuberance sometimes overwhelmed caution.
"Technically it's cool to swipe the globe, zoom in on data trails. And the app's features for runners/cyclists are [very] cool," Cobb says. "But side effects recall early days of WWW: 'You mean the world can see this?'"