Slouching Toward 1984
'Smart' TVs that Listen: What Could Go Wrong?
George Orwell's "1984" imagined an authoritarian society in which the government monitored citizens via their televisions.
"The self-surveillance state can be deadly."
But security experts have long warned that if a device has a feature that can be disabled, a skilled hacker can often remotely enable that feature without revealing what they've done (see VoIP Phones: Eavesdropping Alert). There's a reason why leading security experts such as Mikko Hypponen keep a Band-Aid over their laptop's webcam.
When TVs Listen
Samsung, however, certainly isn't the first company to move in the direction of monitoring the users of its products. In May 2011, Verizon - which doesn't manufacture TVs, but which does supply set-top boxes to cable TV subscribers - applied for a U.S. patent covering a "detection zone" that would study what was happening near a television, then serve related advertising. Privacy, notably, wasn't mentioned once in the 7,500-word, 14-page patent application.
What about other TV manufacturers - might their devices also be at risk? That's not clear, but Sony, for example, sells some Android-based TVs that have voice recognition features, and any such device is at risk if someone is able to remotely access the microphones built into these devices or infect the TV operating system with malware.
Compared with the authoritarian surveillance state envisioned by Orwell, however, reality for most of us seems much more banal. Think Internet of Things, selfie sticks, smart toys, smartphones, wearable computing devices and all of the other tools we now take for granted. Some pundits call this the self-surveillance state, on account of all of the information, images and videos that our devices now collect and transmit.
For some, however, the self-surveillance state can be deadly. To wit, last year the Intercept published documents from 2011 and 2012 leaked by former National Security Agency contractor Edward Snowden, revealing the existence of a big data program called Skynet. That's a reference to an artificial intelligence program in the "Terminator" movies that gained sentience and then tried to wipe out humanity by building legions of Arnold Schwarzeneggers.
In this case, however, the machine-learning program was designed to study travel patterns - including countries visited and days of the week - as well as behavior-based analytics, such as infrequent incoming calls, "excessive SIM or handset swapping" and frequent power-downs, across the 55 million users of Pakistan's mobile telephone network, to try to spot terrorists. Related details about how the program may have been deployed since then remain state secrets.
But the program notes reinforce that the data handled by our Internet-connected or mobile telephony devices, as well as how, where and when we use devices, can be used to fingerprint people, track them, potentially serve them advertising - or coupons - or in the case of the U.S. drone program, remotely kill them.
At the time the Skynet documents were created, however, the program's false positive rate was at least 0.18 percent. As Ars Technica reports, that means that of the 55 million people being monitored, about 99,000 individuals would be mislabeled as potential "terrorists."
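The base-rate arithmetic behind that figure is worth making explicit. A minimal sketch, using only the two numbers reported above - 55 million monitored users and a 0.18 percent false positive rate:

```python
# Base-rate arithmetic for the Skynet false-positive figure.
# Both inputs come from the leaked documents as reported above.

monitored_users = 55_000_000     # users of Pakistan's mobile network
false_positive_rate = 0.0018     # 0.18 percent

# Even a seemingly small error rate, applied to a whole population,
# flags a large absolute number of innocent people.
mislabeled = monitored_users * false_positive_rate
print(f"{mislabeled:,.0f} people mislabeled as potential terrorists")
# → 99,000
```

This is the classic base-rate problem with mass surveillance: because actual terrorists are vanishingly rare in a population of 55 million, even a highly accurate classifier will produce false positives that vastly outnumber true positives.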
The lesson here? Watch what you say in front of your "smart" TV.