Is Big Brother Listening? And Other Concerns.

What is Google Home?

On November 4th, 2016, Google released Google Home, a smart speaker and entertainment hub that competes with Amazon’s Echo. Users can ask Google Home questions typically asked of a search engine (e.g., how long to boil an egg) or request actions typically performed on a smartphone (e.g., setting a timer or calendar reminder).

The more a user engages with one of these smart hubs, the more user data the device and its servers accumulate. Because of algorithms incorporating sophisticated machine learning and artificial intelligence, a smart-hub user who divulges more data likely receives more personalized, accurate, and relevant answers. This tradeoff of data for personalization is one in which users increasingly divulge significant amounts of information, including sensitive and identifying information.

What are the expected privacy concerns?

In short, the same concerns that apply to mobile devices are likely to apply to these devices: the threats of illegal wiretapping, hacking, and identity theft persist.

Who gets legal access to user data?

The first category of people who get access to user data is the users themselves. These devices respond to any voice that speaks their “wake word.” For Google Home, the wake word is “OK Google.” For Echo, it’s “Alexa.” Although some folks are concerned that the devices are “always listening,” and certainly the devices have such capability (as do our smartphones, which also have microphones and Internet connections), at this point, these smart hubs only begin recording data a fraction of a second before processing the voiced wake word. Users can review—as well as choose to delete—their recorded search history, but are warned that doing so could “degrade [their] experience.”

The second category that enjoys access to user data comprises the service providers and their third-party partners. Certainly, tech giants benefit from monetizing user data to the extent they believe users will tolerate it. For example, Amazon will likely encourage Echo users to buy and sell goods in the Amazon marketplace just by chatting with their anthropomorphic “friend,” Alexa. Google will likely use Google Home as a platform to serve personalized ads, or as a research tool to better personalize them.

Of course, these companies are wary to overstep users’ boundaries of comfort and trust. For example, Google promises in its Privacy Policy that “When showing you tailored ads, we may associate an identifier from cookies or similar technologies with topics such as ‘Cooking and Recipes’ or ‘Air Travel,’ but not with sensitive categories” (i.e., “personal information relating to confidential medical facts, racial or ethnic origins, political or religious beliefs or sexuality”).

These companies carefully safeguard user data from outsiders. After Edward Snowden’s disclosures in 2013, Apple, Google, and other tech giants rapidly advanced their encryption technology to protect user data against leaks and hacks.

These companies also don’t want the perception that they are peddling sensitive data. For example, because Samsung sends voice data received by its Smart TV to a third party for speech-to-text conversion, the company cautioned users to be wary of what they say in front of the TV. The warning sparked a media firestorm of indignation and paranoia in February 2015, prompting Samsung to clarify that it does “not retain voice data or sell it to third parties.”

And because user trust is essential to their continued success, it’s unlikely these tech giants will sell sensitive, disaggregated data any time soon. After all, third parties may not take the same precautions or spare the same expense to protect and encrypt sensitive user information. Accordingly, Google promises in its current privacy policy to share personally identifiable information only if the user consents, or for internal processing or legitimate legal reasons. But it says it may aggregate and share non-personally identifiable information with third parties.

The third and final category is the government. In 2014, FBI director James Comey urged citizens to consider the possibility of companies coding a “backdoor” into their encrypted products, thus facilitating legal government wiretapping and seizure of stored data. In a joint statement released in 2015, security experts declared their opposition to enabling government access to encrypted communication.

Recently, the DOJ filed a motion to compel Apple to write new software that would facilitate unlocking an iPhone the FBI had recovered from one of the shooters in the San Bernardino attack of December 2015. Apple protested this solution, claiming it was beyond the bounds of the All Writs Act. Apple worried about a slippery slope: if it’s compelled to bypass a passcode today, might it be compelled to covertly turn on and access a user’s camera and microphone tomorrow? The FBI ultimately purchased iPhone access through another avenue, and the DOJ asked the judge to vacate the order.

What are other policy concerns? 

Tech leaders, pushing back, argue that enabling backdoor access will harm user privacy and undermine user security. If backdoors were readily available, the domestic government might be tempted to enact increased and possibly illegal surveillance. Moreover, weaker encryption increases vulnerability to attacks, hacks, or theft from malicious hackers, identity thieves, and foreign governments.

Tech leaders also argue that preventing backdoor access will not cause our government to “go dark” (i.e., fail to obtain necessary surveillance information), a concern expressed by Comey. In 1993, the NSA introduced the Clipper chip, a telecommunications encryption chipset with a built-in government backdoor; then, too, tech leaders resisted. Nonetheless, law enforcement officers have increasingly obtained clearance from the legislature and courts to wiretap and obtain information from telecommunications. Not to mention, with a court-approved warrant or subpoena, law enforcement may still be able to obtain phone data by other means (e.g., from the actual service provider, or by compelling decryption from the user).

So?

We are well on our way to welcoming artificial intelligence into our homes. If tech giants have anything to say about it, user data in their clouds and money in their pockets are in our future, but 1984 is not.
