I admit it. I love my wearables. I love competing against my former CTO and out-walking him. (It may be a rare day, but it’s a good one.) I love that my mother, who lives thousands of miles away, has the ability to call for help via her Lifeline. It’s one less worry in a sea of worries.
But my love, and my patience, are being tested. With all the Apple Watch enthusiasm, and the blockbuster early sales of the iPhone 6, I find that there is a growing pit in my stomach. I find myself not laughing nearly as hard at the South Park episode where the "Apple Police" pursue Kyle for clicking on the iTunes Terms of Service (TOS). When the grain of truth turns into a bushel, well, it's just not funny.
During the last eight months, I have been working on a research project focused on the future of voice and communications. If there is one crystal-clear finding thus far, it is that people loathe the idea of their voice being recorded -- particularly when they cannot opt out (e.g., customer service calls), cannot delete the recording later, or, worse yet, are unaware that it's happening at all.
When it comes to privacy, people feel that their voice is a line in the sand. It’s one thing to take my metadata, but leave my voice the (insert expletive here) alone.
So why is it that Siri, the star of the Apple Watch, is allowed to record me unannounced? Search Apple's terms of service, and you won't find details on the practice. Siri's FAQs fail to address voice recording at all. A Wired reporter last year substantiated that Apple retains recorded voice data for six months, then de-identifies the recordings and keeps them for another 18 months, purportedly for research purposes.
Although I cannot say exactly what "de-identifies" means, I can say this: Rest assured, your voice is being recorded and used to benefit Apple.
The violation here is subtle in interface, huge in practice. It sets a precedent for recording our words without the usual safeguards.
Part of the problem is that Apple and other Over The Top (OTT) players -- meaning companies that deliver audio, video, and other media over the Internet without the involvement of a traditional distributor (e.g., Multi-System Operator or MSO) -- lump voice into the general data bucket. But voice is not data. Voice is a part of our identity in a way that text can never claim. It feels to us like the most personal of our Personally Identifiable Information (PII).
Now you may love to hate your local telecom provider, but safeguarding voice is a near-religion for insiders with roots in the phone business. In the US, there is a long history of telecoms vigorously appealing warrants to keep consumer voice private. In Germany, which has some of the strictest privacy laws in the world, Deutsche Telekom was ranked the country's most trusted brand among Internet and mobile providers.
We have laws in the US designed to protect consumers from unannounced recordings, and far stricter ones in the EU. So, how exactly is this unannounced recording possible?
Voice interfaces are new, and we don't necessarily recognize the difference between talking to a device and talking through one. Talking to a device feels like talking to a person. Since the vast majority of our human-to-human conversations are ephemeral, we tend to think of this new conversational model the same way. And yet, it is not the same. Siri is not a person, of course, so she isn't covered by the laws that govern conversations. We should be treating "her" like a recording device. This is where our humanness bites us. It's far too easy to anthropomorphize her, particularly when she answers our knock-knock jokes.
The way Apple benefits from our recorded voice has me worried about other boundaries the Apple Watch might cross. It has access to our vitals too. One particular vital gives me real pause: Heart Rate Variability (HRV). HRV is not only as unique as a fingerprint; changes in it can signal elevated anxiety, Post-Traumatic Stress Disorder (PTSD), and even an impending heart attack.
Thinking back to my mother, the Apple Watch could be far more helpful than the Lifeline in preventing a catastrophic outcome by calling 911 long before she realizes she is in danger. This would be a massive leap forward for healthcare and telemedicine, which I am eager to embrace. And yet, it also means that our most personal health data is being transmitted from our watch to our iPhone over Bluetooth. And sadly, far too few people understand just how hackable Bluetooth is.
With the speed of technological change fast approaching a blur, legislation cannot possibly stay in sync. As individuals, we need to make our concerns known and demand transparency around privacy practices. Given the asymmetrical power relationship between Apple and other OTT players (I'm looking at you, Google) and their end users, providing clear, well-summarized privacy policies and Terms of Service is the very least they can do.
But since we are talking about identity here, I believe we need to go well beyond token policy improvements. We need the right to manage our own identity and the ability to delete our data on our own terms. The "Right to be Forgotten" mandate recently handed to Google by the European Union's top court is a start in this direction. My key concern is that this campaign will morph into privacy theater, giving people the illusion of control without actually changing anything. (Pro tip: Deleting data off the Internet is very, very hard to do.)
Although the situation may feel overwhelming, let’s not pretend we are powerless. Bring Your Own Device (BYOD) has been a wakeup call as to the power of the end user. Continuing this theme, the emergence of Bring Your Own Identity (BYOID) holds a good deal of promise. BYOID started with some enterprises allowing users to log into their corporate networks with external credentials provided by LinkedIn, Facebook, even an external email account. This provides the end user with the relative ease of single sign-on and frees up the corporate helpdesk from password resets. By allowing users to take control of their credentials, this nascent trend has the potential to recalibrate the power asymmetry and eventually allow us to reclaim our identities. Or, on a less optimistic note, allow a handful of social network providers to be the arbiters of our identities. (Hello, Facebook.) Personally, I’m rooting for the former.
So while we start searching for tools (e.g., DuckDuckGo, Bitcoin) and techniques (e.g., Incognito mode in the Chrome browser) to help us reclaim some control over our identities, we need to put pressure on our legislators to catch up with the necessary regulations and laws. Many of the top OTT players are US corporations and can be compelled to take voice, at the very least, as seriously as telecom companies have.
Before I end this rant, please don't confuse me with some random anti-Apple hater. Not only do I live and work in an almost exclusively Apple environment, I'm a shareholder. I believe that Apple continues to set the bar on user experience and deserves credit for making "user delight" part of our common vocabulary. Thanks to its efforts, many product designers see this emotional state as the ultimate goal. We cannot ask for anything more than to be delighted as users, right?
And here, I disagree.
This newly emboldened end user wants more than delight. I want liberty. And while we are at it, stop calling me a "user." I'm not addicted to your products. I am, and for the time being remain, your customer.
E. Kelly Fitzsimmons is a well-known serial entrepreneur who has founded, led, and sold several technology startups. Currently, she is the co-founder and director of HarQen, named one of Gartner's 2013 Cool Vendors in Unified Communications and Network Systems and Services.