Privacy and technology
For years the tech industry has dreamed of computing appliances that are considered unremarkable items of household machinery, like washing machines or fridges. The smart speaker has finally realised this promise. It can sit on a kitchen counter and summon the wonders of the internet without the need for swiping or typing. Using it is like casting a spell. Say the magic words and you can conjure up dodgy Eighties rock while up to your elbows in washing-up, or prove to your mum that Ronaldo has scored more goals than Messi. This hands-free convenience has a cost: the speakers are constantly listening out for commands. As with any advanced and apparently magical technology, however, myths quickly grow up about how they work.
So start with some myth-busting. As Alexa herself contends, smart speakers are not sending every utterance into the tech giants’ digital vaults. Despite their name, the devices are simple-minded. They listen out for wake words, and then send what follows to the cloud as an audio clip; when an answer arrives, in the form of another audio clip, they play it back. Putting all the smarts in the cloud means these speakers can be very cheap and acquire new skills as their cloud-based brains are continually upgraded. As part of this improvement, manufacturers (such as Amazon) store sound clips of queries, so they can be assessed by humans if necessary. But Amazon notes that users can delete these clips at any time. There’s always the mute button if you are worried about accidentally triggering your speaker and sending a clip into the cloud during a sensitive conversation. Users, the firm insists, are in control.
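The wake-word pipeline described above can be sketched in a few lines. This is a simplified illustration, not how any real device is implemented: actual speakers run a neural detector over audio frames, whereas here text tokens stand in for audio, and the wake word and function names are invented for the example. The point it demonstrates is the one in the paragraph: everything heard before the wake word is discarded on the device, and only what follows is sent anywhere.

```python
WAKE_WORD = "alexa"  # hypothetical wake word, for illustration only

def capture_after_wake(token_stream):
    """Discard everything heard before the wake word; return only the
    utterance that follows it, i.e. the clip that would go to the cloud."""
    clip = []
    awake = False
    for token in token_stream:
        if awake:
            clip.append(token)
        elif token.lower() == WAKE_WORD:
            awake = True  # everything before this point is never transmitted
    return " ".join(clip) if awake else None

# Private chatter before the wake word never leaves the device:
stream = ["private", "chat", "alexa", "play", "eighties", "rock"]
print(capture_after_wake(stream))  # -> "play eighties rock"
```

If the wake word never occurs, the function returns `None` and nothing is "uploaded", which mirrors the claim that the devices are not sending every utterance to the cloud.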
Not everyone is convinced by such assurances, however. What if hackers infiltrate the devices? Could governments require manufacturers to provide back doors? Are their makers using them to snoop on people and then exploiting that information to target online ads or offer them particular products? Some people refuse to let Alexa and Siri into the house.
If eavesdropping is your worry, eschewing smart speakers does not solve it. Smartphones, which people blithely carry around with them, are even worse. Spy agencies are said to be able to activate the microphone in such devices, which have even more sensors than smart speakers, including location-tracking GPS chips and accelerometers that can reveal when and how the phone is moving. And smartphones are, if anything, even more intimate than smart speakers. Few of Alexa’s users, after all, take her into bed with them.
At the same time as devices are getting cleverer (Amazon makes a microwave oven with a built-in voice assistant), the big tech firms are expanding into adjacent areas such as shopping services, finance and entertainment. Over time this may strengthen their incentives to snoop and misuse data. But there will also be a countervailing incentive for manufacturers to differentiate themselves by making more privacy-friendly devices that promise not to store voice commands, or that process more on the device rather than in the cloud (though this will be more expensive). The chief thing is that consumers should be able to choose how to balance convenience and privacy. If this magical technology is to reach its full potential, the tech giants need to do more to convince users that Alexa and her friends can be trusted.