Can we trust voice-activated technology?

With privacy at the forefront of everyone's minds, two tech insiders go head to head on the issue of the day

The case for

Better data protection means we can start to focus on the benefits

by Benedikt von Thüngen, CEO, Speechmatics

With voice-activated tech, there’s an abnormally large elephant in the room: privacy. To date, there’s been an unfortunate trade-off between making use of tech and then tech making use of your data; the users becoming the misused.

Recent scandals have brought the importance of data protection to the fore and demonstrated the sheer scale of data collection taking place, breaking consumer trust and pushing tech-savvies and tech-refuseniks alike to rethink their relationship with technology.

But there is good news. Regulation – and in particular this year’s General Data Protection Regulation, or GDPR – means we have more control over our data, and that future scandals should be less likely. Similarly, the changing landscape of compliance for tech companies and the push towards properly regulating content should allay people’s concerns.

‘We mustn’t forget the many benefits of voice-activated technology’

While we work on regulation, we mustn’t forget the many benefits of voice-activated technology. Voice tech has now reached a stage where it can change the entire power dynamic between us and our stuff. From fridges to cars to personal assistants, objects have moved on from their initial use value and soon enough will, quite literally, become part of our conversations.

Is this a good thing? Absolutely. Voice tech makes content and appliances available to wider audiences, breaking down barriers of location, literacy and physical impediments.

For instance, speech recognition technology can make social media subtitling cheaper and faster, helping content creators reach more people. The technology is also being applied to medicine, where there are numerous applications – for example, sentiment analysis in virtual personal assistants could recognise a change in the pitch and tone of a voice. This could allow conditions such as early onset dementia, anxiety and depression to be diagnosed far earlier than at any point in human history.

These possibilities make the future of voice-activated tech very exciting.


The case against

We’re sacrificing our privacy for convenience we don’t need

by Matt Burgess, Senior Editor, Wired

There’s something very Orwellian about voice-activated tech. A speaker that’s always listening, waiting to be woken by an artificial name, saving everything it hears. It’s creepy, right?

Yet millions of us have now installed these listening devices in our living rooms, kitchens and even bedrooms. In doing so, we’ve entrusted multi-billion dollar firms not to spy on us, while giving them the means to do so. We’re sitting ducks. Sure, there are benefits to voice-activated devices. Voice commands can help us find information quickly and entertain us, but things are already going wrong.

‘Cybersecurity researchers have discovered ways to manipulate these devices’

In the same way that everything is hackable, cybersecurity researchers have discovered ways to manipulate these devices. The Amazon Echo has seen proof-of-concept hacks that turn it into a listening post, and malicious software has been created for it. In one instance, a technical bug saw the entire conversation of one unsuspecting family sent to friends. The techniques haven’t been deployed in the real world yet, but there will be other potential vulnerabilities for those with malicious intent to take advantage of.

And then there are the brands: a TV advert for Burger King triggered Google Home devices into reading out a description of its Whopper burger by using the wake words ‘Okay Google’. Google also included adverts for the Disney film Beauty and the Beast when people asked for information about their day. They’re clever marketing gimmicks, but we haven’t asked for them, and we’re not warned that our new toys will regurgitate fast food ads.

As companies behind these smart assistants work out how to make money from them, we’ll see more of this. There’ll be new types of voice-activated devices, too. With each new smart appliance, boundaries will be pushed further, every time increasing the potential that these ‘harmless’ devices will be spying on us, snooping on our personal interests and intruding into our lives. Our personal and private spaces need to be kept as just that.