Alexa – I have sinned

My first coffee of the morning is always an important time of day for me: that brief five minutes when the house is quiet, the daylight streams through the kitchen windows and I can sit in peace, enjoying the transition from dozy stumbling to wakefulness. This morning I sat at the kitchen table and stared across at the shelf where Alexa sits, aware that ‘she’ was listening attentively, waiting for someone to ask her something. Alone with nothing but my thoughts, I had no desperate need for a weather forecast or train times, but I began to wonder what might happen if I asked Alexa a question of much deeper meaning and significance. How would she respond? What if I asked her a question with serious repercussions: ‘Alexa…how do I make a bomb?’ or simply ‘Alexa…I need help’?

This all sounds very serious, but it was triggered by a fascinating conversation with Christine Armstrong and Filip Matous at the consultancy Jericho Chambers. Whilst working on some ideas around the future of autonomous vehicles, we started to look at how we might use our time as passengers inside a driverless car. Once the usual suspects were dispatched (‘I’d get my emails done’, ‘I’d watch a film’), we began to ponder whether, if we built a true relationship with our car, we might start to ask it for help with some of life’s deeper challenges. Would we trust an AI to help us solve personal problems? Might we even confess our deepest desires or sins, knowing that a machine wouldn’t judge us the way another human might?

It’s clear that we were not alone in our thinking: both Apple and Amazon have been looking into the same problem. With their huge volumes of data, they are increasingly seeing people use their smart assistants to share their challenges, some of them fairly benign and some of them with serious repercussions. Apple recently advertised for engineers with a background in psychology (surely a fairly rare breed!) to help unlock some of these challenges. They are very aware that, as customers place more and more trust in their devices, there’s an increasing burden of responsibility in the way they address the questions being asked.

Researchers at the University of Southern California looked into this phenomenon recently and, in doing so, built a ‘virtual’ therapist called Ellie. Interestingly, they made Ellie look and feel real, but left enough ‘computer’ in her demeanour and behaviour to ensure that participants knew they were definitely talking to a computerised AI and not a facsimile of a human. What surprised them was that test subjects were much more likely to open up and be honest with a virtual therapist than with a human practitioner. In particular, they felt less ‘judged’ by the AI and were therefore more willing to be open in their responses.

With increasing awareness of mental health issues, and an estimated one in six of us having experienced a mental health problem in the past week, it’s clear that our health services will come under increasing pressure to address the problem. With budgets being cut and an ageing population straining all of our health services, it seems likely that automation will need to play a vital role on the front line of treatment. At Nominet, our own Digital Futures Index showed that 12% of millennials admitted to suffering anxiety or other mental health issues as a result of using social media. These platforms could clearly benefit from integrating first-responder treatment within the very service that’s causing the issue. After all, you wouldn’t be surprised to see an ambulance attending a running race or football match as a precautionary measure.

Beyond the personal effects of confiding in these platforms, what if Alexa were to hear a conversation that could have a dangerous outcome? If Alexa were asked a pattern of questions suggesting that the subject might be about to harm themselves or others, does Amazon have a duty of care to inform the authorities? What if a user confessed to a murder? What happens next?

We already exist within a confusing and conflicting environment around the privacy of our data. Existing laws such as RIPA (the Regulation of Investigatory Powers Act) allow the authorities to access some of our data if they suspect us of criminal activity, but it’s much less clear how they could use this kind of information, or even whether it would be legal for them to obtain it. Thanks to the exposé of the Facebook/Cambridge Analytica story, we’ve rightly become highly suspicious of how our data is being used to manipulate our behaviours and beliefs. And much of the data harvested by Cambridge Analytica was considerably more benign than the innermost feelings and desires discussed above.

We will need to find a way through these complex ethical and legal challenges. This technology can’t be un-invented and, like the nuclear fission Oppenheimer helped to harness, it can be used for tremendous harm or huge societal benefit. What is clear is that we will need an open and honest dialogue between the large technology companies, legislators and medical professionals in order to realise the benefits this field could offer. It’s not that long ago that we would avoid doing our banking online for fear that it wasn’t safe. We’ve overcome that, and I’m sure we’ll find a way through this.

So back to my morning reverie. I just couldn’t face talking to the little black cylinder; somehow it didn’t seem right. After all, it was a beautiful sunny day, and the dog had appeared from nowhere, looking at me expectantly for a walk in the woods. She’s a much better listener, and never answers back with ‘Sorry…I didn’t understand that’.

 

Photo by Nacho Arteaga on Unsplash
