In the past, auto dialers were used by sales and marketing companies, scammers, and even schools to do everything from plugging products to skimming money to announcing snow days. While these were useful, there was no mistaking the prerecorded message you were hearing, and no real interaction took place. In more recent years, smart call handling systems have allowed companies to handle vocal responses from users before they ever reach a real human, but again, these were very limited and quite obvious. This week, though, Google showcased its latest leap in useful AI and deep learning in the form of Duplex.
The idea seems like the next evolution of the Google Assistant, which is part and parcel of the Android operating system and increasingly present in other hardware and software. At its most basic, the Duplex system is your digital P.A. It will make calls for you based on the commands you give to the Google Assistant. In other words, if you're too busy (insert laziness / dislike of talking to people over the phone here) to make a call to book a restaurant or a doctor's appointment, the AI will do this for you, and, in theory, the person on the other end of the phone won't know that they're talking to a machine.
Science fiction though it sounds, this is fast becoming a reality, and the demonstrations Google gave during its showcase were more than impressive; they were downright scary. Scary in a good, I-love-technology-and-I'm-not-afraid-of-the-rise-of-the-machines kind of way… If you had walked into the demonstration halfway through, unaware that one side of the conversation was a computer, you could easily have been fooled. Not only did the AI manage to sound convincingly human, it also reacted to challenges effortlessly and adjusted to the context of the situations it was presented with.
Using its deep learning and voice software, Google made calls to restaurants to book tables and check how busy they were at specific times, and booked haircut appointments with local salons, all with the person on the other end of the line completely unaware that they were dealing with an artificial intelligence posing as the user's personal assistant. What was really interesting, and what helped with the illusion, was the use of pauses, hesitations, and the 'hmm's and 'eh's that a real person would make in those situations. The speech seemed more natural simply by being less perfect.
There's an obvious argument to be made that things like booking restaurants and haircuts are becoming an increasingly online affair, but that is still years away from being the sole method of doing them. Having the AI start with these everyday tasks and perfect its abilities before moving on to more complicated calls is not only interesting, but a good proof of the concept and its capabilities.
Google also pointed out some fringe benefits for the other side: the businesses. By tying the AI into a business's Google page and details, one call to the company from the 'assistant' could update opening hours or other details without the need for human intervention. So, on a public holiday, the AI could call, ask whoever is working what time they close, and update the Google Places page instantly. Presto: fewer calls to the business, as people can see that the details have been updated that day to reflect the special occasion.
So the next question is: at what point do we start questioning whether our interactions are with a real person or with a machine? More importantly, will we care? Having worked in, trained, and managed call centers for years, I know that people don't really like to call these places in general. With the ability to 'self-serve' online, people are not only more reluctant to deal with people over the phone, they are also less equipped to do it. Telephone manner has, for the most part, noticeably declined over the last few years, so maybe this is the answer. But will this lead to machines talking to machines?
Again, we seem to be delving into the realms of a sci-fi movie, but given that automation and machine interaction seem to be the way things are going, it's not a stretch at all to imagine that call centers will soon be phased out in favor of digital assistants. It's not a new idea, and indeed it has been tried and experimented with over the years, but with the recent advances in deep learning this could soon be a reality.
Years ago, and even occasionally today, when I talked to people about these concepts, the response I got was that there's no substitute for dealing with a real person. But if we really can't tell the difference, then this is no longer true. Pushing the idea a little further, some people I've talked to say that if they were booking, for example, a doctor's appointment, they'd be happier discussing their ailment with a machine than with a real PA or secretary, because with an AI they could avoid feeling like they're being judged. It's true that there may be certain things that can only be dealt with by a human, but it's getting harder for me to think of specific examples.
Personally, for the most part, I hate talking over the phone and would opt to deal with an issue in real life with a person I can look in the eye, or remove the verbal component completely and use a few clicks or keystrokes online. But as a compromise, this seems like an option for the phone-shy and definitely a point of interest for businesses relying on large call center operations. For now, I'm going to watch with fascination, and very quickly begin to wonder whether the next call I get from my service provider is a human or a digital representative delivering the company's message.