It’s hard to swing a $400 juicer in Silicon Valley these days without hitting a chatbot. Advances in artificial intelligence have made these talkative assistants a reality, and they’re now cropping up in many different forms.

Facebook has greatly improved its chatbot game in Messenger, and everyone from Mastercard to Maroon 5 is climbing on board. In a way, voice-controlled chatbots drive personal assistants like Siri on our phones and the Amazon Echo in our living rooms. It’s enough to make you believe the bots are taking over. Except they’re not, at least not yet. The technologies that drive chatbots, machine learning and AI in particular, need to advance before “conversation” becomes a standard interface. Some of the needs are obvious, like improved speech recognition, while others are more subtle, like the ability for chatbots to signal what services they have to offer. Here are some areas where these talkative bits of AI need to improve before they really take off.

Advances in AI and Natural Language Processing
Remember the early days of the web, when pages were a sea of flashing neon and blue links? That’s where chatbots are today. If bots are to reach ubiquity, people need to be able to ask questions and place orders using natural language.
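To make that concrete, here is a deliberately simple sketch of what “understanding natural language” means on the bot’s side: a free-form request has to be mapped to a structured intent before anything can be answered or ordered. Keyword rules stand in for the trained language models a production bot would use, and the intent names and example phrases are hypothetical.

    import re

    # Hypothetical intents a shopping bot might support; a real system would
    # rely on a trained language model rather than keyword rules.
    INTENT_PATTERNS = {
        "place_order": re.compile(r"\b(order|buy|purchase)\b", re.IGNORECASE),
        "ask_question": re.compile(r"\b(what|how|when|where|why|do you)\b", re.IGNORECASE),
    }

    def classify_intent(utterance: str) -> str:
        """Map a free-form utterance to a coarse intent label."""
        for intent, pattern in INTENT_PATTERNS.items():
            if pattern.search(utterance):
                return intent
        return "unknown"

    print(classify_intent("I'd like to order a large pepperoni pizza"))  # place_order
    print(classify_intent("When do you open on Saturdays?"))             # ask_question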

Know your customer
A huge part of any AI implementation is understanding context. Much as marketing and sales are searching for that mythical 360-degree view of the customer, chatbots need to know more about the individuals they interact with — who they are, how they got here, what they’re looking for and what they did in the past.
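As a rough illustration, the kind of context described above could be carried as a small record attached to the conversation. The schema below is a hypothetical example, not a standard; a real system would pull these fields from a CRM or analytics pipeline.

    from dataclasses import dataclass, field

    # A minimal sketch of the context a bot could bring into a conversation.
    # The fields mirror the questions above: who the user is, how they got
    # here, what they're looking for, and what they did in the past.
    @dataclass
    class CustomerContext:
        user_id: str
        referral_channel: str                  # how they got here, e.g. an email campaign
        stated_goal: str | None = None         # what they're looking for, if known
        past_interactions: list[str] = field(default_factory=list)

    ctx = CustomerContext(
        user_id="u-1842",
        referral_channel="abandoned-cart email",
        past_interactions=["asked about return policy", "ordered running shoes"],
    )
    print(ctx.referral_channel)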

Machines chatting with machines
The web is an amazingly interconnected place. Type any product into Google and you’re instantly connected to merchants that have the exact product you’re looking for in stock. Chatbots need to evolve in a similar way, so they can intelligently hand users off to one another and seamlessly take over a conversation.
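One way to picture such a handoff, purely as a sketch: the first bot bundles up what it knows, and the receiving bot resumes the conversation without making the user repeat themselves. The bot roles and payload fields here are hypothetical.

    # A minimal sketch of one bot handing a conversation to another.
    def package_handoff(user_id: str, transcript: list[str], unresolved_request: str) -> dict:
        """Bundle the state the next bot needs to continue the conversation."""
        return {
            "user_id": user_id,
            "transcript": transcript,
            "unresolved_request": unresolved_request,
        }

    def receive_handoff(payload: dict) -> str:
        """The receiving bot greets the user with the context already in hand."""
        return (
            "Hi! I can take it from here. "
            f"I see you're looking for: {payload['unresolved_request']}."
        )

    payload = package_handoff(
        user_id="u-1842",
        transcript=["Do you stock trail running shoes in size 44?"],
        unresolved_request="trail running shoes, size 44",
    )
    print(receive_handoff(payload))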

Illuminating what’s on offer
If you interact with an app or a web page, you can instantly see which services are available through links and other elements on the screen. Chatbots don’t have this visual language. When you talk to a chatbot, you’re going in with your eyes closed. What can you ask it? What does it do? People interacting with a chatbot for the first time need to know.
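A minimal way to close that gap, sketched below with hypothetical capability names: let the bot keep a registry of what it can do and volunteer that list whenever the user seems unsure what to ask.

    # A minimal sketch of capability discovery: the bot keeps a registry of
    # the things it can do and answers "help"-style questions by listing them.
    CAPABILITIES = {
        "track order": "Check the status of an existing order",
        "find product": "Search the catalogue for a product",
        "contact support": "Hand the conversation to a human agent",
    }

    def describe_capabilities() -> str:
        lines = ["Here's what I can help with:"]
        lines += [f"  - {name}: {desc}" for name, desc in CAPABILITIES.items()]
        return "\n".join(lines)

    def respond(utterance: str) -> str:
        # Surface the capability list whenever the user seems unsure what to ask.
        if any(phrase in utterance.lower() for phrase in ("help", "what can you do")):
            return describe_capabilities()
        return "Sorry, I didn't catch that. Try asking 'what can you do?'"

    print(respond("What can you do?"))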

Reading emotion
Chatbots will provide infinitely better service when they can read facial features and inflections in tone to understand the emotion of the person they’re communicating with. This is partly about simple customer service — if the user is becoming frustrated or angry, it may be time to hand the conversation off to a human. […]
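In practice, that escalation decision can start out very simply. The sketch below uses a crude keyword-based frustration score in place of a real sentiment or emotion model; the cue phrases and threshold are arbitrary illustrative choices.

    # A minimal sketch of emotion-aware escalation: count frustration cues in
    # the user's recent messages and hand off to a human past a threshold.
    FRUSTRATION_CUES = ("useless", "ridiculous", "angry", "speak to a human", "this isn't working")

    def frustration_score(messages: list[str]) -> int:
        """Count frustration cues across the user's recent messages."""
        return sum(
            cue in message.lower()
            for message in messages
            for cue in FRUSTRATION_CUES
        )

    def should_escalate(messages: list[str], threshold: int = 2) -> bool:
        """Decide whether to hand the conversation to a human agent."""
        return frustration_score(messages) >= threshold

    history = ["Where is my order?", "This is ridiculous.", "Let me speak to a human."]
    print(should_escalate(history))  # True -> route to a live agent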

