If you ask any marketing expert, customer engagement is simply about talking to the customer and reeling them in when the time’s right. This means being there for the user whenever they look for you throughout their lifecycle, and therein lies the trick: how can you be sure you’re there at all times, and especially when it matters most to the customer?
Through our preview journey in the past two years, we have learned a lot from interacting with thousands of customers undergoing digital transformation. We highlighted some of our customer stories (such as UPS, Equadex, and more) in our general availability announcement. This post covers conversational AI in a nutshell using Azure Bot Service and LUIS, shares what we’ve learned so far, and dives into the new capabilities. We will also show how easy it is to get started building a conversational bot with natural language.
Let’s take a weather chatbot as an example to examine the capabilities of Scripted and Structured chatbots. The question “Will it rain on Sunday?” can be easily answered. However, if there is no programming for the question “Will I need an umbrella on Sunday?” then the chatbot will not understand the query. This is a common limitation of scripted and structured chatbots; in every case, a conversational bot can only be as intelligent as the programming it has been given.
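To make that limitation concrete, here is a minimal illustrative sketch (the lookup table and replies are invented for the example): a scripted bot answers only the exact questions it was programmed for, and every unprogrammed paraphrase falls through to a fallback reply.

```python
# Minimal sketch of a scripted weather bot: it only answers questions it was
# explicitly programmed for, so an unprogrammed paraphrase falls through.
SCRIPTED_RESPONSES = {
    "will it rain on sunday?": "Yes, light rain is expected on Sunday.",
}

def scripted_reply(message: str) -> str:
    return SCRIPTED_RESPONSES.get(
        message.strip().lower(),
        "Sorry, I don't understand that question.",
    )

print(scripted_reply("Will it rain on Sunday?"))             # programmed -> answered
print(scripted_reply("Will I need an umbrella on Sunday?"))  # not programmed -> fallback
```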
Multinomial Naive Bayes is the classic algorithm for text classification and NLP. For instance, assume we are given a set of sentences, each belonging to a particular class. For a new input sentence, each word is counted, its occurrence is weighted by how common it is in each class, and each class is assigned a score. The highest-scoring class is the one most likely to be associated with the input sentence.
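As a rough sketch of how this scoring works in practice, the snippet below trains a Multinomial Naive Bayes intent classifier with scikit-learn; the training sentences and intent labels are made up for illustration.

```python
# Illustrative Multinomial Naive Bayes intent classifier using scikit-learn.
# The training sentences and intent labels are invented for the example.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

training_sentences = [
    "will it rain on sunday", "what is the forecast for tomorrow",
    "book me a taxi to the airport", "i need a cab at 6 pm",
]
training_labels = ["weather", "weather", "taxi", "taxi"]

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(training_sentences, training_labels)

# Word counts in the new sentence score each class; the highest-scoring class wins.
print(model.predict(["do i need an umbrella on sunday"])[0])  # likely "weather"
```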
Evie's capacities go beyond mere verbal or textual interactions; the AI utilised in Evie also extends to controlling the timing and degree of facial expressions and movement. Her visually displayed reactions and emotions blend and vary in surprisingly complex ways, and a range of voices are delivered to your browser, along with lip synching information, to bring the avatar to life! Evie uses Flash if your browser supports it, but still works even without, thanks to our own Existor Avatar Player technology, allowing you to enjoy her to the full on iOS and Android.
We then ran a second test with a very specific topic aimed at answering very specific questions that a small segment of their audience was interested in. There, the engagement was much higher (97% open rate, 52% click-through rate on average over the duration of the test). Interestingly, drop-off went way down as well. At the end of this test, only 0.29% of the users had unsubscribed.

Utility bots solve a user's problem, whatever that may be, via a user-prompted transaction. The most obvious example is a shopping bot, such as one that helps you order flowers or buy a new jacket. According to a recent HubSpot Research study, 47% of shoppers are open to buying items from a bot. But utility bots are not limited to making purchases. A utility bot could automatically book meetings by scanning your emails or notify you of the payment subscriptions you forgot you were signed up for.
In a bot, everything begins with the root dialog. The root dialog invokes the new order dialog. At that point, the new order dialog takes control of the conversation and remains in control until it either closes or invokes other dialogs, such as the product search dialog. If the new order dialog closes, control of the conversation is returned to the root dialog.
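Conceptually, this behaves like a stack of dialogs. The sketch below mimics the flow described above rather than any particular framework’s API; the dialog names are taken from the example.

```python
# Conceptual sketch of a dialog stack (not the actual Bot Framework API):
# the dialog on top of the stack controls the conversation, and closing it
# returns control to the dialog beneath it.
class DialogStack:
    def __init__(self):
        self.stack = ["root"]           # everything begins with the root dialog

    def begin(self, dialog_name: str):
        self.stack.append(dialog_name)  # the new dialog takes control

    def end(self) -> str:
        return self.stack.pop()         # control returns to the previous dialog

    def active(self) -> str:
        return self.stack[-1]

dialogs = DialogStack()
dialogs.begin("new_order")       # root invokes the new order dialog
dialogs.begin("product_search")  # new order invokes the product search dialog
dialogs.end()                    # product search closes -> back to new_order
dialogs.end()                    # new order closes -> back to root
print(dialogs.active())          # "root"
```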

There is no one right answer to this question, as the best solution will depend on the specifics of your scenario and how the user would reasonably expect the bot to respond. However, as your conversation’s complexity increases, dialogs become harder to manage. For complex branching situations, it may be easier to create your own flow-of-control logic to keep track of your user’s conversation.
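What that flow-of-control logic looks like will vary, but one common shape is a small state machine that records where the user is in the conversation. The sketch below is purely illustrative; the states and transitions are invented.

```python
# Hand-rolled conversation state machine: each handler records what the user
# said and decides which step comes next.
conversation_state = {"step": "ask_category"}

def handle(message: str) -> str:
    step = conversation_state["step"]
    if step == "ask_category":
        conversation_state["category"] = message
        conversation_state["step"] = "ask_size"
        return "Which size would you like?"
    if step == "ask_size":
        conversation_state["size"] = message
        conversation_state["step"] = "confirm"
        return f"Confirm order: {conversation_state['category']}, size {message}?"
    if step == "confirm":
        conversation_state["step"] = "done"
        return "Thanks, your order is placed." if message.lower() == "yes" else "Order cancelled."
    return "This conversation has ended."
```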
To keep chatbots up to speed with changing company products and services, traditional chatbot development platforms require ongoing maintenance. This can either be in the form of an ongoing service provider or, for larger enterprises, an in-house chatbot training team.[38] To eliminate these costs, some startups are experimenting with Artificial Intelligence to develop self-learning chatbots, particularly in Customer Service applications.

Eventually, a single chatbot could become your own personal assistant to take care of everything, whether it's calling you an Uber or setting up a meeting. Or, Facebook Messenger or another platform might let a bunch of individual chatbots talk to you about whatever is relevant — a chatbot from Southwest Airlines could tell you your flight's delayed, another chatbot from FedEx could tell you your package is on the way, and so on.
Natural Language Processing (NLP) is the technological process by which computers derive meaning from natural human inputs. NLP-Based Conversational Bots are machine learning bots that exploit the power of artificial intelligence, which gives them a “learning brain.” These conversational bots can understand natural language and, unlike Scripted and Structured Conversational Bots, do not require specific instructions in order to respond to questions.
Chatting with a bot should be like talking to a human that knows everything. If you're using a bot to change an airline reservation, the bot should know if you have an unused credit on your account and whether you typically pick the aisle or window seat. Artificial intelligence will continue to radically shape this front, but a bot should connect with your current systems so a shared contact record can drive personalization.
What does the Echo have to do with conversational commerce? While the most common uses of the device include playing music, making informational queries, and controlling home devices, Alexa (the device’s default addressable name) can also tap into Amazon’s full product catalog as well as your order history and intelligently carry out commands to buy stuff. You can re-order commonly ordered items, or even have Alexa walk you through some options in purchasing something you’ve never ordered before.
Chatbots have come a long way since then. They are built on AI technologies, including deep learning, natural language processing and machine learning algorithms, and require massive amounts of data. The more an end user interacts with the bot, the better its voice recognition becomes at predicting the appropriate response.
Artificial Intelligence is currently being deployed in customer service to both augment and replace human agents, with the primary goals of improving the customer experience and reducing human customer service costs. While the technology is not yet able to perform all the tasks a human customer service representative could, many consumer requests are very simple asks that can sometimes be handled by current AI technologies without human input.
“A very common request that we get is people want to practice conversation,” said Duolingo's co-founder and CEO, Luis von Ahn. The company originally tried pairing up non-native speakers with native speakers for practice sessions, but according to von Ahn, "about three-quarters of the people we try it with are very embarrassed to speak in a foreign language with another person."

Niki is a personal assistant that has been developed in India to perform an impressively wide variety of tasks, including booking taxis, buses, hotels, movies and events, paying utilities and recharging your phone, and even organizing laundry pickup and delivery. The application has proven to be a huge success across India and won the Deep Tech prize at the 2017 AWS Mobility Awards.


Before you even write a single line of code, it's important to write a functional specification so the development team has a clear idea of what the bot is expected to do. The specification should include a reasonably comprehensive list of user inputs and expected bot responses in various knowledge domains. This living document will be an invaluable guide for developing and testing your bot.
Expecting your customer care team to be able to answer every single inquiry on your social media profiles is not only unrealistic, but also extremely time-consuming, and therefore, expensive. With a chatbot, you're making yourself available to consumers 24 hours a day, seven days a week. Aside from saving you money, chatbots will help you keep your social media presence fresh and active.

Simplified and scripted. Chatbot technology is being tacked on to the broader AI message, and while it’s important to note that machine learning will help chatbots get better at understanding and responding to questions, it’s not going to make them the conversationalists we dream them to be. No matter what the marketing says, chatbots are entirely scripted. User says x, chatbot responds y.
Not integrated. This goes hand-in-hand with contextual knowledge, but chatbots often suffer from “death by data silo,” where their access to data is limited. If a chatbot is “chatting with” a customer, it not only needs that customer’s contextual data but also access to every place where the answer to the customer’s question may reside. The product documentation site, the customer community, and other websites are all places where that answer can live.

Operator calls itself a “request network” aiming to “unlock the 90% of commerce that’s not on the internet.” The Operator app, developed by Uber co-founder Garrett Camp, connects you with a network of “operators” who act like concierges who can execute any shopping-related request. You can order concert tickets, get gift ideas, or even get interior design recommendations for new furniture. Operator seems to be positioning itself towards “high consideration” purchases, bigger ticket purchases requiring more research and expertise, where its operators can add significant value to a transaction.
“I’ve seen a lot of hyperbole around bots as the new apps, but I don’t know if I believe that,” said Prashant Sridharan, Twitter’s global director of developer relations. “I don’t think we’re going to see this mass exodus of people stopping building apps and going to build bots. I think they’re going to build bots in addition to the app that they have or the service they provide.”
Your first question is how much of it does she want? 1 litre? 500ml? 200? She tells you she wants a 1 litre Tropicana 100% Orange Juice. Now you know that regular Tropicana is easily available, but 100% is hard to come by, so you call up a few stores beforehand to see where it’s available. You find one store that’s pretty close by, so you go back to your mother and tell her you found what she wanted. It’s $3 and after asking her for the money, you go on your way.
How far are we from building systems with commonsense? One often-heard answer is: not in the near future, while the realistic answer is: we don’t know. Last year, I spent some time trying to build a system that can do better than an information retrieval baseline at taking a fourth-grade science exam (and which still has a ways to go to reach a passing score of 65%). I failed hard. Here’s an example to get a sense of the difficulty of these questions.
For every question or instruction input to the conversational bot, there must be a specific pattern in the database to provide a suitable response. Where several combinations of patterns are available, a hierarchical pattern is created. In these cases, algorithms are used to reduce the number of classifiers and generate a more manageable structure. This is the “reductionist” approach: to arrive at a simplified solution, it reduces the problem.
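As an illustration of this pattern-and-hierarchy idea (the topics, patterns, and replies below are invented), grouping patterns under a coarse topic first keeps the number of classifiers consulted per message manageable:

```python
# Illustrative pattern-to-response lookup with a reduced, hierarchical structure:
# patterns are grouped under a coarse topic, so only that topic's patterns are
# checked for a given message.
import re

PATTERN_HIERARCHY = {
    "weather": [
        (re.compile(r"\brain\b"), "Rain is expected this weekend."),
        (re.compile(r"\bumbrella\b"), "Yes, take an umbrella on Sunday."),
    ],
    "orders": [
        (re.compile(r"\btrack\b.*\border\b"), "Your order is on its way."),
    ],
}

def respond(topic: str, message: str) -> str:
    for pattern, reply in PATTERN_HIERARCHY.get(topic, []):
        if pattern.search(message.lower()):
            return reply
    return "Sorry, I don't have a pattern for that yet."

print(respond("weather", "Will I need an umbrella on Sunday?"))
```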
What if you’re creating a bot for a major online clothing retailer? For starters, the bot will require a greeting (“How can I help you?”) as well as a process for saying its goodbyes. In between, the bot needs to respond to inputs, which could range from shopping inquiries to questions about shipping rates or return policies, and the bot must possess a script for fielding questions it doesn’t understand.
Tay was built to learn the way millennials converse on Twitter, with the aim of being able to hold a conversation on the platform. In Microsoft’s words: “Tay has been built by mining relevant public data and by using AI and editorial developed by a staff including improvisational comedians. Public data that’s been anonymised is Tay’s primary data source. That data has been modelled, cleaned and filtered by the team developing Tay.”

Despite the fact that ALICE relies on such an old codebase, the bot offers users a remarkably accurate conversational experience. Of course, no bot is perfect, especially one that’s old enough to legally drink in the U.S. if only it had a physical form. ALICE, like many contemporary bots, struggles with the nuances of some questions and returns a mixture of inadvertently postmodern answers and statements that suggest ALICE has greater self-awareness than we might give the agent credit for.
Designing for conversational interfaces represents a big shift in the way we are used to thinking about interaction. Chatbots have fewer signifiers and affordances than websites and apps – which means words have to work harder to deliver clarity, cohesion and utility for the user. It is a change of paradigm that requires designers to re-wire their brains, their deliverables and their design process to create successful bot experiences.
Regardless of which type of classifier is used, the end result is a response. Like a music box, there can be additional “movements” associated with the machinery. A response can make use of external information (like the weather, a sports score, or a web lookup), but this isn’t specific to chatbots; it’s just additional code. A response may reference specific “parts of speech” in the sentence, for example a proper noun. The response for an intent can also use conditional logic to provide different replies depending on the “state” of the conversation, or make a random selection among equivalent templates to insert some “natural” feeling.
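A hypothetical sketch of that response stage might look like the following: the matched intent selects a set of templates, conditional logic on the conversation “state” changes the wording, and a random choice among equivalent templates adds a more natural feel. The intents, templates, and state flags are invented for the example.

```python
# Intent -> templates, with state-dependent wording and random variation.
import random

RESPONSES = {
    "greet": {
        "new_user": ["Welcome! How can I help you today?"],
        "returning": ["Good to see you again!", "Welcome back!"],
    },
    "weather": {
        "any": ["{city} looks rainy this weekend.", "Expect showers in {city}."],
    },
}

def respond(intent: str, state: str, city: str = "") -> str:
    # Conditional logic on the conversation state, with a generic fallback.
    templates = RESPONSES[intent].get(state, RESPONSES[intent].get("any", ["Sorry, I'm not sure."]))
    # Random selection among equivalent templates for a more natural feel.
    return random.choice(templates).format(city=city)

print(respond("greet", "returning"))
print(respond("weather", "any", city="Seattle"))  # "Seattle" fills the proper-noun slot
```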
The main challenge is in teaching a chatbot to understand the language of your customers. In every business, customers express themselves differently, and each segment of a target audience speaks in its own way. The language is influenced by advertising campaigns in the market, the political situation in the country, and releases of new services and products from Google, Apple and Pepsi, among others. The way people speak depends on their city, mood, the weather and the phase of the moon. Even the release of a film like Star Wars can play an important role in how a business’s customers communicate. That’s why training a chatbot to correctly understand everything the user types requires a lot of effort.

When we open our news feed and find out about yet another AI breakthrough (IBM Watson, driverless cars, AlphaGo), the notion of TODA may feel decidedly anticlimactic. The reality is that current AI is not quite 100% turnkey-ready for TODA. This will soon change due to two key factors: 1) businesses want it, and 2) businesses have abundant data, the fuel that current state-of-the-art machine learning techniques need to make AI work.

2. Flow-based: these work on user interaction with buttons and text. If you have used Matthew’s chatbot, that is a flow-based chatbot. The chatbot asks a question, then offers options in the form of buttons (Matthew’s has a yes/no option). These are more limited, but they let you tightly steer the conversation and make sure your users don’t stray off the path.
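As a rough illustration of such a flow (the questions, buttons, and transitions are invented), each step offers a fixed set of buttons and each button maps to exactly one next step:

```python
# Minimal flow-based (button-driven) exchange: the user can only move along
# the paths defined by the buttons, so they never stray off the script.
FLOW = {
    "start": {
        "question": "Would you like to subscribe to our newsletter?",
        "buttons": {"Yes": "choose_frequency", "No": "goodbye"},
    },
    "choose_frequency": {
        "question": "How often should we message you?",
        "buttons": {"Daily": "goodbye", "Weekly": "goodbye"},
    },
    "goodbye": {"question": "Thanks, that's all set!", "buttons": {}},
}

def render(state: str) -> str:
    node = FLOW[state]
    options = " / ".join(node["buttons"]) or "(end of flow)"
    return f"{node['question']}  [{options}]"

def press(state: str, button: str) -> str:
    return FLOW[state]["buttons"][button]  # the chosen button decides the next step

state = "start"
print(render(state))            # offers Yes / No
state = press(state, "Yes")
print(render(state))            # offers Daily / Weekly
```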
In so many ways I think chatbots are only just getting started – their potential is much underestimated at present. A big challenge is for chatbots to mature so that they do more than is possible with content-entry wizards. If your content is created with a few easy clicks, it is unlikely to be much of an inspiration to anyone – and to date, despite much work in the field, the ability to emulate the creative, open-ended nature of real intelligence has seen only very partial success.
As with many 'organic' channels, the relative reach of your audience tends to decline over time due to a variety of factors. In email's case, it can be the over-exposure to marketing emails and moves from email providers to filter out promotional content; with other channels it can be the platform itself. Back in 2014 I wrote about how "Facebook's Likes Don't Matter Anymore" in relation to the declining organic reach of Facebook pages. Last year alone the organic reach of publishers on Facebook fell by a further 52%.
In one particularly striking example of how this rather limited bot has made a major impact, U-Report sent a poll to users in Liberia about whether teachers were coercing students into sex in exchange for better grades. Approximately 86% of the 13,000 Liberian children U-Report polled responded that their teachers were engaged in this despicable practice, which resulted in a collaborative project between UNICEF and Liberia’s Minister of Education to put an end to it.
Other companies are exploring ways to use chatbots internally, for example for Customer Support, Human Resources, or even in Internet-of-Things (IoT) projects. Overstock, for one, has reportedly launched a chatbot named Mila to automate certain simple yet time-consuming processes when requesting sick leave.[24] Other large companies such as Lloyds Banking Group, Royal Bank of Scotland, Renault and Citroën are now using automated online assistants instead of call centres with humans to provide a first point of contact. A SaaS chatbot business ecosystem has been steadily growing since the F8 Conference, when Zuckerberg unveiled that Messenger would allow chatbots into the app.[25]
How: this is a relatively simple flow to manage, and it could be one part of a much larger bot if you prefer. All you'll need to do is set up the initial flow within Chatfuel to ask the user if they'd like to subscribe to receive content, and if so, how frequently they would like to be updated. Then you can store their answer as a variable that you use for automation.
[In] artificial intelligence ... machines are made to behave in wondrous ways, often sufficient to dazzle even the most experienced observer. But once a particular program is unmasked, once its inner workings are explained ... its magic crumbles away; it stands revealed as a mere collection of procedures ... The observer says to himself "I could have written that". With that thought he moves the program in question from the shelf marked "intelligent", to that reserved for curios ... The object of this paper is to cause just such a re-evaluation of the program about to be "explained". Few programs ever needed it more.[8]