Next, identify the data sources that will enable the bot to interact intelligently with users. As mentioned earlier, these data sources could contain structured, semi-structured, or unstructured data sets. When you're getting started, a good approach is to make a one-off copy of the data to a central store, such as Cosmos DB or Azure Storage. As you progress, you should create an automated data ingestion pipeline to keep this data current. Options for an automated ingestion pipeline include Data Factory, Functions, and Logic Apps. Depending on the data stores and the schemas, you might use a combination of these approaches.
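As a rough illustration, here is a minimal sketch of that one-off copy step, assuming the azure-cosmos Python package; the account URL, key, database and container names, and the fetch_source_records() helper are hypothetical placeholders standing in for your own source system.

```python
# Minimal sketch: copy source records into Cosmos DB as a one-off load.
# Assumptions: azure-cosmos is installed; URL/key/names are placeholders.
from azure.cosmos import CosmosClient

COSMOS_URL = "https://<your-account>.documents.azure.com:443/"
COSMOS_KEY = "<your-key>"

def fetch_source_records():
    # Placeholder for reading from your structured or semi-structured source.
    return [{"id": "1", "category": "faq", "question": "...", "answer": "..."}]

client = CosmosClient(COSMOS_URL, credential=COSMOS_KEY)
container = client.get_database_client("botdata").get_container_client("knowledge")

for record in fetch_source_records():
    # Upsert keeps the copy idempotent if the script is re-run.
    container.upsert_item(record)
```

An automated pipeline (Data Factory, Functions, or Logic Apps) would run an equivalent step on a schedule or trigger to keep the central store current.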

DevOps has emerged as a mainstream focus in redefining the world of software and infrastructure engineering and operations over the last few years. DevOps is all about developing a CAMS culture: culture, automation, measurement, and sharing. The staggering popularity of the platform is attributed to the numerous benefits it brings in terms […]
Once your bot is running in production, you will need a DevOps team to keep it that way. Continually monitor the system to ensure the bot operates at peak performance. Use the logs sent to Application Insights or Cosmos DB to create monitoring dashboards, either using Application Insights itself, Power BI, or a custom web app dashboard. Send alerts to the DevOps team if critical errors occur or performance falls below an acceptable threshold.
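As a hedged example of the telemetry side of this, the sketch below emits a custom metric and a custom event that a dashboard or alert rule could be built on, assuming the applicationinsights Python package; the instrumentation key, metric name, and event properties are illustrative.

```python
# Minimal sketch: emit custom telemetry for monitoring dashboards and alerts.
# Assumptions: the applicationinsights package; names/values are illustrative.
from applicationinsights import TelemetryClient

tc = TelemetryClient("<instrumentation-key>")

# Record how long the bot took to answer, so a dashboard can chart latency.
tc.track_metric("bot_response_ms", 420)

# Record a critical failure that an alert rule could be configured to fire on.
tc.track_event("critical_error", {"component": "qna_lookup", "error": "timeout"})

tc.flush()
```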
Your bot can use other AI services to further enrich the user experience. The Cognitive Services suite of pre-built AI services (which includes LUIS and QnA Maker) has services for vision, speech, language, search, and location. You can quickly add functionality such as language translation, spell checking, sentiment analysis, OCR, location awareness, and content moderation. These services can be wired up as middleware modules in your bot to interact more naturally and intelligently with the user.
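To make the middleware idea concrete, here is a minimal sketch of a translation middleware written against the Bot Framework SDK for Python and the Translator Text REST API; the subscription key, target language, and class name are assumptions, and a production version would need error handling and region headers.

```python
# Minimal sketch: translate incoming messages before the bot logic sees them.
# Assumptions: botbuilder-core and requests installed; key/language are placeholders.
import requests
from botbuilder.core import Middleware, TurnContext

TRANSLATE_URL = "https://api.cognitive.microsofttranslator.com/translate"

class TranslationMiddleware(Middleware):
    def __init__(self, subscription_key: str, to_lang: str = "en"):
        self._key = subscription_key
        self._to = to_lang

    async def on_turn(self, context: TurnContext, logic):
        # Translate the user's message text on the way in.
        if context.activity.type == "message" and context.activity.text:
            resp = requests.post(
                TRANSLATE_URL,
                params={"api-version": "3.0", "to": self._to},
                headers={"Ocp-Apim-Subscription-Key": self._key},
                json=[{"text": context.activity.text}],
            )
            context.activity.text = resp.json()[0]["translations"][0]["text"]
        # Continue with the rest of the middleware pipeline and bot logic.
        await logic()
```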
It won’t be an easy march, though, once we get to the nitty-gritty details. For example, I heard through the grapevine that when Starbucks looked at the voice data they collected from customer orders, they found that there are a few million unique ways to order. (For those in the field, I’m talking about unique user utterances.) This is to be expected given the wild combinations of latte vs. mocha, dairy vs. soy, grande vs. trenta, extra-hot vs. iced, room vs. no-room, for-here vs. to-go, snack variety, spoken accent diversity, etc. The AI practitioner will soon curse all these dimensions before taking a deep learning breath and getting to work. I feel, though, that given practically unlimited data, deep learning is now good enough to overcome this problem, and it is only a matter of a couple of years until we see these TODA solutions deployed. One technique to watch is Generative Adversarial Nets (GANs). Roughly speaking, a GAN engages in an iterative game of counterfeiting real stuff, getting caught by the police neural network, improving its counterfeiting skill, and rinse-and-repeating until it can pass as your Starbucks order-taking person, given enough data and iterations.
Consumers really don’t like your chatbot. It’s not exactly a relationship built to last — a few clicks here, a few sentences there — but Forrester Analytics data shows us very clearly that, to consumers, your chatbot isn’t exactly “swipe right” material. That’s unfortunate, because using a chatbot for customer service can be incredibly effective when done […]
While messaging and voice interfaces are central components, they fit into a larger picture of increasing infusion of technology into our daily lives, which in turn is unlocking new potential for brand-to-consumer interaction. The fact is, technology overall is becoming more deeply woven into our lives, and the entire ecosystem is enjoying tighter cohesion through the increasing availability and sophistication of APIs. Smart companies are finding new and innovative touch points with consumers that are contextual, relevant, highly personal, and yes, conversational. Commerce is becoming not only more conversational but more ubiquitous and seamlessly integrated into our lives, and the way we interact with brands will be forever changed as a result.
Magic, launched in early 2015, is one of the earliest examples of conversational commerce, offering one of the first all-in-one intelligent virtual assistants as a service. Unique in that the service does not even have an app (you access it purely via SMS), Magic promises to handle virtually any task you send it, almost like a human executive assistant. Based on user and press accounts, Magic seems to be able to successfully carry out a variety of odd tasks, from setting up flight reservations to ordering hard-to-find food items.

You may remember Facebook’s big chatbot push in 2016, when they announced that they were opening up the Messenger platform to chatbots of all varieties. Every organization suddenly needed to get their hands on the technology. The idea of having conversational chatbot technology was enthralling, but behind all the glitz, glamour, and tech sex appeal was something a little less exciting. To quote Gizmodo writer Darren Orf:
Indeed, this is one of the key benefits of chatbots – providing a 24/7/365 presence that can give prospects and customers access to information no matter when they need it. This, in turn, can result in cost-savings for companies that deploy chatbots, as they cut down on the labour-hours that would be required for staff to manage a direct messaging service every hour of the week.
I argued that it is super hard to scale a one-trick TODA into a general assistant that helps the user get things done across multiple tasks. An intelligent assistant is arguably also expected to hold informal chit-chat with the user. It is in this area that we are staring at perhaps the biggest challenge in AI. Observe how Samantha introduces herself to Joaquin Phoenix’s Ted in the clip below:
A virtual assistant is an app that comprehends natural, ordinary language voice commands and carries out tasks for the users. Well-known virtual assistants include Amazon Alexa, Apple’s Siri, Google Now and Microsoft’s Cortana. Also, virtual assistants are generally cloud-based programs so they need internet-connected devices and/or applications in order to work. Virtual assistants can perform tasks like adding calendar appointments, controlling and checking the status of a smart home, sending text messages, and getting directions.

Online chatbots save time and effort by automating customer support. Gartner forecasts that by 2020, over 85% of customer interactions will be handled without a human. However, the opportunities provided by chatbot systems go far beyond giving responses to customers’ inquiries. They are also used for other business tasks, like collecting information about users, helping to organize meetings, and reducing overhead costs. It is no wonder that the size of the chatbot market is growing exponentially.


Over the past year, Forrester clients have been brimming with questions about chatbots and their role in customer service. In fact, in that time, more than half of the client inquiries I have received have touched on chatbots, artificial intelligence, natural language understanding, machine learning, and conversational self-service. Many of those inquiries were of the […]
Each student learns and absorbs things at a different pace and requires a specific methodology of teaching. Consequently, one of the most powerful advantages of getting educated by a chatbot is its flexibility and ability to adapt to the specific needs and requirements of a particular student. Chatbots can be used across a wide spectrum, whether teaching people how to build websites, helping them learn a new language, or something more generic like teaching children math. Chatbots are capable of adapting to the speed at which each student is comfortable, without being too pushy or overwhelming.
Last, but not least, coming in with the bot platform for business is FlowXO, which creates bots for Messenger, Slack, SMS, Telegram, and the web. This platform offers flexibility by giving you the option to create a fully automated bot, a human-operated one, or a hybrid of both. Chatbot expert Murray Newlands commented that "Where 10 years ago every company needed a website and five years ago every company needed an app, now every company needs to embrace messaging with AI and chatbots."
Evie's capacities go beyond mere verbal or textual interactions; the AI utilised in Evie also extends to controlling the timing and degree of facial expressions and movement. Her visually displayed reactions and emotions blend and vary in surprisingly complex ways, and a range of voices are delivered to your browser, along with lip synching information, to bring the avatar to life! Evie uses Flash if your browser supports it, but still works even without, thanks to our own Existor Avatar Player technology, allowing you to enjoy her to the full on iOS and Android.
Neural networks are among the major tools applied in machine learning. They are brain-inspired processing tools that loosely replicate how humans learn. And now that we’ve replicated the way we learn, these systems are capable of taking that processing power to a level where even greater volumes of more complex data can be understood by the machine.
Marketing teams are increasingly interested in leveraging branded chatbots, but most struggle to deliver business value. My recently published report, Case Study: Take A Focused And Disciplined Approach To Drive Chatbot Success, shows how OCBC Bank in Singapore is bucking the trend: The bank recently created Emma, a chatbot focused on home loan leads, which […]
Specialized conversational bots can be used to make professional tasks easier. For example, a conversational bot could be used to retrieve information faster compared to a manual lookup; simply ask, “What was the patient’s blood pressure in her May visit?” The conversational bot will answer instantly instead of the user perusing through manual or electronic records.

In a traditional application, the user interface (UI) is a series of screens. A single app or website can use one or more screens as needed to exchange information with the user. Most applications start with a main screen where users initially land and provide navigation that leads to other screens for various functions like starting a new order, browsing products, or looking for help.
Chatbots give businesses a way to deliver this information in a comfortable, conversational manner. Customers can have all their questions answered without the pressure or obligation that make some individuals wary of interacting with a live salesperson. Once they’ve obtained enough information to make a decision, a chatbot can introduce a human representative to take the sale the rest of the way.
Like most applications, a chatbot is connected to a database. The knowledge base, or database of information, is used to feed the chatbot with the information needed to give a suitable response to the user. Data about users’ activities, and whether or not your chatbot was able to match their questions, is captured in the data store. NLP translates human language into information, using a combination of patterns and text that can be mapped in real time to find applicable responses.
You can structure these modules to flow in any way you like, ranging from free-form to sequential. The Bot Framework SDK provides several libraries that allow you to construct any conversational flow your bot needs. For example, the prompts library allows you to ask users for input, the waterfall library allows you to define a sequence of question/answer pairs, the dialog control library allows you to modularize your conversational flow logic, and so on. All of these libraries are tied together through a dialogs object. Let's take a closer look at how modules are implemented as dialogs to design and manage conversation flows and see how that flow is similar to the traditional application flow.
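The sketch below shows a two-step waterfall dialog using the Bot Framework SDK's botbuilder-dialogs package for Python (the SDK's C# and Node.js libraries follow the same pattern); the dialog IDs, class name, and prompt text are illustrative.

```python
# Minimal sketch of a waterfall dialog (botbuilder-dialogs for Python).
# Assumptions: dialog IDs and the OrderDialog name are illustrative.
from botbuilder.core import MessageFactory
from botbuilder.dialogs import (
    ComponentDialog, WaterfallDialog, WaterfallStepContext, DialogTurnResult,
)
from botbuilder.dialogs.prompts import TextPrompt, PromptOptions


class OrderDialog(ComponentDialog):
    def __init__(self):
        super().__init__("orderDialog")
        # Register a prompt and a waterfall (an ordered list of steps).
        self.add_dialog(TextPrompt("textPrompt"))
        self.add_dialog(
            WaterfallDialog("orderWaterfall", [self.ask_item_step, self.confirm_step])
        )
        self.initial_dialog_id = "orderWaterfall"

    async def ask_item_step(self, step: WaterfallStepContext) -> DialogTurnResult:
        # Step 1: prompt the user for the item they want.
        return await step.prompt(
            "textPrompt",
            PromptOptions(prompt=MessageFactory.text("What would you like to order?")),
        )

    async def confirm_step(self, step: WaterfallStepContext) -> DialogTurnResult:
        # Step 2: echo the captured answer back and end the dialog.
        item = step.result
        await step.context.send_activity(MessageFactory.text(f"Got it: {item}."))
        return await step.end_dialog(item)
```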
Context: When an NLU algorithm analyzes a sentence, it does not have the history of the user conversation. That means that if it receives the answer to a question it has just asked, it will not remember the question. To differentiate the phases of the chat conversation, its state should be stored. The state can be flags like “Ordering Pizza” or parameters like “Restaurant: ‘Dominos’”. With context, you can easily relate intents without needing to know what the previous question was.
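As a purely illustrative sketch, context can be represented as a small state object carried from turn to turn, mixing flag-style and parameter-style values; the field names below are hypothetical.

```python
# Minimal, illustrative sketch of carrying conversation context between turns.
context = {
    "active_intent": "order_pizza",      # flag-style context ("Ordering Pizza")
    "slots": {"restaurant": "Dominos"},  # parameter-style context
}

def interpret(user_message: str, context: dict) -> dict:
    # Attach the stored context so a bare answer like "large" can be resolved
    # against the question that was just asked.
    return {
        "text": user_message,
        "intent": context["active_intent"],
        "slots": dict(context["slots"]),
    }

print(interpret("large", context))
```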
Training a chatbot happens at a much faster and larger scale than teaching a human. Human customer service representatives are given manuals and are expected to read and understand them. A customer support chatbot, by contrast, is fed thousands of conversation logs, and from those logs it is able to understand what type of question requires what type of answer.
The Evie chatbot has had a huge impact on social media over the last few years. She is probably the most popular artificial personality on YouTube. She has appeared in several videos by PewdiePie, the most subscribed YouTuber in the world. This includes a flirting video with over 12 million views! Evie has been filmed speaking many different languages. She chats with Squeezie in French, El Rubius and El Rincón De Giorgio in Spanish, GermanLetsPlay and ConCrafter in German, NDNG - Enes Batur in Turkish, Stuu Games in Polish and jacksepticeye, ComedyShortsGamer and KSIOlajidebtHD in English. And that is a very small selection. Evie shares her database with Cleverbot, which is an internet star in its own right. Cleverbot conversations have long been shared on Twitter, Facebook, websites, forums and bulletin boards. We are currently working to give Evie some more artificial companions, such as the male avatar Boibot.
Consider why someone would turn to a bot in the first place. According to an upcoming HubSpot research report, of the 71% of people willing to use messaging apps to get customer assistance, many do it because they want their problem solved, fast. And if you've ever used (or possibly profaned) Siri, you know there's a much lower tolerance for machines to make mistakes.
“There is hope that consumers will be keen on experimenting with bots to make things happen for them. It used to be like that in the mobile app world 4+ years ago. When somebody told you back then… ‘I have built an app for X’… You most likely would give it a try. Now, nobody does this. It is probably too late to build an app company as an indie developer. But with bots… consumers’ attention spans are hopefully going to be wide open/receptive again!” — Niko Bonatsos, Managing Director at General Catalyst
To get started, you can build your bot online using the Azure Bot Service, selecting from the available C# and Node.js templates. As your bot gets more sophisticated, however, you will need to create your bot locally and then deploy it to the web. Choose an IDE, such as Visual Studio or Visual Studio Code, and a programming language; SDKs are available for languages such as C# and Node.js.
Intents: An intent is basically the action the chatbot should perform when the user says something. For instance, the same intent can be triggered whether the user types “I want to order a red pair of shoes”, “Do you have red shoes? I want to order them”, or “Show me some red pair of shoes”; all of these utterances should trigger a single command that gives the user options for a red pair of shoes.
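A toy sketch of this idea follows: several differently phrased utterances resolve to one intent, which triggers a single command. The naive keyword matcher stands in for a real NLU model such as LUIS; the intent name and handler are illustrative.

```python
# Illustrative sketch: many utterances map to one intent, which triggers one command.
TRAINING_UTTERANCES = {
    "order_red_shoes": [
        "I want to order a red pair of shoes",
        "Do you have red shoes? I want to order them",
        "Show me some red pair of shoes",
    ],
}

def classify(utterance: str) -> str:
    # Stand-in for a real NLU model (e.g. LUIS): a naive keyword match.
    text = utterance.lower()
    if "red" in text and "shoes" in text:
        return "order_red_shoes"
    return "none"

def handle(intent: str):
    if intent == "order_red_shoes":
        print("Showing options for red shoes...")  # the single triggered command

handle(classify("Show me some red pair of shoes"))
```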
What does the Echo have to do with conversational commerce? While the most common uses of the device include playing music, making informational queries, and controlling home devices, Alexa (the device’s default addressable name) can also tap into Amazon’s full product catalog, as well as your order history, and intelligently carry out commands to buy stuff. You can re-order commonly ordered items, or even have Alexa walk you through some options when purchasing something you’ve never ordered before.
Whilst the payout wasn't huge in the early days of Amazon, those who got in early are now seeing huge rewards, with 38% of shoppers starting their buying journey within Amazon (source), making it the number one retail search engine. Some studies suggest that Amazon is responsible for 80% of e-commerce growth for publicly traded web retailers (source).
Do the nature of our services and size of our customer base warrant an investment in a more efficient and automated customer service response? How can we offer a more streamlined experience without (necessarily) increasing costly human resources?  Amtrak’s website receives over 375,000 daily visitors, and they wanted a solution that provided users with instant access to online self-service.

When considering potential uses, first assess the impact on resources. There are two options here: replacement or empowerment. Replacement is clearly easier as you don’t need to consider integration with existing processes and you can build from scratch. Empowerment enhances an existing process by making it more flexible, accommodating, accessible and simple for users.


Another option is to integrate your own custom AI service. This approach is more complex, but it gives you complete flexibility in terms of the machine learning algorithm, training, and model. For example, you could implement your own topic modeling and use an algorithm such as LDA to find similar or relevant documents. A good approach is to expose your custom AI solution as a web service endpoint, and call the endpoint from the core bot logic. The web service could be hosted in App Service or in a cluster of VMs. Azure Machine Learning provides a number of services and libraries to assist you in training and deploying your models.
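As a hedged sketch of that pattern, the example below wraps a small LDA topic model behind an HTTP endpoint that the core bot logic could call; it assumes Flask and scikit-learn, and the /similar route, toy corpus, and scoring scheme are all illustrative rather than a prescribed design.

```python
# Minimal sketch: expose a custom LDA-based similarity model as a web service.
# Assumptions: Flask and scikit-learn installed; corpus and route are toys.
from flask import Flask, request, jsonify
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
import numpy as np

app = Flask(__name__)

# Toy corpus standing in for your document store.
DOCS = ["how to reset my password", "billing and invoice questions",
        "change delivery address", "refund for a damaged item"]

vectorizer = CountVectorizer()
doc_term = vectorizer.fit_transform(DOCS)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(doc_term)  # topic distribution per document

@app.route("/similar", methods=["POST"])
def similar():
    # Project the incoming query into topic space and return the closest document.
    query = request.get_json()["text"]
    q_topics = lda.transform(vectorizer.transform([query]))
    scores = (doc_topics @ q_topics.T).ravel()  # similarity on topic mixtures
    best = int(np.argmax(scores))
    return jsonify({"document": DOCS[best], "score": float(scores[best])})

if __name__ == "__main__":
    app.run()
```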
On the other hand, early adoption can be somewhat of a curse. In 2011, many companies and individuals, myself included, invested a lot of time and money into Google+, touted at the time to be bigger than Facebook. It acquired over 10 million new users within the first two weeks of launch, and things were looking positive. Many companies doubled down on growing a community within the platform, hopeful of using it as a new and growing acquisition channel, but things didn't exactly pan out that way.
However, chatbots are not just limited to answering queries and providing basic knowledge. They can work as an aid to the teacher/instructor by identifying spelling and grammatical mistakes with precision, checking homework, assigning projects, and, more importantly, keeping track of students' progress and achievements. A human can only do so much, whereas a bot has virtually an infinite capacity to store and analyse all data.
1-800-Flowers’ 2017 first quarter results showed total revenues had increased 6.3 percent to $165.8 million, with the Company’s Gourmet Food and Gift Baskets business as a significant contributor. CEO Chris McCann stated, “…our Fannie May business recorded positive same store sales as well as solid eCommerce growth, reflecting the success of the initiatives we have implemented to enhance its performance.” While McCann doesn’t go into specifics, we assume that initiatives include the implementation of GWYN, which also seems to be supported by CB Insights’ finding: 70% of customers ordering through the chat bot were new 1-800-Flowers customers as of June 2016.
This is where most applications of NLP struggle, and not just chatbots. Any system or application that relies upon a machine’s ability to parse human speech is likely to struggle with the complexities inherent in elements of speech such as metaphors and similes. Despite these considerable limitations, chatbots are becoming increasingly sophisticated, responsive, and more “natural.”

For as long as I can remember, email has been a fundamentally important channel for a large majority of businesses. The ability to market products directly through a channel that scales up to an incredibly high ceiling is very attractive. The only problem is that it's costing more and more money to acquire email addresses from potential customers, and the engagement from email is getting worse and worse.

Your first question is how much of it does she want? 1 litre? 500ml? 200? She tells you she wants a 1 litre Tropicana 100% Orange Juice. Now you know that regular Tropicana is easily available, but 100% is hard to come by, so you call up a few stores beforehand to see where it’s available. You find one store that’s pretty close by, so you go back to your mother and tell her you found what she wanted. It’s $2, maybe $3, and after asking her for the money, you go on your way.


Need a Facebook bot? Well, look no further, as Chatfuel makes it easy for you to create your own Facebook and Telegram chatbot without any coding experience necessary. It works by letting users link to external sources through plugins. Eventually, the platform hopes to open itself to third-party plugins, so anyone can contribute their own plugins and have others benefit from them.
Of course, each messaging app has its own fine print for bots. For example, on Messenger a brand can send a message only if the user prompted the conversation, and if the user doesn't find value and opt to receive future notifications within those first 24 hours, there's no future communication. But to be honest, that's not enough to eradicate the threat of bad bots.
I will not go into the details of extracting each feature value here; they can be found in the rasa-core documentation linked above. So, assuming we have extracted all the required feature values from the sample conversations in the required format, we can then train an AI model such as an LSTM followed by a softmax layer to predict the next_action. Referring to the above figure, this is what the ‘dialogue management’ component does. Why is an LSTM more appropriate? As mentioned above, we want our model to be context-aware and to look back into the conversational history to predict the next_action. This is akin to a time-series model (please see my other LSTM time-series article) and hence is best captured in the memory state of the LSTM. The amount of conversational history we want to look back over can be a configurable hyper-parameter of the model.
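A minimal standalone sketch of such a model is shown below (this is plain Keras, not rasa-core's own API); the history length, feature size, number of actions, and the random training data are illustrative hyper-parameters and placeholders.

```python
# Minimal sketch: LSTM + softmax over featurized conversation history
# to predict the next_action. Sizes and data are illustrative.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

MAX_HISTORY = 5      # how many previous turns the model can look back over
N_FEATURES = 30      # size of the featurized state for a single turn
N_ACTIONS = 10       # number of possible next_action labels

model = Sequential([
    # The LSTM consumes the sequence of featurized turns (the conversation history).
    LSTM(64, input_shape=(MAX_HISTORY, N_FEATURES)),
    # Softmax over candidate actions gives a probability for each next_action.
    Dense(N_ACTIONS, activation="softmax"),
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])

# X: (num_dialogues, MAX_HISTORY, N_FEATURES); y: one-hot (num_dialogues, N_ACTIONS)
X = np.random.rand(100, MAX_HISTORY, N_FEATURES)
y = np.eye(N_ACTIONS)[np.random.randint(0, N_ACTIONS, 100)]
model.fit(X, y, epochs=5, verbose=0)

next_action_probs = model.predict(X[:1])  # probabilities for the next action
```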
“Utility gets something done following a prompt. At a higher level the more entertainment-related chatbots are able to answer all questions and get things done. Siri and Cortana you can have small talk with, as well as getting things done, so they are much harder to build. They took years and years of giant companies’ efforts. Different companies that don’t have those resources, like Facebook, will build more constrained utility bots.”
“A very common request that we get is people want to practice conversation,” said Duolingo's co-founder and CEO, Luis von Ahn. The company originally tried pairing up non-native speakers with native speakers for practice sessions, but according to von Ahn, "about three-quarters of the people we try it with are very embarrassed to speak in a foreign language with another person."

This importance is reinforced by Jacqueline Payne, Customer Support Manager at Paperclip Digital, who says ‘Customer service isn’t a buzzword. But too many businesses treat it like it is. As a viable avenue from which to lower customer acquisition costs and cultivate a loyal customer base, chat bots can play a pivotal role in driving business growth.’


Nowadays, a large majority of high-tech banking organizations are looking to integrate automated AI-based solutions such as chatbots into their customer service in order to provide faster and cheaper assistance to clients who are becoming increasingly technodexterous. In particular, chatbots can efficiently conduct a dialogue, usually substituting for other communication tools such as email, phone, or SMS. In banking, their major applications are quick customer service answering common requests, and transactional support.