Reduce costs: The potential to reduce costs is one of the clearest benefits of using a chatbot. A chatbot can provide a new first line of support, supplement support during peak periods or offer an additional support option. In all of these cases, employing a chatbot can help reduce the number of users who need to speak with a human. You can avoid scaling up your staff or offering human support around the clock.
There was a time when even some of the most prominent minds believed that a machine could not be as intelligent as humans, but the Loebner Prize competitions, which began in 1991, started to prove otherwise. The competition awards the best-performing chatbot, the one that most convinces the judges that it displays some form of intelligence. But despite the tremendous development of chatbots and their ability to exhibit seemingly intelligent behavior, they still lack the accuracy to understand the context of a question in every situation, every time.
Throughout our preview journey over the past two years, we have learned a lot from interacting with thousands of customers undergoing digital transformation. We highlighted some of our customer stories (such as UPS, Equadex, and more) in our general availability announcement. This post covers conversational AI in a nutshell using Azure Bot Service and LUIS, summarizes what we've learned so far, and dives into the new capabilities. We will also show how easy it is to get started in building a conversational bot with natural language.
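As a taste of what that looks like in code, here is a minimal sketch (Node.js, Bot Framework SDK v4 with the botbuilder-ai package) that routes each incoming message through LUIS and replies with the detected intent. The application ID, endpoint key, and region below are placeholders you would replace with your own LUIS app's values.

```js
const { ActivityHandler } = require('botbuilder');
const { LuisRecognizer } = require('botbuilder-ai');

// Placeholder credentials; substitute your own LUIS application settings.
const recognizer = new LuisRecognizer({
  applicationId: '<your-luis-app-id>',
  endpointKey: '<your-luis-endpoint-key>',
  endpoint: 'https://<your-region>.api.cognitive.microsoft.com'
});

class NaturalLanguageBot extends ActivityHandler {
  constructor() {
    super();
    this.onMessage(async (context, next) => {
      // Ask LUIS which intent the user's utterance maps to.
      const result = await recognizer.recognize(context);
      const intent = LuisRecognizer.topIntent(result);
      await context.sendActivity(`Detected intent: ${intent}`);
      await next();
    });
  }
}

module.exports.NaturalLanguageBot = NaturalLanguageBot;
```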
By 2022, task-oriented dialog agents/chatbots will take your coffee order, help with tech support problems, and recommend restaurants when you travel. They will be effective, if boring. What do I see beyond 2022? I have no idea. Amara's law says that we tend to overestimate technology in the short term while underestimating it in the long run. I hope I am right about the short term but wrong about AI in 2022 and beyond! Who would object to a Starbucks barista-bot that can chat about the weather and crack a good joke?
Polly may be a business-focused application, but the chatbot is designed to improve workplace happiness. Using surveys and feedback, managers can keep track of how effectively their teams are working and address problems before they escalate. This means not only that organizations will run more productively, but also that workers will be happier in their jobs.
Conversational bots “live” online and give customers a familiar experience, similar to engaging an employee or a live agent, and they can offer that experience in higher volumes. Conversational bots offer scaling—or the capability to perform equally well under an expanding workload—in ways that humans can't, helping businesses reach customers in a way they couldn't before. For one, businesses have created a 24/7/365 online presence through conversational bots.

2. Flow-based: these work on user interaction with buttons and text. If you have used Matthew's chatbot, that is a flow-based chatbot. The chatbot asks a question and then offers options in the form of buttons (Matthew's has a yes/no option). These are more limited, but they let you tightly steer the conversation and make sure your users don't stray off the path; a minimal sketch of such a flow follows below.
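The sketch below illustrates the idea with a tiny hand-rolled flow graph: each node asks a question and maps button choices to the next node. The node names, prompts, and options are hypothetical; a production bot would typically express this with a dialog library rather than a plain object.

```js
// Hypothetical flow definition: each node has a prompt and the buttons it offers.
const flow = {
  start: {
    prompt: 'Do you need help with an existing order?',
    options: { Yes: 'orderHelp', No: 'browse' }
  },
  orderHelp: {
    prompt: 'Is the order late or damaged?',
    options: { Late: 'lateOrder', Damaged: 'damagedOrder' }
  },
  browse:       { prompt: 'Here is our catalogue. Anything else?', options: {} },
  lateOrder:    { prompt: 'We will check the shipment status for you.', options: {} },
  damagedOrder: { prompt: 'We will arrange a replacement.', options: {} }
};

// Advance the conversation: given the current node and the button the user
// pressed, return the next prompt and its buttons, or null if the flow ends.
function nextStep(currentNodeId, choice) {
  const nextId = flow[currentNodeId].options[choice];
  if (!nextId) return null; // the chosen button closes the flow
  const node = flow[nextId];
  return { nodeId: nextId, prompt: node.prompt, buttons: Object.keys(node.options) };
}

console.log(nextStep('start', 'Yes'));
// -> { nodeId: 'orderHelp', prompt: 'Is the order late or damaged?', buttons: ['Late', 'Damaged'] }
```

Because every turn only offers the buttons its node defines, the user literally cannot wander outside the paths you have designed.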
We then ran a second test with a very specific topic aimed at answering very specific questions that a small segment of their audience was interested in. There, the engagement was much higher (97% open rate, 52% click-through rate on average over the duration of the test). Interestingly, drop-off also fell sharply: at the end of this test, only 0.29% of the users had unsubscribed.

ETL. The bot relies on information and knowledge extracted from the raw data by an ETL process in the backend. This data might be structured (SQL database), semi-structured (CRM system, FAQs), or unstructured (Word documents, PDFs, web logs). An ETL subsystem extracts the data on a fixed schedule. The content is transformed and enriched, then loaded into an intermediary data store, such as Cosmos DB or Azure Blob Storage.
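To make the shape of that backend job concrete, here is a rough sketch of the extract-transform-load step described above. The helpers `fetchFaqRows` and `saveToKnowledgeStore` are hypothetical stand-ins for, say, a SQL query and a Cosmos DB or Blob Storage client; the scheduling itself would live outside this function (a timer-triggered job, for example).

```js
// Sketch of one scheduled ETL run: extract raw records, transform/enrich them
// into Q&A documents, and load them into the intermediary data store.
async function runEtlJob(fetchFaqRows, saveToKnowledgeStore) {
  // Extract: pull raw rows from the structured or semi-structured source.
  const rows = await fetchFaqRows();

  // Transform: normalise the text and enrich each record with simple metadata.
  const documents = rows.map(row => ({
    id: row.id,
    question: row.question.trim(),
    answer: row.answer.trim(),
    keywords: row.question.toLowerCase().split(/\W+/).filter(Boolean),
    loadedAt: new Date().toISOString()
  }));

  // Load: write the enriched documents into the intermediary store
  // (e.g. Cosmos DB or Azure Blob Storage) that the bot queries at runtime.
  await saveToKnowledgeStore(documents);
  return documents.length;
}
```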
Do the nature of our services and the size of our customer base warrant an investment in a more efficient and automated customer service response? How can we offer a more streamlined experience without (necessarily) increasing costly human resources? Amtrak's website receives over 375,000 daily visitors, and the company wanted a solution that provided users with instant access to online self-service.
All of these conversational technologies employ natural-language-recognition capabilities to discern what the user is saying, and other sophisticated intelligence tools to determine what he or she truly needs to know. These technologies are beginning to use machine learning to learn from interactions and improve the resulting recommendations and responses.
For every question or instruction given to the conversational bot, a specific pattern must exist in the database to provide a suitable response. Where several combinations of patterns are available, a hierarchical pattern is created. In these cases, algorithms are used to reduce the classifiers and generate a more manageable structure. This is the "reductionist" approach: to reach a simpler solution, it reduces the problem.
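The sketch below illustrates the idea of that hierarchy with made-up topics, patterns, and responses: instead of scanning one flat list of patterns, the matcher first narrows to a topic group and only then checks that group's more specific patterns.

```js
// Hypothetical hierarchical pattern base: patterns are grouped by topic so the
// matcher searches a small, relevant set instead of every pattern at once.
const knowledgeBase = {
  billing: [
    { pattern: /refund|money back/i, response: 'Refunds are processed within 5 days.' },
    { pattern: /invoice|receipt/i,   response: 'You can download invoices from your account page.' }
  ],
  shipping: [
    { pattern: /track|where is my order/i, response: 'Use your order number to track the shipment.' },
    { pattern: /delivery time|how long/i,  response: 'Standard delivery takes 3-5 business days.' }
  ]
};

function reply(message) {
  // Walk the topic groups in order; within each group, test only that group's patterns.
  for (const [topic, patterns] of Object.entries(knowledgeBase)) {
    const hit = patterns.find(p => p.pattern.test(message));
    if (hit) return { topic, response: hit.response };
  }
  // No pattern matched: fall back to a clarifying prompt.
  return { topic: null, response: 'Sorry, I did not understand. Could you rephrase?' };
}

console.log(reply('Where is my order?').response);
// -> 'Use your order number to track the shipment.'
```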
To get started, you can build your bot online using the Azure Bot Service, selecting from the available C# and Node.js templates. As your bot gets more sophisticated, however, you will need to create your bot locally and then deploy it to the web. Choose an IDE, such as Visual Studio or Visual Studio Code, and a programming language; SDKs are available for C# and Node.js.
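For the local route, the sketch below shows roughly what a minimal Node.js bot looks like with the Bot Framework SDK v4 and restify: an echo bot wired to an adapter and an HTTP endpoint. The app ID and password come from environment variables and are placeholders here; 3978 is the conventional local port for Bot Framework bots.

```js
const restify = require('restify');
const { BotFrameworkAdapter, ActivityHandler } = require('botbuilder');

// The simplest possible bot: echo back whatever the user types.
class EchoBot extends ActivityHandler {
  constructor() {
    super();
    this.onMessage(async (context, next) => {
      await context.sendActivity(`You said: ${context.activity.text}`);
      await next();
    });
  }
}

// Adapter credentials are read from the environment (placeholders during local testing).
const adapter = new BotFrameworkAdapter({
  appId: process.env.MicrosoftAppId,
  appPassword: process.env.MicrosoftAppPassword
});
const bot = new EchoBot();

// Expose the standard /api/messages endpoint that channels post activities to.
const server = restify.createServer();
server.post('/api/messages', (req, res) => {
  adapter.processActivity(req, res, async (context) => {
    await bot.run(context);
  });
});
server.listen(process.env.PORT || 3978, () => {
  console.log(`Bot listening on ${server.url}`);
});
```

You can exercise this locally with the Bot Framework Emulator before deploying it to Azure.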
Interface designers have come to appreciate that humans' readiness to interpret computer output as genuinely conversational—even when it is actually based on rather simple pattern-matching—can be exploited for useful purposes. Most people prefer to engage with programs that are human-like, and this gives chatbot-style techniques a potentially useful role in interactive systems that need to elicit information from users, as long as that information is relatively straightforward and falls into predictable categories. Thus, for example, online help systems can usefully employ chatbot techniques to identify the area of help that users require, potentially providing a "friendlier" interface than a more formal search or menu system. This sort of usage holds the prospect of moving chatbot technology from Weizenbaum's "shelf ... reserved for curios" to that marked "genuinely useful computational methods".