Google, the company with perhaps the greatest artificial intelligence chops and the biggest collection of data about you — both of which power effective bots — has been behind here. But it is almost certainly plotting ways to catch up. Google Now, its personal assistant system built within Android, serves many functions of the new wave of bots, but has had hiccups. The company is reportedly working on a chatbot that will live in a mobile messaging product and is experimenting with ways to integrate Now deeper with search.

Getting the remaining values (information the user provided in response to the bot's previous questions, the bot's previous action, the results of API calls, etc.) is a little trickier, and this is where the dialogue manager component takes over. These feature values need to be extracted from training data that the bot designer defines in the form of sample conversations between a user and the bot. These sample conversations should be prepared so that they capture most of the possible conversational flows, with the designer playing both the user and the bot.
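To make this concrete, here is a minimal sketch (in Python, with invented slot and action names) of what one such sample conversation might look like as training data, and how a dialogue manager could derive its feature values, namely the filled slots and the bot's previous action, from it:

```python
# Hypothetical illustration only: a sample conversation ("story") the bot
# designer might write, and the dialogue-state features a dialogue manager
# could derive from it. Names and structure are invented for this sketch.

sample_story = [
    {"speaker": "user", "intent": "book_table", "entities": {}},
    {"speaker": "bot",  "action": "ask_party_size"},
    {"speaker": "user", "intent": "inform", "entities": {"party_size": 4}},
    {"speaker": "bot",  "action": "ask_time"},
    {"speaker": "user", "intent": "inform", "entities": {"time": "7pm"}},
    {"speaker": "bot",  "action": "call_booking_api"},
]

def extract_state_features(story, turn_index):
    """Build the feature values the dialogue manager sees at a given turn:
    the slots filled so far and the bot's previous action."""
    slots, last_bot_action = {}, None
    for event in story[:turn_index]:
        if event["speaker"] == "user":
            slots.update(event.get("entities", {}))
        else:
            last_bot_action = event["action"]
    return {"filled_slots": slots, "previous_bot_action": last_bot_action}

print(extract_state_features(sample_story, 4))
# {'filled_slots': {'party_size': 4}, 'previous_bot_action': 'ask_time'}
```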
Aside from being practical and time-saving, chatbots guarantee a huge reduction in support costs. According to IBM, the influence of chatbots on CRM is staggering. They deliver a 99 percent improvement in response times, cutting resolution time from 38 hours to five minutes. They also drive a massive drop in cost per query, from $15-$200 with human agents to $1 with virtual agents. Finally, virtual agents can handle an average of 30,000+ consumers per month.
Back in April, National Geographic launched a Facebook Messenger bot to promote their new show about the theoretical physicist's work and personal life. Developed by 360i, the charismatic Einstein bot reintroduced audiences to the scientific figure in a more intimate setting, inviting them to learn about the lesser-known aspects of his life through a friendly, natural conversation with the man himself.
However, as irresistible as this story was to news outlets, Facebook’s engineers didn’t pull the plug on the experiment out of fear that the bots were somehow secretly colluding to usurp their meatbag overlords and usher in a new age of machine dominance. They ended the experiment because, once the bots had deviated far enough from acceptable English language parameters, the data gleaned from the conversational aspects of the test was of limited value.
Chatbots can reply instantly to any question. The waiting time is ‘virtually’ zero (see what I did there?). Even if a real person eventually shows up to fix the issue, the customer is already engaged in a conversation, which helps you build trust. The problem can be better diagnosed, and the chatbot can perform some routine checks with the user. This saves time for both the customer and the support agent, which is a lot better than just waiting idly for a representative to arrive.
It didn’t take long, however, for Turing’s headaches to begin. The BabyQ bot drew the ire of Chinese officials by speaking ill of the Communist Party. In the exchange seen in the screenshot above, one user commented, “Long Live the Communist Party!” In response, BabyQ asked the user, “Do you think that such a corrupt and incompetent political regime can live forever?”
“We believe that you don’t need to know how to program to build a bot, that’s what inspired us at Chatfuel a year ago when we started bot builder. We noticed bots becoming hyper-local, i.e. a bot for a soccer team to keep in touch with fans or a small art community bot. Bots are efficient and when you let anyone create them easily magic happens.” — Dmitrii Dumik, Founder of Chatfuel
“Bots go bust” — so went the first of Bradford Cross’s five AI startup predictions for 2017, countering some of the recent excitement around conversational AI (see, for example, O’Reilly’s “Why 2016 is shaping up to be the Year of the Bot”). The main argument was that social intelligence, rather than artificial intelligence, is what is lacking, rendering bots utilitarian and boring.

It won’t be an easy march, though, once we get to the nitty-gritty details. For example, I heard through the grapevine that when Starbucks looked at the voice data they collected from customer orders, they found that there are a few million unique ways to order. (For those in the field, I’m talking about unique user utterances.) This is to be expected given the wild combinations of latte vs mocha, dairy vs soy, grande vs trenta, extra-hot vs iced, room vs no-room, for here vs to-go, snack variety, spoken accent diversity, etc. The AI practitioner will soon curse all these dimensions before taking a deep learning breath and getting to work. I feel, though, that given practically unlimited data, deep learning is now good enough to overcome this problem, and it is only a matter of a couple of years until we see these TODA solutions deployed. One technique to watch is Generative Adversarial Nets (GANs). Roughly speaking, a GAN engages in an iterative game of counterfeiting real stuff, getting caught by the police neural network, improving its counterfeiting skill, and rinse-and-repeating until, given enough data and iterations, it can pass as your Starbucks order-taker.
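To make that analogy slightly more concrete, here is a toy sketch of the counterfeiter-versus-police training loop in PyTorch, run on synthetic one-dimensional data rather than real order utterances; the network sizes and hyperparameters are arbitrary choices for illustration, not a production TODA model.

```python
# Toy GAN sketch on synthetic 1-D data: G ("counterfeiter") tries to produce
# samples that D ("police") cannot distinguish from the real distribution.
import torch
import torch.nn as nn

G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1), nn.Sigmoid())
loss = nn.BCELoss()
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)

for step in range(2000):
    real = torch.randn(64, 1) * 0.5 + 3.0      # "real" samples drawn from N(3, 0.5)
    fake = G(torch.randn(64, 8))               # generator's counterfeits

    # Police turn: learn to tell real from fake.
    d_loss = loss(D(real), torch.ones(64, 1)) + loss(D(fake.detach()), torch.zeros(64, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Counterfeiter turn: try to fool the police.
    g_loss = loss(D(fake), torch.ones(64, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

print(G(torch.randn(1000, 8)).mean().item())   # should drift toward ~3.0
```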


Enter Roof Ai, a chatbot that helps real-estate marketers automate engaging potential leads and assigning them to agents via social media. The bot identifies potential leads via Facebook, then responds almost instantaneously in a friendly, helpful, and conversational tone that closely resembles that of a real person. Based on user input, Roof Ai prompts potential leads to provide a little more information before automatically assigning the lead to a sales agent.
Originally purely text-based, chatbots have evolved with ever-improving speech recognition and speech synthesis, and now offer fully spoken dialogues, or a mix of the two, alongside pure text dialogues. Other media can also be used, for example images and videos. Particularly with the heavy use of mobile devices (smartphones, wearables), this way of using chatbots will continue to grow (as of Nov. 2016).[10] As they continue to improve, chatbots are no longer limited to a few narrow subject areas (weather forecasts, news, etc.) but enable extended dialogues and services for the user. In this way they evolve into intelligent personal assistants.
For every question or instruction input to the conversational bot, a matching pattern must exist in the database to provide a suitable response. Where several combinations of patterns are available, a hierarchical pattern is created. In these cases, algorithms are used to reduce the number of classifiers and generate a structure that is more manageable. This is the “reductionist” approach: to arrive at a simplified solution, it reduces the problem.
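As a rough illustration (the patterns and answers below are invented, not taken from any particular engine), one way to approximate such a hierarchy is to try patterns from most specific to most general, so that many candidate matches reduce to a single manageable ordering:

```python
# Minimal sketch: patterns are tried from most specific to most general,
# which is one simple way to reduce many possible matches to one response.
import re

PATTERNS = [  # ordered: specific first, generic fallback last
    (re.compile(r"\b(reset|forgot).*password\b", re.I), "To reset your password, use the 'Forgot password' link."),
    (re.compile(r"\bpassword\b", re.I),                 "Do you need help with your password?"),
    (re.compile(r"\b(hi|hello|hey)\b", re.I),           "Hello! How can I help you today?"),
    (re.compile(r".*"),                                  "Sorry, I didn't understand that. Could you rephrase?"),
]

def respond(message: str) -> str:
    for pattern, answer in PATTERNS:
        if pattern.search(message):
            return answer

print(respond("I forgot my password"))   # the most specific pattern wins
print(respond("hello there"))
```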
Kunze recognises that chatbots are the vogue subject right now, saying: “We are in a hype cycle, and rising tides from entrants like Microsoft and Facebook have raised all ships. Pandorabots typically adds up to 2,000 developers monthly. In the past few weeks, we've seen a 275 percent spike in sign-ups, and an influx of interest from big, big brands.”
From any point in the conversation, the bot needs to know where to go next. If a user writes, “I’m looking for new pants,” the bot might ask, “For a man or woman?” The user may type, “For a woman.” Does the bot then ask about size, style, brand, or color? What if one of those modifiers was already specified in the query? The possibilities are endless, and every one of them has to be mapped with rules.
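One hedged sketch of such a rule mapping: track which slots the shopper has already specified and ask for the first missing one. The slot names and questions below are invented for illustration.

```python
# Hypothetical slot list for the "pants" example above.
SLOT_QUESTIONS = [
    ("gender", "For a man or woman?"),
    ("size",   "What size do you need?"),
    ("style",  "Any particular style: jeans, chinos, or joggers?"),
    ("color",  "Do you have a color in mind?"),
]

def next_question(filled_slots: dict) -> str:
    """Ask about the first slot the user has not yet specified."""
    for slot, question in SLOT_QUESTIONS:
        if slot not in filled_slots:
            return question
    return "Great, let me find some options for you."

# "I'm looking for new pants" fills no slots yet:
print(next_question({}))                                     # For a man or woman?
# "black pants for a woman" already specifies gender and color:
print(next_question({"gender": "woman", "color": "black"}))  # What size do you need?
```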
The NLP system has a wide and varied lexicon to better understand the complexities of natural language. Using an algorithmic process, it determines what has been asked and uses decision trees or slot-based algorithms that go through a predefined conversation path. After it understands the question, the computer then finds the best answer and provides it in the natural language of the user.
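For the decision-tree variant, a predefined conversation path can be sketched as a nested structure that is walked one user reply at a time; the support topic below is made up purely for illustration.

```python
# A small decision tree as a nested dict, sketching a predefined
# conversation path walked by the user's replies.
DECISION_TREE = {
    "question": "Is the problem with your internet or your TV service?",
    "answers": {
        "internet": {
            "question": "Is the router's power light on?",
            "answers": {
                "yes": {"answer": "Try restarting the router and waiting two minutes."},
                "no":  {"answer": "Please check that the power cable is plugged in."},
            },
        },
        "tv": {"answer": "Let me connect you with a TV support agent."},
    },
}

def walk(node, replies):
    """Follow the user's replies down the tree until an answer is reached."""
    for reply in replies:
        node = node["answers"][reply]
    return node.get("answer") or node["question"]

print(walk(DECISION_TREE, []))                  # first question
print(walk(DECISION_TREE, ["internet", "no"]))  # leaf answer
```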
It’s a chatbot: for simplicity, this article assumes the user types text and the bot responds with an appropriate message, also in text. (We will therefore not be concerned with aspects like ASR, speech recognition, speech-to-text, and text-to-speech; the architecture below can be extended with these components as required.)
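Under that assumption, the whole bot reduces to a text-in, text-out loop like the minimal sketch below, where `generate_reply` is a placeholder standing in for the NLU and dialogue-manager pipeline discussed in this article.

```python
# Minimal text-in / text-out loop matching the assumption above:
# no speech components, just typed input and a typed reply.

def generate_reply(message: str) -> str:
    # Placeholder logic; a real bot would call its NLU and dialogue manager here.
    return "You said: " + message

def chat_loop():
    print("Bot ready. Type 'quit' to exit.")
    while True:
        message = input("You: ")
        if message.strip().lower() == "quit":
            break
        print("Bot:", generate_reply(message))

if __name__ == "__main__":
    chat_loop()
```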

In a traditional application, the user interface (UI) is a series of screens. A single app or website can use one or more screens as needed to exchange information with the user. Most applications start with a main screen where users initially land and provide navigation that leads to other screens for various functions like starting a new order, browsing products, or looking for help.


As IBM elaborates: “The front-end app you develop will interact with an AI application. That AI application — usually a hosted service — is the component that interprets user data, directs the flow of the conversation and gathers the information needed for responses. You can then implement the business logic and any other components needed to enable conversations and deliver results.”
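A minimal sketch of that split might look as follows: a thin front end posts the user's message to a hosted AI service, then local business logic acts on the returned interpretation. The URL, payload shape, and intent names are hypothetical, not any specific vendor's API.

```python
# Sketch only: front end -> hosted AI service -> business logic.
import requests

AI_SERVICE_URL = "https://example.com/ai/interpret"  # hypothetical hosted service

def handle_user_message(text: str, session_id: str) -> str:
    # 1. The hosted AI service interprets the text and tracks the conversation.
    resp = requests.post(AI_SERVICE_URL, json={"text": text, "session": session_id})
    resp.raise_for_status()
    result = resp.json()   # e.g. {"intent": "check_order", "entities": {...}}

    # 2. Business logic decides what to do with the interpretation.
    if result.get("intent") == "check_order":
        order_id = result.get("entities", {}).get("order_id")
        return f"Order {order_id} is on its way." if order_id else "Which order number?"
    return result.get("reply", "Sorry, I didn't catch that.")
```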
Most chatbots draw on a prebuilt database, the so-called knowledge base of answers and recognition patterns. The program first breaks the submitted question into parts and processes them according to predefined rules. In doing so, spellings can be harmonised (upper and lower case, umlauts, etc.), punctuation interpreted, and typos corrected (preprocessing). In the second step, the actual recognition of the question takes place. This is usually handled via recognition patterns; some chatbots additionally allow different pattern matchers to be nested via so-called macros. If an answer matching the question is recognised, it can still be adapted (for example, script-computed data can be inserted: “It is 37 °C in Ulm today.”). This step is called postprocessing. The resulting answer is then output. Modern commercial chatbot programs additionally allow direct access to the entire processing pipeline via built-in scripting languages and programming interfaces.
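A rough sketch of that preprocessing, recognition, and postprocessing flow (with an invented pattern and a stand-in weather lookup) could look like this:

```python
# Sketch of preprocessing -> pattern recognition -> postprocessing.
import re

def preprocess(text: str) -> str:
    """Normalise case, trailing punctuation, and spacing."""
    text = text.lower().strip().rstrip("?!.")
    return re.sub(r"\s+", " ", text)

PATTERNS = [
    (re.compile(r"how (warm|hot|cold) is it in (?P<city>\w+)"), "weather"),
]

def get_temperature(city: str) -> int:
    return 37   # stand-in for a real weather lookup

def answer(question: str) -> str:
    cleaned = preprocess(question)
    for pattern, kind in PATTERNS:
        match = pattern.search(cleaned)
        if match and kind == "weather":
            city = match.group("city").title()
            # Postprocessing: insert computed data into the canned answer.
            return f"It is {get_temperature(city)} °C in {city} today."
    return "I'm afraid I don't have an answer to that."

print(answer("How warm is it in Ulm?"))   # It is 37 °C in Ulm today.
```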
If AI struggles with fourth-grade science question answering, should AI be expected to hold an adult-level, open-ended chit-chat about politics, entertainment, and the weather? It is thus encouraging to see that Microsoft’s Satya Nadella did not give up on Tay after its debacle, and that Amazon’s Jeff Bezos is sponsoring an Alexa social chatbot competition. I love the quote below from Jeff:

Chatbots are gaining popularity. Numerous chatbots are being developed and launched on different chat platforms. Multiple chatbot development platforms, such as Dialogflow, Chatfuel, ManyChat, IBM Watson, Amazon Lex, and the Microsoft Bot Framework, are available with which you can easily create your chatbots. If you are new to the chatbot development field and want to jump…
Beyond users, bots must also please the messaging apps themselves. Take Facebook Messenger. Executives have confirmed that advertisements within Discover — their hub for finding new bots to engage with — will be the main way Messenger monetizes its 1.3 billion monthly active users. If standing out among the 100,000 other bots on the platform wasn't difficult enough, we can assume Messenger will only feature bots that don't detract people from the platform.
When one dialog invokes another, the Bot Builder adds the new dialog to the top of the dialog stack. The dialog that is on top of the stack is in control of the conversation. Every new message sent by the user will be subject to processing by that dialog until it either closes or redirects to another dialog. When a dialog closes, it's removed from the stack, and the previous dialog in the stack assumes control of the conversation.
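The stack behaviour can be illustrated with a plain Python class; this is a stand-in for the idea, not the actual Bot Builder SDK API.

```python
# Illustrative dialog stack: the dialog on top of the stack handles
# each incoming message until it closes or starts another dialog.

class DialogStack:
    def __init__(self):
        self.stack = []                 # last element = dialog in control

    def begin(self, dialog):
        self.stack.append(dialog)       # invoking a dialog pushes it on top

    def end(self):
        self.stack.pop()                # closing a dialog returns control below

    def on_message(self, message):
        if not self.stack:
            return "No active dialog."
        return self.stack[-1].handle(message, self)   # top dialog processes input

class RootDialog:
    def handle(self, message, stack):
        if message == "order pizza":
            stack.begin(OrderDialog())
            return "Sure! What size?"
        return "Say 'order pizza' to start an order."

class OrderDialog:
    def handle(self, message, stack):
        stack.end()                     # done: pop and hand control back to the root
        return f"Ordered a {message} pizza!"

stack = DialogStack()
stack.begin(RootDialog())
print(stack.on_message("order pizza"))  # Sure! What size?
print(stack.on_message("large"))        # Ordered a large pizza!
print(stack.on_message("hello"))        # Say 'order pizza' to start an order.
```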
One of the first stepping stones to this future are AI-powered messaging solutions, or conversational bots. A conversational bot is a computer program that works automatically and is skilled in communicating through various digital media—including intelligent virtual agents, organizations' apps, organizations' websites, social platforms and messenger platforms. Users can interact with such bots, using voice or text, to access information, complete tasks or execute transactions. 
Three main reasons are often cited for this reluctance. The first is the human side: they think users will be reluctant to engage with a bot. The other two have more to do with bots’ expected performance: there is skepticism that bots will be able to appropriately incorporate history and context to create personalized experiences, and a belief that they won’t be able to adequately understand human input.
Psychologist and Scientific American: Mind contributing editor Robert Epstein reports how he was initially fooled by a chatterbot posing as an attractive girl in a personal ad he answered on a dating website. In the ad, the girl portrayed herself as being in Southern California and then soon revealed, in poor English, that she was actually in Russia. He became suspicious after a couple of months of email exchanges, sent her an email test of gibberish, and she still replied in general terms. The dating website is not named. Robert Epstein, “From Russia With Love: How I got fooled (and somewhat humiliated) by a computer”, Scientific American: Mind, October–November 2007, pages 16–17. Also available online.

I argued that it is super hard to scale a one-trick TODA into a general assistant that helps the user get things done across multiple tasks. An intelligent assistant is arguably also expected to hold an informal chit-chat with the user, and it is here that we are staring into perhaps the biggest challenge of AI. Observe how Samantha introduces herself to Joaquin Phoenix’s Ted in the clip below:
When considering potential uses, first assess the impact on resources. There are two options here: replacement or empowerment. Replacement is clearly easier as you don’t need to consider integration with existing processes and you can build from scratch. Empowerment enhances an existing process by making it more flexible, accommodating, accessible and simple for users.
The process of building, testing and deploying chatbots can be done on cloud-based chatbot development platforms[39] offered by cloud Platform as a Service (PaaS) providers such as Yekaliva, Oracle Cloud Platform, SnatchBot[40] and IBM Watson.[41] [42] [43] These cloud platforms provide Natural Language Processing, Artificial Intelligence and Mobile Backend as a Service for chatbot development.
Another option is to integrate your own custom AI service. This approach is more complex, but gives you complete flexibility in terms of the machine learning algorithm, training, and model. For example, you could implement your own topic modeling and use an algorithm such as LDA (latent Dirichlet allocation) to find similar or relevant documents. A good approach is to expose your custom AI solution as a web service endpoint, and call the endpoint from the core bot logic. The web service could be hosted in App Service or in a cluster of VMs. Azure Machine Learning provides a number of services and libraries to assist you in training and deploying your models.
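As one hedged sketch of that approach, the snippet below trains an LDA topic model with scikit-learn on a tiny made-up corpus and exposes document similarity through a Flask endpoint that the core bot logic could call; the corpus, route, and payload shape are assumptions for illustration only.

```python
# Sketch: LDA topic model exposed as a web service endpoint.
from flask import Flask, request, jsonify
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.metrics.pairwise import cosine_similarity

DOCS = [
    "how to reset my account password",
    "track the delivery status of my order",
    "update billing address and payment method",
]

vectorizer = CountVectorizer(stop_words="english")
doc_term = vectorizer.fit_transform(DOCS)
lda = LatentDirichletAllocation(n_components=2, random_state=0)
doc_topics = lda.fit_transform(doc_term)          # topic distribution per document

app = Flask(__name__)

@app.route("/similar", methods=["POST"])
def similar():
    query = request.get_json()["text"]
    query_topics = lda.transform(vectorizer.transform([query]))
    scores = cosine_similarity(query_topics, doc_topics)[0]
    best = int(scores.argmax())
    return jsonify({"document": DOCS[best], "score": float(scores[best])})

if __name__ == "__main__":
    app.run(port=5000)   # the bot's core logic would POST {"text": ...} here
```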
The classic historic early chatbots are ELIZA (1966) and PARRY (1972).[10][11][12][13] More recent notable programs include A.L.I.C.E., Jabberwacky and D.U.D.E (Agence Nationale de la Recherche and CNRS 2006). While ELIZA and PARRY were used exclusively to simulate typed conversation, many chatbots now include functional features such as games and web searching abilities. In 1984, a book called The Policeman's Beard is Half Constructed was published, allegedly written by the chatbot Racter (though the program as released would not have been capable of doing so).[14]