To inspire the next generation of explorers, NASA reaches out to students in schools, community organizations, and public events. A star robotic ambassador is “Rov-E,” a close replica of real NASA Mars rovers. Through Amazon Lex, NASA staff can now easily navigate Rov-E via voice commands -- an effective conversational interface when speaking with large crowds. Multi-turn dialog management capability enables Rov-E "to talk,” answering students’ questions about Mars in an engaging way. Integration with AWS services allows Rov-E to connect and scale with various data sources to retrieve NASA’s Mars exploration information. 
There has been a great deal of controversy about the use of bots in automated trading. Auction website eBay went to court in an attempt to stop a third-party company from using bots to traverse its site looking for bargains; this approach backfired on eBay and attracted the attention of further bots. The UK-based betting exchange Betfair saw such a large amount of traffic coming from bots that it launched a web service API aimed at bot programmers, through which it can actively manage bot interactions.
I will not go into the details of extracting each feature value here; you can refer to the rasa-core documentation linked above. Assuming we have extracted all the required feature values from the sample conversations in the required format, we can train an AI model such as an LSTM followed by a softmax layer to predict the next_action. Referring to the figure above, this is what the ‘dialogue management’ component does. Why is an LSTM more appropriate? As mentioned above, we want our model to be context aware and to look back into the conversational history to predict the next_action. This is akin to a time-series problem (please see my other LSTM time-series article) and so is best captured in the memory state of the LSTM model. How much conversational history the model looks back on can be a configurable hyper-parameter.
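To make this concrete, here is a minimal sketch of such a model in Keras, assuming the featurized conversations have already been prepared. The history length, feature size, and number of actions are hypothetical placeholders, and this is not the actual rasa-core implementation.

```python
# Minimal sketch: LSTM + softmax over possible next actions.
# Shapes are hypothetical: max_history dialogue turns, each turn
# featurized into a vector of length num_features.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

max_history = 5        # how many past turns the model looks back on (hyper-parameter)
num_features = 30      # size of the featurized state for one turn
num_actions = 10       # number of possible next_action labels

model = Sequential([
    LSTM(64, input_shape=(max_history, num_features)),  # memory over the dialogue history
    Dense(num_actions, activation="softmax"),            # probability for each next action
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])

# X: (num_samples, max_history, num_features) featurized conversation windows
# y: (num_samples, num_actions) one-hot encoded next_action labels
X = np.random.rand(100, max_history, num_features)
y = np.eye(num_actions)[np.random.randint(0, num_actions, 100)]
model.fit(X, y, epochs=5, batch_size=16)

# Predict the next action for a new dialogue state
next_action = model.predict(X[:1]).argmax(axis=-1)
```

The memory cell of the LSTM is what lets the prediction depend on the last few turns rather than only the current one, which is exactly the context awareness described above.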
One key reason: The technology that powers bots, artificial intelligence software, is improving dramatically, thanks to heightened interest from key Silicon Valley powers like Facebook and Google. That AI enables computers to process language — and actually converse with humans — in ways they never could before. It came about from unprecedented advancements in software (Google’s Go-beating program, for example) and hardware capabilities.
The process of building a chatbot can be divided into two main tasks: understanding the user's intent and producing the correct answer. The first task involves understanding the user input. In order to properly understand a user input in a free text form, a Natural Language Processing Engine can be used.[36] The second task may involve different approaches depending on the type of the response that the chatbot will generate.
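As a minimal illustration of this two-task split (the training sentences, intent names, and responses below are made up, and the tiny scikit-learn pipeline merely stands in for a real NLP engine), the first task maps free text to an intent and the second maps the intent to an answer:

```python
# Task 1: understand the user's intent with a small text classifier.
# Task 2: produce the correct answer for the detected intent.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

training_sentences = [
    "what is my account balance",
    "show me how much money I have",
    "I want to apply for a loan",
    "help me with a loan application",
]
training_intents = ["check_balance", "check_balance", "apply_loan", "apply_loan"]

intent_classifier = make_pipeline(CountVectorizer(), MultinomialNB())
intent_classifier.fit(training_sentences, training_intents)

responses = {
    "check_balance": "Your balance is available in the Accounts section.",
    "apply_loan": "Sure, let's start your loan application. What amount do you need?",
}

user_input = "could you help me apply for a loan"
intent = intent_classifier.predict([user_input])[0]   # task 1
print(responses[intent])                              # task 2
```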

Your first question is how much of it does she want? 1 litre? 500ml? 200? She tells you she wants a 1 litre Tropicana 100% Orange Juice. Now you know that regular Tropicana is easily available, but 100% is hard to come by, so you call up a few stores beforehand to see where it’s available. You find one store that’s pretty close by, so you go back to your mother and tell her you found what she wanted. It’s $2, maybe $3, and after asking her for the money, you go on your way.


Simple chatbots work from pre-written keywords that they understand. Each of these commands must be written by the developer separately, using regular expressions or other forms of string analysis. If the user asks a question without using a single keyword, the bot cannot understand it and, as a rule, responds with a message like “sorry, I did not understand”.
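A sketch of such a keyword-driven bot is shown below; the keyword patterns and canned replies are purely illustrative.

```python
# Simple keyword bot: each rule is a hand-written regular expression
# paired with a fixed reply; anything unmatched hits the fallback.
import re

RULES = [
    (re.compile(r"\b(hi|hello|hey)\b", re.IGNORECASE), "Hello! How can I help you?"),
    (re.compile(r"\bopening hours?\b", re.IGNORECASE), "We are open 9am-5pm, Monday to Friday."),
    (re.compile(r"\b(price|cost)\b", re.IGNORECASE), "Our plans start at $10 per month."),
]

def reply(message: str) -> str:
    for pattern, answer in RULES:
        if pattern.search(message):          # first matching keyword wins
            return answer
    # no keyword matched: the typical canned fallback
    return "Sorry, I did not understand."

print(reply("What are your opening hours?"))
print(reply("Can I pay with crypto?"))
```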
While AppleTV’s commerce capabilities are currently limited to purchasing media from iTunes, it seems likely that Siri’s capabilities will be extended to tvOS apps, so app developers will be able to support voice commands on AppleTV directly within their apps. Imagine using voice commands to navigate through Netflix, browse your Fancy shopping feed, or plan a trip using Tripadvisor on AppleTV — the potential for app developers will be significant if Apple extends its developer platform further into the home through AppleTV and Siri.
If the success of WeChat in China is any sign, these utility bots are the future. Without ever leaving the messaging app, users can hail a taxi, video chat a friend, order food at a restaurant, and book their next vacation. In fact, WeChat has become so ingrained in society that a business would be considered obsolete without an integration. People who divide their time between China and the West complain that leaving this world behind is akin to stepping back in time.
It’s best to have very specific intents, so that you’re clear about what your user wants to do, but broad entities, so that the intent can apply in many places. For example, changing a password is a common activity (a narrow intent), while where you change that password might be one of many different places (broad entities). The context then personalises the conversation based on what it knows about the user, what they’re trying to achieve, and where they’re trying to do it.
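As an illustration of how these pieces might fit together, a narrow intent is paired with a broad entity saying where it applies, and user context shapes the reply. The field names and reply logic below are invented for illustration and do not follow any particular NLU framework's schema.

```python
# A narrow intent, a broad entity that says *where* it applies,
# and context about the user that personalises the reply.
parsed_message = {
    "intent": "change_password",               # very specific: what the user wants to do
    "entities": {"service": "email account"},  # broad: could be email, banking, intranet, ...
}

user_context = {
    "name": "Sam",
    "logged_in": True,
}

def build_reply(message: dict, context: dict) -> str:
    service = message["entities"]["service"]
    if context["logged_in"]:
        return (f"Sure {context['name']}, I can reset the password "
                f"for your {service} right now.")
    return f"Please log in first, then I can reset your {service} password."

print(build_reply(parsed_message, user_context))
```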
How: instead of asking someone to fill out a form on your website to be contacted by your sales team, you direct them straight into Messenger, where you can ask for some of their contact details and any qualification questions (for example, "How many employees does your company have?"). Depending on how they respond, you could ask if they'd like to arrange a meeting with a salesperson right there and then.
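The branching behind such a flow is simple; here is a rough sketch of the qualification logic only. The question, threshold, and follow-up messages are invented for illustration, and this is not Messenger API code.

```python
# Decide the next message based on the user's answer to the
# qualification question "How many employees does your company have?"
def qualify_lead(employee_count: int) -> str:
    if employee_count >= 50:
        return "Great! Would you like to book a meeting with our sales team right now?"
    return "Thanks! We'll send you some resources that fit smaller teams."

answer = "120"  # what the user typed in reply to the qualification question
print(qualify_lead(int(answer)))
```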
Online chatbots save time and effort by automating customer support. Gartner forecasts that by 2020, over 85% of customer interactions will be handled without a human. However, the opportunities provided by chatbot systems go far beyond giving responses to customers’ inquiries. They are also used for other business tasks, like collecting information about users, helping to organize meetings, and reducing overhead costs. It is no wonder that the size of the chatbot market is growing exponentially.

Chatbots and virtual assistants (VAs) may be built on artificial intelligence and create customer experiences through digital personas, but the success you realize from them will depend in large part on your ability to account for the real and human aspects of their deployment, intra-organizational impact, and customer orientation. Start by treating your bots and […]


You may remember Facebook’s big chatbot push in 2016, when they announced that they were opening up the Messenger platform to chatbots of all varieties. Every organization suddenly needed to get their hands on the technology. The idea of having conversational chatbot technology was enthralling, but behind all the glitz, glamour and tech sex appeal was something a little less exciting. To quote Gizmodo writer Darren Orf:
Having a conversation with a computer might have seemed like science fiction even a few years ago. But now, most of us already use chatbots for a variety of tasks. For example, as end users, we ask the virtual assistant on our smartphones to find a local restaurant and provide directions. Or, we use an online banking chatbot for help with a loan application.

As with many 'organic' channels, the relative reach of your audience tends to decline over time due to a variety of factors. In email's case, it can be the over-exposure to marketing emails and moves from email providers to filter out promotional content; with other channels it can be the platform itself. Back in 2014 I wrote about how "Facebook's Likes Don't Matter Anymore" in relation to the declining organic reach of Facebook pages. Last year alone the organic reach of publishers on Facebook fell by a further 52%.
How: this is a relatively simple flow to manage, and it could be one part of a much larger bot if you prefer. All you'll need to do is set up the initial flow within Chatfuel to ask the user if they'd like to subscribe to receive content, and if so, how frequently they would like to be updated. Then you can store their answer as a variable that you use for automation.
Need a Facebook bot? Well, look no further, as Chatfuel makes it easy for you to create your own Facebook and Telegram chatbot with no coding experience necessary. It works by letting users link to external sources through plugins. Eventually, the platform hopes to open itself to third-party plugins, so anyone can contribute their own plugins and others can benefit from them.
Utility bots solve a user's problem, whatever that may be, via a user-prompted transaction. The most obvious example is a shopping bot, such as one that helps you order flowers or buy a new jacket. According to a recent HubSpot Research study, 47% of shoppers are open to buying items from a bot. But utility bots are not limited to making purchases. A utility bot could automatically book meetings by scanning your emails or notify you of the payment subscriptions you forgot you were signed up for.
Interestingly, the as-yet unnamed conversational agent is currently an open-source project, meaning that anyone can contribute to the development of the bot’s codebase. The project is still in its early stages, but it has great potential to help scientists, researchers, and care teams better understand how Alzheimer’s disease affects the brain. A Russian version of the bot is already available, and an English version is expected at some point this year.

Chatfuel is a platform that lets you build your own Chatbot for Messenger (and Telegram) for free. The only limit is if you pass more than 100,000 conversations per month, but for most businesses that won't be an issue. No understanding of code is required and it has a simple drag-and-drop interface. Think Wix/Squarespace for bots (side note: I have zero affiliation with Chatfuel).
Respect the conversational UI. The full interaction should take place natively within the app. The goal is to recognize the user's intent and provide the right content with minimum user input. Every question asked should bring the user closer to the answer they want. If you need so much information that you're playing a game of 20 Questions, then switch to a form and deliver the content another way.
Creating a comprehensive conversational flow chart will feel like the greatest hurdle of the process, but know it's just the beginning. It's the commitment to tweaking and improving in the months and years following that makes a great bot. As Clara de Soto, cofounder of Reply.ai, told VentureBeat, "You're never just 'building a bot' so much as launching a 'conversational strategy' — one that's constantly evolving and being optimized based on how users are actually interacting with it."
Human touch. Chatbots, by providing an interface similar to human-to-human interaction, are more intuitive and therefore easier to use than a standard mobile banking application. They don’t require any additional software installation and are more adaptive, since they can be personalized through machine learning as they are used. Chatbots are also instant, and therefore much faster than phone calls, which some studies have found customers consider tedious. They thus satisfy both the speed and personalization requirements of interacting with a bank.