Chatbots
Chatbots are an opportunity to innovate in terms of interaction and to take advantage of information in a timely manner. To grasp their potential, we need to know the concepts that constitute them, such as entities and intents, and the main providers of the services they need in order to work. In this post, you will find an overview of chatbots and GeneXus.

Every day, chatbots become more popular: new companies adopt them and new business models are even built around them. These interfaces open new possibilities for connecting the data of businesses, companies, and institutions with their users' needs. Here we will discuss how they work internally and who the main players are in the ecosystem that has grown around these technologies.

Chatbots, as their name suggests, are a combination of chats and bots, so let's look at each part of that definition. A chat is a conversation between two or more people. As users of various chat applications, we're constantly talking to other people, individually or in groups (family, work, and so on), and during these conversations we generate a large volume of data that technology doesn't fully exploit for our benefit. Bots, on the other hand, are software designed to automate actions. They have been around for quite some time, in phone support systems and even in video games, and what those examples have in common with chatbots is that they try to emulate a person's behavior.

If we combine both concepts, we can automate actions using the information interpreted from a chat. The goal is for the person on the other end not to notice that the answer comes from a machine. When interacting with a chatbot, we expect it to behave naturally, as though it were an actual person: to show empathy, not to act like a robot, and above all to be able to lead the conversation toward what it can do. In other words, we developers must deliver that level of experience, so that users feel they are talking to a person.

As for integration, chatbots can be plugged into multiple messaging platforms (WhatsApp, Messenger, WeChat, Telegram, and so on). The choice of platform depends on the target audience: to reach the Chinese market, integrating the chatbot with WeChat is crucial, while for the US market we must integrate with Facebook Messenger. Chatbots can also be built into operating systems, such as Siri on macOS and iOS, or Cortana on Windows.

In fact, a few months ago, at the GeneXus Meeting (GX27), we introduced our chatbots Rudi and Clarita (Rudi was integrated into the event's official app, and Clarita consumed its services). They answered questions about a program of more than 150 activities, and about the locations of activities and services in the 4-floor venue we used over 3 days. Between them, they received approximately 4,000 messages during the event, which is clear evidence that people are interested in chatbots and find them useful.
At this point, you may be wondering: how can a chatbot understand us? The answer is Natural Language Processing (NLP), a field of Artificial Intelligence that studies the interaction between people and computer systems through written and spoken language. Just as written sentences are structured into subjects, verbs, and predicates, the bot analyzes each sentence and classifies it according to context, intent, and known entities. These are the three important concepts: context, intents, and entities.

In everyday life, context is the setting in which we do something. Right now you are reading this post, and you are aware of the variables around you: the browser in which you're reading it, the chair you're sitting on, the time of day, and so on. It's the same for a chatbot, whose conversation variables include the username, the time, and the parameters it recognizes during the chat.

As for intents, the chatbot can recognize what the user wants to do: add a customer or find a customer, for instance. To give a more concrete example, in the GX27 app the chatbot can recognize that the intent is to "find out the location of a room." "Feeding" the chatbot with a significant number of examples is important for it to be able to recognize intents.

Lastly, entities define sets of objects that contain values, along with the synonyms for each value. For example, we can have a "Colors" entity with values such as "Green" and "Pink" (for which "Rose" can be a synonym). In the GX27 app, there is a "Rooms" entity whose values are the names of the rooms.

Several platforms allow us to create chatbots and provide the NLP service. In this post, I will mention three that are familiar to us because they have been integrated into GeneXus.

1- Watson Conversation: IBM's platform provides an IDE to define the intents that our bot will be able to recognize and, for each one, the examples used to train it. It also lets us define the entities that the bot will handle. Another useful feature of Watson is that it makes it easy to define the dialogs the chatbot will use once it has recognized the user's intent. In addition, Watson allows us to deploy our chatbot through Slack and will soon offer integration with Facebook Messenger.

2- Dialogflow: Dialogflow (formerly API.ai) also lets us define the intents and entities it can handle, but it is rather different from Watson in the way dialogs are defined, because they are inferred at the same time the intents are defined. Dialogflow lets us integrate chatbots with many messaging platforms and define at which point the chatbot should call our server to gather information.

3- Lex: Here is where Amazon comes into play. Lex works very similarly to Dialogflow when defining intents and entities; the difference is that it is fully integrated with Amazon Web Services.

What these three providers have in common is that they identify the intents and entities contained in a sentence, and all of them expose an API that lets us send the user's message and receive a reply once it has been processed. In short, the flow is as follows: someone asks a question, the NLP service interprets it according to how it was trained, and it replies. After reading these descriptions, you may wonder which provider is the best.
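To make that flow concrete, here is a minimal sketch of the "send a message, get back the interpreted intent, entities, and reply" step, using Dialogflow's REST API as an example (the other providers work analogously through their own APIs). The project ID, session ID, and access token are placeholders, and the code assumes a Dialogflow ES v2 agent that has already been trained with intents and entities like the ones described above.

```python
import requests

# Placeholders: a real call needs your own project, a session ID of your
# choosing, and an OAuth 2.0 access token authorized for the Dialogflow API.
PROJECT_ID = "my-gx27-agent"
SESSION_ID = "user-123"
ACCESS_TOKEN = "<oauth-access-token>"

def detect_intent(text, language="en"):
    """Send one user utterance to the NLP service and return its interpretation."""
    url = (f"https://dialogflow.googleapis.com/v2/projects/{PROJECT_ID}"
           f"/agent/sessions/{SESSION_ID}:detectIntent")
    payload = {"queryInput": {"text": {"text": text, "languageCode": language}}}
    response = requests.post(
        url,
        json=payload,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=10,
    )
    response.raise_for_status()
    result = response.json()["queryResult"]
    return {
        "intent": result["intent"]["displayName"],   # e.g. "FindRoomLocation"
        "entities": result.get("parameters", {}),    # e.g. {"room": "Ballroom A"}
        "reply": result.get("fulfillmentText", ""),  # the answer the agent was trained to give
    }

if __name__ == "__main__":
    print(detect_intent("Where is the keynote room?"))
```

In a real integration, the reply returned by the NLP service would then be forwarded back to whichever messaging platform (WhatsApp, Messenger, Slack, and so on) the user is chatting on.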
Well, this will depend on each particular case and on the nature of the project: Watson stands out for its dialog editor and Slack deployment, Dialogflow for its broad set of messaging platform integrations, and Lex for projects already built on Amazon Web Services.
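Whichever platform you pick, the raw material you feed it has the same shape: intents with example utterances, and entities with values and synonyms. Below is a minimal, provider-agnostic sketch using the GX27 "Rooms" idea from above; the intent names, room names, and structure are illustrative only, not any specific provider's training format.

```python
# Illustrative only: a provider-agnostic picture of chatbot training material.
# Watson, Dialogflow, and Lex each have their own concrete format for this.

TRAINING = {
    "intents": {
        # Each intent needs plenty of example utterances so the NLP engine
        # learns to recognize it in new, unseen phrasings.
        "FindRoomLocation": [
            "Where is the {room}?",
            "How do I get to {room}?",
            "On which floor is {room}?",
        ],
    },
    "entities": {
        # Entities group canonical values and their synonyms, like the
        # "Rooms" entity used by the GX27 chatbot.
        "Rooms": {
            "Ballroom A": ["main ballroom", "ballroom a"],
            "Room 201": ["201", "room two hundred one"],
        },
    },
}

def resolve_entity(entity_name, text):
    """Map a user's wording back to the canonical entity value, via synonyms."""
    text = text.lower()
    for value, synonyms in TRAINING["entities"][entity_name].items():
        if value.lower() in text or any(s in text for s in synonyms):
            return value
    return None

print(resolve_entity("Rooms", "how do I get to the main ballroom?"))  # -> "Ballroom A"
```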
After choosing a platform comes the key stage of training the chatbot. This stage involves clearly defining the intents that the bot will recognize, with specific examples of use, as well as accurately defining the entities, their values, and the synonyms for each value. The user interface also has to be intuitive enough to create a sense of conversation, whether in writing or speech. When both aspects are well combined, they make the most of the available information and deliver it immediately. Now is the perfect time to explore opportunities and innovations in how we interact to solve day-to-day situations, from administrative procedures to transactions or even mass events. If you've reached the end of this post, most likely you will want to know how to easily create your own chatbot with GeneXus, and that's why I suggest that you continue reading: GeneXus Chatbots Generator.