International Journal of Mechanical Engineering and Technology (IJMET)
Volume 8, Issue 6, June 2017, pp. 729–739, Article ID: IJMET_08_06_077
Available online at http://www.iaeme.com/IJMET/issues.asp?JType=IJMET&VType=8&IType=6
ISSN Print: 0976-6340 and ISSN Online: 0976-6359
© IAEME Publication
Scopus Indexed

TO DIALOGUE WITH CHATBOT USING MACHINE LEARNING

Y Madan Reddy, Chandrasekhara Reddy T and P Dayaker
Department of CSE, MLR Institute of Technology, Hyderabad, India

ABSTRACT
This chatbot interacts with users in natural language and gives responses through text or speech. The user might enter words that are misspelled, but the bot recognizes the correct words. The chatbot gives responses to users regarding issues faced in an IT organization. For example, a user might deal with an issue regarding a network connection, and the chatbot gives responses that help solve network-related issues. If the problem still persists, the chatbot will raise a ticket regarding the issue. The chatbot works as a service desk in an organization, where it helps to solve the basic issues that are encountered by the employees or users. This machine learning chatbot helps to solve issues related to the network, installing software, hardware faults, and logging into a webpage.

Key words: Chatbot, Machine Learning, Network Issue, Natural Language Processing

Cite this Article: Y Madan Reddy, Chandrasekhara Reddy T and P Dayaker, To Dialogue with Chatbot using Machine Learning, International Journal of Mechanical Engineering and Technology, 8(6), 2017, pp. 729–739.
http://www.iaeme.com/IJMET/issues.asp?JType=IJMET&VType=8&IType=6

1. INTRODUCTION
Agents can be described as NLU (Natural Language Understanding) modules for applications. Their purpose is to transform natural user language into actionable data. This transformation occurs when a user input matches one of the intents or domains. Intents are developer-defined components of agents, whereas domains are pre-defined knowledge packages that can be enabled or disabled in each particular agent. Agents are platform agnostic: you only have to design an agent once, and then you can integrate it with a variety of platforms using our SDKs and Integrations, or download files compatible with your Alexa or Cortana apps.

1.2. Machine Learning
Machine Learning is a tool that allows your agent to understand user inputs in natural language and convert them into structured data, extracting relevant parameters. In the API.AI terminology, your agent uses machine learning algorithms to match user requests to specific intents and uses entities to extract relevant data from them.
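The abstract notes that the bot recognizes correct words even when the user types misspellings. The paper does not specify the correction method; a common minimal approach is minimum edit distance against a known vocabulary. The vocabulary and function names below are illustrative assumptions, not the paper's implementation.

```python
def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                # deletion
                            curr[j - 1] + 1,            # insertion
                            prev[j - 1] + (ca != cb)))  # substitution
        prev = curr
    return prev[-1]

# Hypothetical vocabulary of terms the service-desk bot understands.
VOCABULARY = ["network", "password", "install", "hardware", "login"]

def correct(word: str, max_dist: int = 2) -> str:
    """Return the closest vocabulary word within max_dist edits, else the word unchanged."""
    best = min(VOCABULARY, key=lambda v: edit_distance(word.lower(), v))
    return best if edit_distance(word.lower(), best) <= max_dist else word
```

With this sketch, an input such as "netwrok" resolves to "network" before intent matching, so downstream keyword lookups still succeed.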
The agent “learns” both from the data you provide in it (annotated examples in intents and entries in entities) and from the language models developed by API.AI. Based on this data, it builds a model (algorithm) for making decisions on which intent should be triggered by a user input and what data needs to be extracted. The model is unique to your agent.
The model adjusts dynamically according to the changes made in your agent and in the API.AI platform. To make sure that the model is improving, your agent needs to constantly be trained on real conversation logs. [1]
For a quicker start of your agent’s development, you can use predefined knowledge packages – Domains. We have created them for the most requested use cases on our platform, and we continue improving them and adding new ones.
The machine learning model for your agent is updated every time you save changes in intents and entities, approve changes in Training, or import/restore an agent from a zip file.
For agents with more than 50 entities or more than 600 intents, you need to update the model manually. To do so, go to your agent settings > ML Settings and click the 'Train' button.
You can think of an agent as a human in a certain way. Say, you have a new team member who has some knowledge already (similar to Domains for your agent). When you need to teach them something new, you start training them (similar to adding custom intents and entities). Machine learning allows the agent to make decisions based on the new data. It can make mistakes at first (same as humans). When you spend time on training, it starts making better decisions.
2. BACKGROUND STUDY Various studies have been made on machine learning algorithms, and we have come across different types of bots that give responses to users in an efficient way. For example, the Slack bot interacts with users in natural language and helps them with their issues; it gave us an idea of how to interact with the user efficiently and give proper responses.
2.1. Existing Methodology The current system does not interact with the user through natural language to process user queries and give responses. The existing systems do not use any machine learning algorithms to respond to the user. The current systems cannot understand the natural language provided as input by the user and cannot process it to give proper responses.
3. IMPLEMENTATION The chatbot interacts with the user through natural language to process user queries and give responses. This chatbot uses machine learning algorithms to process the given input and give appropriate responses. The chatbot serves as a service desk for an organization where a user might face network, hardware, installation, or webpage-login issues; the user enters any problem as input, and the chatbot gives a particular solution as a response through text or speech. [2] This chatbot helps to solve issues related to the network, installing software, hardware faults, and logging into a webpage. The following subsections explain the chatbot system architecture shown in Fig. 1. [3]
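The flow described above can be sketched as follows. All intent names, keyword lists, and canned responses here are illustrative assumptions, not the paper's actual agent: the query is matched to an intent, a fix is returned, and a ticket is raised when nothing matches.

```python
# Illustrative intents: keyword triggers mapped to canned responses.
INTENTS = {
    "network_issue": (["network", "wifi", "internet"],
                      "Try restarting your router and reconnecting."),
    "install_issue": (["install", "installation", "setup"],
                      "Run the installer as administrator and retry."),
    "login_issue":   (["login", "password", "sign in"],
                      "Use the 'Forgot password' link to reset your credentials."),
}

tickets = []

def respond(query: str) -> str:
    q = query.lower()
    for name, (keywords, answer) in INTENTS.items():
        if any(k in q for k in keywords):
            return answer
    # No intent matched: escalate by raising a ticket, as the paper describes.
    tickets.append(query)
    return f"I could not resolve this. Ticket #{len(tickets)} has been raised."
```

A production agent would use the trained machine learning model for matching rather than raw keywords; this sketch only mirrors the intent-then-ticket control flow.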
3.1. Entities Entities represent concepts and serve as a powerful tool for extracting parameter values from natural language inputs. The entities that are used in a particular agent will depend on the
parameter values that are expected to be returned as a result of agent functioning. In other words, a developer need not create entities for every concept mentioned in the agent – only for those required for actionable data.
There are 3 types of entities: system (defined by API.AI), developer (defined by a developer), and user (built for each individual end-user in every request) entities. Furthermore, each of these can be mapping (having reference values), enum type (having no reference values), or composite (containing other entities with aliases and returning object type values).
Developer Mapping Entities
This entity type allows the mapping of a group of synonyms to a reference value. In natural language, you can often have many ways to say the same thing. For this reason, each mapping entity has a list of entries, where each entry contains a mapping between a group of synonyms (ways that a particular concept could be expressed in natural language) and a reference value (which is canonical).
For example, a food type entity could have an entry with a reference value of "vegetarian" with synonyms of "veg" and "veggie".
To create such an entity, leave the checkbox ‘Define Synonyms’ checked and add a reference value and synonyms.
Developer Enum Type Entities
Enum type entities contain a set of entries that do not have mappings to reference values. To create an enum type entity, uncheck the checkbox ‘Define Synonyms’ and add entries.
Developer Composite Entities
Composite entities are enum type entities whose entries contain other entities used with aliases. They return object type values (if not defined otherwise in the parameter table). Composite entities are most useful for describing objects or concepts that can have several different attributes. [4]
Figure 1 Chatbot System Architecture
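The three developer entity kinds described above can be sketched as plain data structures. The names and values here are illustrative assumptions, not API.AI's actual storage format: a mapping entity carries reference values with synonyms, an enum entity is a bare list of entries, and a composite entity nests other entities under aliases.

```python
# Mapping entity: each canonical reference value has a list of synonyms.
food_type = {
    "vegetarian": ["veg", "veggie", "vegetarian"],
    "vegan":      ["vegan", "plant-based"],
}

# Enum entity: a set of entries with no reference values.
shirt_size = ["small", "medium", "large"]

# Composite entity: aliases pointing at other entities, returning an object.
order = {"size": "shirt_size", "diet": "food_type"}

def resolve(entity, phrase):
    """Map a synonym to its canonical reference value in a mapping entity."""
    for reference, synonyms in entity.items():
        if phrase.lower() in synonyms:
            return reference
    return None
```

For instance, `resolve(food_type, "veggie")` returns `"vegetarian"`, which is the parameter value the agent would hand to the application rather than the raw user wording.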
3.2. Intents An intent represents a mapping between what a user says and what action should be taken by your software. Fig. 2 shows the chatbot architecture flow. Intent interfaces have the following sections: User says, Action, Response, and Contexts.
User says EXAMPLE (“) AND TEMPLATE (@) MODES Each ‘User says’ expression can be in one of two modes: Example Mode (indicated by the “ icon) or Template Mode (indicated by the @ icon). Examples are written in natural language and annotated so that parameter values can be extracted. You can read more on annotation below. Templates contain direct references to entities instead of annotations, i.e., entity names are prefixed with the @ sign.
To toggle between modes, click on the “ or @ icon. We recommend using examples rather than templates, because it is easier and machine learning learns faster this way. And remember: the more examples you add, the smarter your agent becomes.
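Template mode, where entity names are referenced with an @ prefix, can be approximated with a regular expression that swaps each @entity for an alternation of its known values. The entity name and values below are illustrative assumptions, not API.AI's matcher.

```python
import re

# Illustrative values for a hypothetical @device entity.
ENTITIES = {"device": ["laptop", "printer", "phone"]}

def template_to_regex(template):
    """Turn 'my @device is broken' into a regex that captures the entity value."""
    def swap(match):
        name = match.group(1)
        return "(?P<%s>%s)" % (name, "|".join(ENTITIES[name]))
    return re.compile(re.sub(r"@(\w+)", swap, template), re.IGNORECASE)

pattern = template_to_regex("my @device is broken")
m = pattern.search("My printer is broken")
# m.group("device") would be "printer" here.
```

This makes the trade-off visible: a template matches only wordings it spells out, whereas annotated examples let the learned model generalize to new phrasings.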
Example Annotation Annotation is a process (and also the result of such process) of linking a word (or phrase) to an entity.
Automatic Annotation When you add examples to the ‘User says’ section, they are annotated automatically. The system detects the correspondence between words (or phrases) and existing developer and system entities and highlights such words and phrases. It also automatically assigns a parameter name to each detected entity. [5]
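Automatic annotation as described above can be sketched as a scan of the example text for known entity phrases, with the parameter name defaulting to the entity name. The entities and example are illustrative assumptions, not API.AI's internal representation.

```python
# Illustrative entities; each maps surface phrases to a canonical value.
ENTITIES = {
    "device":   {"printer": "printer", "laptop": "laptop"},
    "priority": {"urgent": "high", "asap": "high", "whenever": "low"},
}

def annotate(example):
    """Link words in a 'User says' example to entities, assigning parameter names."""
    annotations = []
    for entity_name, phrases in ENTITIES.items():
        for phrase, value in phrases.items():
            start = example.lower().find(phrase)
            if start != -1:
                # The parameter name defaults to the entity name,
                # mirroring automatic annotation.
                annotations.append({"parameter": entity_name,
                                    "entity": "@" + entity_name,
                                    "phrase": example[start:start + len(phrase)],
                                    "value": value})
    return annotations

anns = annotate("My printer is broken, fix it asap")
```

Each annotation records the highlighted phrase, the linked entity, and the assigned parameter name, which is exactly what the review window and parameter table let the developer edit afterwards.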
Editing Automatically Annotated Examples You can edit the linked entity and the parameter name assigned to it in either the review window that opens when you click on the annotated example, or the parameter table of the ‘Action’ section.
The results of the automatic annotation in the review window and parameter table are synchronized. If you change something in the review window, the respective changes will automatically occur in the parameter table, and vice versa.
Note that changes made in the review window and in the parameter table have different scopes:
Changes in the review window won’t affect other examples containing the same annotations.
Changes in the parameter table will affect all ‘User says’ examples with the same annotation. You can make 3 types of changes:
Assign a different entity to an annotated part of the example
Edit the parameter name
Delete the annotation (i.e., delete the link between the word and the entity).
Local changes (in one example) To assign a different entity to the annotation in one example, click on the highlighted phrase.
A pop-up window will appear, where you can choose from the list of existing system or developer entities.
In order to change a parameter name in one example, click on the example and edit the parameter name in the review window.
To delete the annotation, click on either the bin icon in the pop-up window or the x icon of the corresponding row in the review window.
Changes for entire intent (in the parameter table) To assign a different entity to the parts highlighted with the same color in all examples, click
on the entity in the parameter table. In the pop-up window, choose from the list of the existing system or developer entities.
To change the parameter name for the annotations through all examples, edit the parameter name in the parameter table.
To delete a particular annotation from all examples, click on the menu icon on the right side of the respective row in the parameter table and select ‘Delete’ from the drop-down menu.
3.3. Manual Annotation If necessary, you can annotate examples manually by selecting a word or phrase and choosing an existing entity or creating a new one in the pop-up window.
Action This section consists of the action name field and the parameter table.
The action name is defined manually. It will be the trigger word for your app to perform a particular action.
Parameters can be filled in automatically from the ‘Users says’ examples and templates, or added manually.
Response In this section, you can define your agent’s responses which will be provided by your application when the intent is triggered.
3.4. Text Response You can improve your agent's eloquence by adding several variations of the text response per intent. When the same intent is triggered more than once, the text response variations will not repeat until all options have been used. This helps make your agent's speech more human-like.
3.5. References to Parameter Values Responses can contain references to parameter values.
If a parameter is present in the parameter table, use the following format to refer to its value in the ‘Text response’ field: $parameter_name.
There are special parameter value types that don’t appear automatically in the parameter table.
To Dialogue with Chatbot using Machine Learning
http://www.iaeme.com/IJMET/index.asp 734 [email protected]
If you need to refer to such a special type of value, you'll have to add a new parameter to the parameter table, define its value manually, and then reference it in the response as $parameter_name. Use the following formats:
$parameter_name.original – to refer to the original value of the parameter
$parameter_name_for_composite_entity.inner_alias – to refer to a value of one of the composite entity components
#context_name.parameter_name – to reference a parameter value collected in some other intent with defined context
Special Characters If you need the dollar sign $ or the number sign # to appear in your agent's text response, use braces around the value that follows $ or #. For example:
${$number} – where $number is a reference to the parameter value
#{channel} – where channel is a string
#{#channel.name} – where #channel.name is the reference to the parameter value "name" from the context "channel"
If you need to use braces in text responses, use double braces: {{ will be displayed as { and }} will be displayed as }.
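To make the reference and escaping rules above concrete, here is a minimal sketch of a resolver for text responses. This is a hypothetical illustration, not part of API.AI itself, and the parameter and context names used below are invented for the example:

```python
import re

def render_response(template, params, contexts):
    """Resolve $parameter_name and #context_name.parameter_name
    references in a text response, then unescape doubled braces."""
    def ctx_ref(match):
        # #context_name.parameter_name -> value collected in another intent
        ctx, param = match.group(1), match.group(2)
        return str(contexts.get(ctx, {}).get(param, ""))

    def param_ref(match):
        # $parameter_name -> value from the parameter table
        return str(params.get(match.group(1), ""))

    text = re.sub(r"#(\w+)\.(\w+)", ctx_ref, template)
    text = re.sub(r"\$(\w+)", param_ref, text)
    # {{ and }} display as literal braces
    return text.replace("{{", "{").replace("}}", "}")
```

For instance, `render_response("Your ticket $ticket_id was raised by #user_info.name", {"ticket_id": "42"}, {"user_info": {"name": "Madan"}})` produces "Your ticket 42 was raised by Madan".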
Handling Empty Parameter Values If an intent is designed in such a way that some parameters can return empty values after the intent has been triggered, the variants of Text response that contain references to the parameters with empty values won't be given as text responses. Make sure to define different variations of Text response for such intents.
For example, if an intent has two parameters and may return both or either of them with an empty value, and you want to reference parameter values in the text response, make sure to define at least four variants of Text response:
referencing both parameter values,
referencing the 1st parameter value,
referencing the 2nd parameter value,
without any reference to the parameter values.
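One way the selection logic above could be implemented (a hypothetical sketch, not API.AI's actual implementation): collect the $references used in each variant and serve only the variants whose referenced parameters are non-empty.

```python
import re

def usable_variants(variants, params):
    """Return the text-response variants whose $parameter references
    all resolve to non-empty values."""
    usable = []
    for text in variants:
        refs = re.findall(r"\$(\w+)", text)
        if all(params.get(r) for r in refs):
            usable.append(text)
    return usable

# Four variants covering every combination of empty parameters,
# as recommended above (names are illustrative).
variants = [
    "Flights from $origin to $destination.",  # needs both parameters
    "Flights from $origin.",                  # needs only the 1st
    "Flights to $destination.",               # needs only the 2nd
    "Searching for flights.",                 # needs neither
]
```

With `params = {"origin": "Delhi", "destination": ""}`, only the second and fourth variants remain usable, so the agent always has at least one valid response.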
Emojis If you want your agent to display emojis in responses, just copy and paste a desired emoji in the 'Response' field. [8]
Rich Messages If you use one of the following one-click integrations – Facebook Messenger, Kik, Slack, or Telegram – you can define your bot responses as rich messages (images, cards, quick replies etc.) directly in intents.
4. RESULTS

4.1. Contexts
Contexts are strings that represent the current context of a user's request. This is helpful for differentiating phrases which may be vague or have different meanings depending on the user's preferences or geographic location, the current page in an app, or the topic of conversation. [6]
Contexts are designed for passing on information from previous conversations or external sources (e.g., user profile, device information, etc.). They can also be used to manage conversation flow. To define the contexts, click on 'Define contexts' right below the intent name.
Input contexts serve as a prerequisite for the intent to be matched; i.e., the intent will participate in matching only when all the contexts in the input context field are active.
Intents Priority
Intent priority allows you to assign more weight to one of the intents in case an input phrase matches multiple intents. Intent priority can be changed by clicking on the blue (default) dot on the left of the intent name and selecting the priority from the drop-down menu.
Actions
An action corresponds to the step your application will take when a specific intent has been triggered by a user input. Actions can have parameters for extracting information from user inputs. In a JSON response to a query, the data appears in the following format:
{"action": "action_name"}
{"parameter_name": "parameter_value"}

Figure 2 Chatbot Architecture Flow
Output Contexts Contexts are tied to user sessions (to a session ID that you pass in API calls). If a user expression is matched to an intent, the intent can then set an output context to be shared by future expressions in the same session. You can also add a context when you send the user request to your API.AI agent.
Input Contexts Input contexts limit intents to be matched only when certain contexts are set.
Extracting Parameter Values from Contexts To refer to a parameter value that has been extracted in an intent with a defined output context, use the following format in the ‘VALUE’ column: #context_name.parameter_name.
4.2. Machine Learning Settings
Machine Learning On/Off You can turn off machine learning for individual intents. When Machine learning is off, the intent will be triggered only if there is an exact match between the user’s input and one of the examples/templates from the intent. The option is available in the menu next to the 'Save' button in every intent. To configure the setting when creating intents via /intents endpoint, use "auto": true for enabling machine learning or "auto": false for disabling machine learning in the intent object. [7]
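As a sketch of what an intent object for the /intents endpoint might look like with machine learning disabled (the intent name, example phrase, and response below are illustrative, and the surrounding field names follow the API.AI v1 intent format as an assumption):

```python
# Minimal intent payload for the /intents endpoint.
# "auto" toggles machine learning for this intent:
# False means the intent triggers only on exact matches.
intent = {
    "name": "network.issue",   # illustrative intent name
    "auto": False,             # machine learning disabled
    "userSays": [
        {"data": [{"text": "my wifi is not working"}]},
    ],
    "responses": [
        {"speech": "Let me check your network settings."},
    ],
}
```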
Match Mode You can switch match mode that fits best your agent. To do this, go to your agent settings > ML Settings > Match Mode. Hybrid (Rule-based and ML) match mode fits best for agents with a small number of examples in intents and/or wide use of templates syntax and composite entities. ML only match mode can be used for agents with a large number of examples in intents, especially the ones using @sys.any or very large developer entities.
4.3. ML Classification Threshold To filter out false positive results and still get the right amount of variability of matched natural language inputs for your agent, you can tune the machine learning classification threshold. If the returned "score" value in the JSON response to a query is less than the threshold value, then a fallback intent will be triggered or, if no fallback intents are defined, no intent will be triggered.
Note that fallback intents return "score": 1, and when no intent is triggered, "score": 0 is returned. To modify this setting, go to your agent settings > ML Settings > ML Classification Threshold and type in a new threshold value. Click 'Save' and then 'Train' to retrain your agent.
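The thresholding behavior described above can be sketched as follows (a hypothetical illustration of the matching logic, not API.AI's internal code):

```python
def resolve_intent(match, threshold, fallback_intent=None):
    """Apply the ML classification threshold to a candidate match.

    `match` is an (intent_name, score) pair from the classifier.
    Below the threshold, the fallback intent (score 1) is triggered
    if one is defined; otherwise no intent is triggered (score 0).
    """
    intent_name, score = match
    if score >= threshold:
        return intent_name, score
    if fallback_intent is not None:
        return fallback_intent, 1.0  # fallback intents return "score": 1
    return None, 0.0                 # no intent triggered: "score": 0
```

For example, with a threshold of 0.6, a candidate scoring 0.4 falls through to the fallback intent if one exists, and to no intent at all otherwise.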
4.4. Integrations We've integrated with many popular messaging, virtual assistant and IoT platforms. See how quick and easy it is to expose your agent to new channels of users.
API.AI Skype Integration allows you to easily create Skype bots with natural language understanding based on the API.AI technology.
Figure 3 Ticket Rise on Chatbot

Figure 4 Ticket Rise on Chatbot with Two-Way Communications

Fig. 3 and Fig. 4 show the chatbot application, with ticket raising and Service Desk messaging between the user and the authorities.

Figure 5 Intelligent Assistant

Fig. 5 shows the intelligent assistant chatbot, where the user can see the end-to-end messages along with the user ID.
4.5. Training To achieve good classification accuracy, it's important to provide your agent with enough data. The greater the number of natural language examples in the 'User says' section of intents, the better the classification accuracy. We encourage you to use example mode instead of template mode, since the former provides better data for machine learning.
When you create a new intent, start with the examples that have the maximum number of parameters. This way you will define which entities should be used in this intent and name all the parameters the right way. Having annotated the first few long examples, it will be easier to continue with shorter ones, as the system will start suggesting the right entities for new examples. It also helps machine learning models train better. [4]
To make the training process more efficient, we have created a Training tool that allows you to analyse conversation logs with your agent and add annotated examples to relevant intents in bulk.
Training is currently supported for English, German, Spanish, French, Italian, Russian, and Simplified Chinese.
How It Works As you and your users chat with your agent, you can access the conversation logs by clicking ‘Training’ in the left side menu.
For your convenience, the logs are presented in two views. Training shows conversations with the agent in a way that lets you review and improve its performance: each user request is presented in a card showing the intent that will be used to process it, as well as the current annotation. You can reassign inputs to the correct intents and fix annotations; every time you approve changes, the agent is trained and the results in the tab are updated. History displays conversations in plain mode, so you can see the latest conversations with your agent in chronological order.
Upload Examples You can upload sample user inputs in a .txt file or in a .zip archive with multiple (up to 10) .txt files. Each input should start from a new line. A single .txt file or unpacked .zip archive should not exceed 3 MB. Just click the 'Upload' button in the right upper corner.
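A client-side sanity check for these upload constraints might look like the following sketch (a hypothetical helper; the .txt requirement, the 3 MB limit, and the one-input-per-line format come from the description above):

```python
import os

MAX_BYTES = 3 * 1024 * 1024  # a single .txt file must not exceed 3 MB

def validate_upload(path):
    """Check a .txt file of sample user inputs before uploading:
    extension, size limit, and one non-empty input per line."""
    if not path.endswith(".txt"):
        return False, "only .txt files are accepted"
    if os.path.getsize(path) > MAX_BYTES:
        return False, "file exceeds 3 MB"
    with open(path, encoding="utf-8") as f:
        inputs = [line.strip() for line in f if line.strip()]
    if not inputs:
        return False, "file contains no inputs"
    return True, f"{len(inputs)} inputs ready to upload"
```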
5. CONCLUSION AND FUTURE SCOPE In this paper, we have discussed a chatbot built using machine learning techniques: a bot that processes natural language input text and gives a proper response to the given question. We have shown how to train the bot on scripts to increase the accuracy of relevant response retrieval, and how the bot can respond to a user efficiently. We would like to extend this bot to handle more service desk related issues, and to add a larger number of 'User says' examples.
REFERENCES [1] Shawar, Bayan Abu, and Eric Steven Atwell. "Using corpora in machine-learning chatbot
systems." International journal of corpus linguistics 10.4 (2005): 489-516.
[2] Shawar, Bayan Abu, and Eric Atwell. "Machine Learning from dialogue corpora to generate chatbots." Expert Update Journal 6.3 (2003): 25-29.
[3] Shawar, Bayan Abu, and Eric Atwell. "Chatbots: are they really useful?" LDV Forum. Vol. 22. No. 1. 2007.
[4] Fryer, L. K., and Rollo Carpenter. "Bots as language learning tools." Language Learning & Technology (2006).
[5] Huang, Jizhou, Ming Zhou, and Dan Yang. "Extracting Chatbot Knowledge from Online Discussion Forums." IJCAI. Vol. 7. 2007.
[6] Wu, Yu, et al. "Automatic chatbot knowledge acquisition from online forum via rough set and ensemble learning." Network and Parallel Computing, 2008. NPC 2008. IFIP International Conference on. IEEE, 2008.
[7] Zander, Sebastian, Thuy Nguyen, and Grenville Armitage. "Automated traffic classification and application identification using machine learning." Local Computer Networks, 2005. 30th Anniversary. The IEEE Conference on. IEEE, 2005.
[8] Read, Jonathon. "Using emoticons to reduce dependency in machine learning techniques for sentiment classification." Proceedings of the ACL student research workshop. Association for Computational Linguistics, 2005.
[9] Karna Patel, Mrudang Patel and Nirav Oza, Wireless automation and Machine Learning of a Rolling-Mill Using Arduino and Android. International Journal of Mechanical Engineering and Technology, 7(6), 2016, pp. 09–21.
[10] Taran Singh Bharati and R. Kumar. Intrusion Detection System for Manet Using Machine Learning and State Transition Analysis. International Journal of Computer Engineering and Technology, 6 (12), 2015, pp. 01-08.