Rule-based Expert System and Uncertainty

Rule-based ES

• Rules as a knowledge representation technique

• Types of rules: relation, recommendation, directive, strategy and heuristic

ES development team

• Project manager
• Domain expert
• Knowledge engineer
• Programmer
• End-user

Structure of a rule-based ES

• Knowledge base: rules (IF-THEN)
• Database: facts
• Inference engine
• Explanation facilities
• User interface (used by the end-user) and developer interface (used by the knowledge engineer)
• External database and external program

Structure of a rule-based ES

• Fundamental characteristics of an ES
– High-quality performance
• Gives correct results
• Speed of reaching a solution
• How heuristics are applied
– Explanation capability
• Although specific rules may not by themselves justify a conclusion or decision, the explanation facility can be used to express the appropriate fundamental principles.
– Symbolic reasoning

Structure of a rule-based ES

• Forward and backward chaining inference

Rule: IF A is x THEN B is y
Fact: A is x  =>  new fact: B is y

• Match-fire cycle: the inference engine matches the rules in the knowledge base against the facts in the database, and fires the rules whose antecedents are satisfied (see the sketch below).
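A minimal forward-chaining sketch in Python, assuming simple attribute/value facts and IF-THEN rules; the names `Rule` and `forward_chain` are illustrative, not from the slides:

```python
# Minimal forward-chaining match-fire cycle (illustrative sketch, not the
# slides' implementation). Facts are attribute/value pairs; a rule maps a
# set of required facts to one new fact.
from dataclasses import dataclass

@dataclass
class Rule:
    conditions: dict   # e.g. {"A": "x"}
    conclusion: tuple  # e.g. ("B", "y")

def forward_chain(rules, facts):
    """Match rules against the database of facts and fire them until no
    new fact can be added (forward chaining)."""
    facts = dict(facts)
    fired = True
    while fired:
        fired = False
        for rule in rules:
            matched = all(facts.get(a) == v for a, v in rule.conditions.items())
            attr, value = rule.conclusion
            if matched and facts.get(attr) != value:
                facts[attr] = value    # fire: assert the new fact
                fired = True
    return facts

rules = [Rule(conditions={"A": "x"}, conclusion=("B", "y"))]
print(forward_chain(rules, {"A": "x"}))   # {'A': 'x', 'B': 'y'}
```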

Conflict Resolution

• Example
– Rule 1: IF the ‘traffic light’ is green THEN the action is go
– Rule 2: IF the ‘traffic light’ is red THEN the action is stop
– Rule 3: IF the ‘traffic light’ is red THEN the action is go
– Rules 2 and 3 conflict: when the fact ‘traffic light is red’ is in the database, both can fire, but they recommend opposite actions.

Conflict Resolution Methods

• Fire the rule with the highest priority
• Fire the most specific rule
• Fire the rule that uses the data most recently entered in the database (time tags attached to the rules)
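A minimal sketch of the priority-based strategy in Python, applied to the traffic-light rules above; the `priority` values and function name are illustrative assumptions:

```python
# Conflict resolution by priority: among all rules whose conditions match
# the current facts (the conflict set), fire only the highest-priority rule.

rules = [
    {"name": "Rule 1", "if": ("traffic light", "green"), "then": "go",   "priority": 1},
    {"name": "Rule 2", "if": ("traffic light", "red"),   "then": "stop", "priority": 2},
    {"name": "Rule 3", "if": ("traffic light", "red"),   "then": "go",   "priority": 1},
]

def resolve(facts, rules):
    # Conflict set: every rule whose IF part matches the current facts.
    conflict_set = [r for r in rules if facts.get(r["if"][0]) == r["if"][1]]
    if not conflict_set:
        return None
    # Resolution strategy: pick the matching rule with the highest priority.
    winner = max(conflict_set, key=lambda r: r["priority"])
    return winner["name"], winner["then"]

print(resolve({"traffic light": "red"}, rules))   # ('Rule 2', 'stop')
```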

Uncertainty Problem

• Sources of uncertainty in ES
– Weak implication
– Imprecise language
– Unknown data
– Difficulty in combining the views of different experts

Uncertainty Problem

• Uncertainty in AI
– Information is partial
– Information is not fully reliable
– The representation language is inherently imprecise
– Information comes from multiple sources and is conflicting
– Information is approximate
– Non-absolute cause-effect relationships exist

Uncertainty Problem

• Representing uncertain information in ES
– Probabilistic
– Certainty factors
– Theory of evidence
– Fuzzy logic
– Neural networks
– GA
– Rough sets

Uncertainty Problem

• Representing uncertain information in ES
– Probabilistic
• The degree of confidence in a premise or a conclusion can be expressed as a probability
• The chance that a particular event will occur:

P(X) = (number of outcomes favoring the occurrence of X) / (total number of events)
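For example (an illustrative figure, not from the slides): the probability of drawing an ace from a standard 52-card deck is P(ace) = 4/52 ≈ 0.077.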

Uncertainty Problem

• Representing uncertain information in ES
– Bayes theorem
• A mechanism for combining new and existing evidence, usually given as subjective probabilities
• Revises existing prior probabilities based on new information
• The results are called posterior probabilities

Uncertainty Problem

• Bayes theorem
– P(A/B) = probability of event A occurring, given that B has already occurred (posterior probability)
– P(A) = probability of event A occurring (prior probability)
– P(B/A) = probability of the evidence B occurring, given that A has occurred
– P(not A) = probability that event A does not occur; P(A) + P(not A) = 1

P(A/B) = P(B/A) × P(A) / [P(B/A) × P(A) + P(B/not A) × P(not A)]
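A minimal sketch of this calculation in Python; the numbers below are illustrative assumptions, not values from the slides:

```python
# Bayes theorem: revise a prior probability P(A) into a posterior P(A|B)
# after observing evidence B.

def posterior(p_a, p_b_given_a, p_b_given_not_a):
    """P(A|B) = P(B|A)*P(A) / (P(B|A)*P(A) + P(B|not A)*P(not A))."""
    p_not_a = 1.0 - p_a
    numerator = p_b_given_a * p_a
    denominator = numerator + p_b_given_not_a * p_not_a
    return numerator / denominator

# Illustrative numbers: the prior belief in hypothesis A is 0.3; the evidence B
# is observed 80% of the time when A holds and 10% of the time when it does not.
print(posterior(p_a=0.3, p_b_given_a=0.8, p_b_given_not_a=0.1))  # ≈ 0.774
```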

Uncertainty Problem

• Representing uncertain information in ES
– Probabilistic
– Certainty factors
– Theory of evidence
– Fuzzy logic
– Neural networks
– GA
– Rough sets

Uncertainty Problem

• Representing uncertain information in ES
– Certainty factors
• Uncertainty is represented as a degree of belief
• Two steps:
– Express the degree of belief
– Manipulate the degrees of belief during the use of the knowledge-based system
• Based on evidence (or the expert’s assessment)
• Refer to pg 74

Certainty Factors

• Form of a rule with a certainty factor in ES:

IF <evidence> THEN <hypothesis> {cf}

• cf represents belief in hypothesis H given that evidence E has occurred
• Based on two functions:
– Measure of belief MB(H, E)
– Measure of disbelief MD(H, E)
• These indicate the degree to which belief/disbelief in hypothesis H would be increased if evidence E were observed

Certainty Factors

• Uncertain terms and their interpretation

Term Certainty Factor

Definitely not -1.0

Almost certainly not -0.8

Probably not -0.6

Maybe not -0.4

Unknown -0.2 to +0.2

Maybe +0.4

Probably +0.6

Almost certainly +0.8

Definitely +1.0

Certainty Factors

• Total strength of belief and disbelief in a hypothesis (pg 75):

cf = [MB(H, E) - MD(H, E)] / (1 - min[MB(H, E), MD(H, E)])
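A one-function sketch of this formula in Python; the function name and the example values are illustrative:

```python
# Certainty factor from the measures of belief and disbelief.
def certainty_factor(mb, md):
    """cf = (MB - MD) / (1 - min(MB, MD))."""
    return (mb - md) / (1.0 - min(mb, md))

print(certainty_factor(mb=0.8, md=0.2))  # (0.8 - 0.2) / (1 - 0.2) = 0.75
```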

Certainty Factors

• Example: consider a simple rule

IF A is X
THEN B is Y

– In most cases, experts are not absolutely certain that a rule holds:

IF A is X
THEN B is Y {cf 0.7};
B is Z {cf 0.2}

• Interpretation: given A is X, B is Y with certainty 0.7 and B is Z with certainty 0.2; the remaining 10% (“how about another 10%”) allows for B being something else.
• See example pg 76

Certainty Factors

• Certainty factors for rules with multiple antecedents
– Conjunctive rules:

IF <E1> AND <E2> … AND <En> THEN <H> {cf}

• The certainty of H is:

cf(H, E1 ∩ E2 ∩ … ∩ En) = min[cf(E1), cf(E2), …, cf(En)] × cf

See example pg 77
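For instance, with illustrative values cf(E1) = 0.8, cf(E2) = 0.5, cf(E3) = 0.9 and rule certainty cf = 0.7 (numbers assumed, not from the slides):

cf(H, E1 ∩ E2 ∩ E3) = min[0.8, 0.5, 0.9] × 0.7 = 0.5 × 0.7 = 0.35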

Certainty Factors

• Certainty factors for rules with multiple antecedents
– Disjunctive rules:

IF <E1> OR <E2> … OR <En> THEN <H> {cf}

• The certainty of H is:

cf(H, E1 ∪ E2 ∪ … ∪ En) = max[cf(E1), cf(E2), …, cf(En)] × cf

See example pg 78
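With the same illustrative values as above, the disjunctive combination gives:

cf(H, E1 ∪ E2 ∪ E3) = max[0.8, 0.5, 0.9] × 0.7 = 0.9 × 0.7 = 0.63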

Certainty Factors

• Two or more rules affecting the same hypothesis
– E.g.
– Rule 1: IF A is X THEN C is Z {cf 0.8}
– Rule 2: IF B is Y THEN C is Z {cf 0.6}

Refer to eq. 3.35 pg 78: combined certainty factor
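The slides defer the combination formula to eq. 3.35 of the referenced text; the sketch below assumes the standard MYCIN-style combination commonly given there (an assumption on my part, so check the reference), applied to the two rules above:

```python
# Combining certainty factors from two rules that support the same hypothesis
# (assumed MYCIN-style combination; verify against eq. 3.35, pg 78).

def combine_cf(cf1, cf2):
    if cf1 >= 0 and cf2 >= 0:
        return cf1 + cf2 * (1 - cf1)                         # both confirm
    if cf1 < 0 and cf2 < 0:
        return cf1 + cf2 * (1 + cf1)                         # both disconfirm
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))       # mixed evidence

# Rule 1 gives "C is Z" with cf 0.8, Rule 2 gives "C is Z" with cf 0.6.
print(combine_cf(0.8, 0.6))   # 0.8 + 0.6 * (1 - 0.8) = 0.92
```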

Uncertainty Problem

• Representing uncertain information in ES
– Probabilistic
– Certainty factors
– Theory of evidence
– Fuzzy logic
– Neural networks
– GA
– Rough sets

Theory of evidence

• Representing uncertain information in ES
• A well-known procedure for reasoning with uncertainty in AI
• An extension of the Bayesian approach
• Indicates the expert’s belief in a hypothesis given a piece of evidence
• Appropriate for combining expert opinions
• Can handle situations where information is lacking

Rough set approach

• Rules are generated from a dataset
– Discovers structural relationships within imprecise or noisy data
– Can also be used for feature reduction
• Attributes that do not contribute towards the classification of the given training data can be identified and removed

Rough set approach: Generation of Rules

Equivalence classes (decision table):

Class   a  b  c  dec
E1      1  2  3  1
E2      1  2  1  2
E3      2  2  3  2
E4      2  3  3  2
E5,1    3  5  1  3
E5,2    3  5  1  4

Reducts: [E1, {a, c}], [E2, {a, c}, {b, c}], [E3, {a}], [E4, {a}, {b}], [E5, {a}, {b}]

Rules: a1c3 → d1; a1c1 → d2, b2c1 → d2; a2 → d2; b3 → d2; a3 → d3, a3 → d4; b5 → d3, b5 → d4
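Reading the table: the reduct {a} of class E3, for example, says that the condition a = 2 alone implies the decision d = 2 for the objects in E3 (and E4), yielding the rule a2 → d2. The same reading applied to E5 yields the conflicting rules a3 → d3 and a3 → d4, whose membership degrees are computed on the next slide.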

Rough set approach: Generation of Rules

Class    Rule        Membership Degree
E1       a1c3 → d1   50/50 = 1
E2       a1c1 → d2   5/5 = 1
E2       b2c1 → d2   5/5 = 1
E3, E4   a2 → d2     40/40 = 1
E4       b3 → d2     10/10 = 1
E5       a3 → d3     4/5 = 0.8
E5       a3 → d4     1/5 = 0.2
E5       b5 → d3     4/5 = 0.8
E5       b5 → d4     1/5 = 0.2


Rules Measurements: Support

Given a decision rule D → C, where D is the conditional (description) part and C is the decision part:

• support(D) is the number of objects in the information system A that have the property described by D.
• support(C) is the number of objects in the information system A that have the decision described by C.
• support(D ∧ C), the support of the decision rule, is the number of objects that satisfy both D and C; relating it to support(D) gives the probability that an object covered by the description D belongs to the class C.

Rules Measurement: Accuracy

The accuracy of a rule D → C measures how trustworthy the rule is in drawing the conclusion C under the condition D. It is the probability that an arbitrary object covered by the description D belongs to the class C, and it is identical to the value of the rough membership function applied to an object x that matches D; thus accuracy measures the degree of membership of x in X using the attribute set B.

Accuracy(D → C) = support(D ∧ C) / support(D)

Rules Measurement: Coverage

The coverage of a rule D → C measures how well the description D describes the decision class C. It is the probability that an arbitrary object belonging to the class C is covered by the description D.

Coverage(D → C) = support(D ∧ C) / support(C)

Complete, Deterministic and Correct Rules

Rules are said to be complete if every object belonging to the class is covered by the description (coverage = 1), and deterministic if their accuracy is 1. Correct rules are rules whose coverage and accuracy are both 1.
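A minimal sketch of these measures in Python, evaluated on a small decision table shaped like the E5 class from the earlier slide (5 objects with a = 3, of which 4 have decision 3); the data layout and function names are illustrative assumptions:

```python
# Support, accuracy and coverage of a decision rule D -> C over a small
# decision table (each row is one object: condition attributes + decision).

# Five objects corresponding to the E5 equivalence class: a = 3, b = 5, c = 1,
# four of them with decision 3 and one with decision 4 (illustrative data).
objects = [
    {"a": 3, "b": 5, "c": 1, "dec": 3},
    {"a": 3, "b": 5, "c": 1, "dec": 3},
    {"a": 3, "b": 5, "c": 1, "dec": 3},
    {"a": 3, "b": 5, "c": 1, "dec": 3},
    {"a": 3, "b": 5, "c": 1, "dec": 4},
]

def support(table, predicate):
    """Number of objects in the table satisfying the predicate."""
    return sum(1 for obj in table if predicate(obj))

def rule_measures(table, condition, decision):
    sup_d  = support(table, condition)                                # support(D)
    sup_c  = support(table, decision)                                 # support(C)
    sup_dc = support(table, lambda o: condition(o) and decision(o))   # support(D ^ C)
    return {
        "support":  sup_dc,
        "accuracy": sup_dc / sup_d,   # P(object is in C | object covered by D)
        "coverage": sup_dc / sup_c,   # P(object covered by D | object is in C)
    }

# Rule a3 -> d3 from the slide: condition a = 3, decision dec = 3.
print(rule_measures(objects,
                    condition=lambda o: o["a"] == 3,
                    decision=lambda o: o["dec"] == 3))
# {'support': 4, 'accuracy': 0.8, 'coverage': 1.0}  (matches the 4/5 = 0.8 degree)
```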
