
Decision Tree

Date post: 29-May-2018
Category:
Upload: dipendra-ghimire

Transcript
  • 8/9/2019 DescisionTree

    1/18

What is a decision tree?

A decision tree is a classifier in the form of a tree structure, where each node is either:

a leaf node, which indicates the value of the target attribute (class) of examples, or

a decision node, which specifies a test to be carried out on a single attribute value, with one branch and sub-tree for each possible outcome of the test.

Decision trees are useful tools for helping you to choose between several courses of action.

[Figure: a small example tree labelling a decision node (a test on B) and a leaf node (K=X)]


What is a decision tree? (contd.)

Particularly useful for choosing between different strategies, projects or investment opportunities, especially when your resources are limited.

Decision trees provide a highly effective structure within which you can explore options and investigate the possible outcomes of choosing those options.


    An example of a simple decision tree


What is a decision tree? (contd.)

A decision tree can be used to classify an example by starting at the root of the tree and moving down it until a leaf node is reached, which provides the classification of the instance.

Decision Tree Representation:

Each internal node tests an attribute.

Each branch corresponds to an attribute value.

Each leaf node assigns a classification.
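The root-to-leaf walk described above can be sketched as follows. The nested-dict tree, the attribute names (A, B) and the labels (K=X, K=Y, plus a hypothetical K=Z branch) are illustrative assumptions, loosely based on the deck's running example:

```python
# Each decision node: {"test": callable, "branches": {outcome: subtree}};
# a leaf is just a class label (a plain string).
tree = {
    "test": lambda ex: ex["A"],               # test on attribute A
    "branches": {
        "Red": {
            "test": lambda ex: ex["B"] < 4.5,  # numeric test on attribute B
            "branches": {True: "K=Y", False: "K=X"},
        },
        "Blue": "K=Z",                         # hypothetical second branch
    },
}

def classify(node, example):
    # Descend from the root until we reach a leaf (a plain label).
    while isinstance(node, dict):
        outcome = node["test"](example)
        node = node["branches"][outcome]
    return node

print(classify(tree, {"A": "Red", "B": 3.0}))  # -> K=Y
```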


When to consider a Decision Tree?

Instances are describable by attribute-value pairs

The target function is discrete-valued

    Possibly noisy training data

    Examples

    Equipment or medical diagnosis

    Credit risk analysis

    Modeling calendar scheduling preferences


Converting a Decision Tree to Rules

IF ((A=Red) ^ (B < 4.5)) THEN K=Y

IF ((A=Red) ^ (B >= 4.5)) THEN K=X

etc.
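Rules like these can be generated mechanically, one per root-to-leaf path. A minimal sketch; the tuple encoding of the tree and the extra K=Z leaf are assumptions for illustration:

```python
# Hypothetical tree: a decision node is (condition_text, {outcome: subtree}),
# a leaf is a class label, mirroring the A/B/K rules on the slide.
tree = ("A=Red", {
    True: ("B<4.5", {True: "K=Y", False: "K=X"}),
    False: "K=Z",
})

def tree_to_rules(node, conditions=()):
    """Emit one IF ... THEN ... rule per root-to-leaf path."""
    if isinstance(node, str):                     # leaf: emit the rule
        antecedent = " ^ ".join(conditions) or "TRUE"
        return [f"IF ({antecedent}) THEN {node}"]
    test, branches = node
    rules = []
    for outcome, subtree in branches.items():
        cond = test if outcome else f"NOT {test}"
        rules.extend(tree_to_rules(subtree, conditions + (cond,)))
    return rules

for rule in tree_to_rules(tree):
    print(rule)   # e.g. IF (A=Red ^ B<4.5) THEN K=Y
```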


The strengths of decision trees

Decision trees:

are simple to understand and interpret. People are able to understand decision tree models after a brief explanation.

are able to generate understandable rules.

require little data preparation. Other techniques often require data normalization, dummy variables to be created and blank values to be removed.

perform classification without requiring much computation.

allow a model to be validated using statistical tests, which makes it possible to account for the reliability of the model.

are able to handle both continuous and categorical variables. Ex: relation rules can be used only with nominal variables, while neural networks can be used only with numerical variables.

provide a clear indication of which fields are most important for prediction or classification.


Weaknesses of decision trees

Decision trees are less appropriate for estimation tasks where the goal is to predict the value of a continuous attribute.

Decision-tree learners can create over-complex trees that do not generalise the data well.

Decision trees can be computationally expensive. The process of growing a decision tree is computationally expensive: at each node, each candidate splitting field must be sorted before its best split can be found. In some algorithms, combinations of fields are used and a search must be made for optimal combining weights.

There are concepts that are hard to learn because decision trees do not express them easily.
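To illustrate the sorting cost mentioned above, here is a minimal sketch of finding the best binary split for one numeric field: sort the values, then score a candidate threshold between each adjacent pair. Gini impurity and the data are illustrative assumptions, not part of the slides:

```python
def gini(labels):
    # Gini impurity: 1 - sum of squared class proportions.
    n = len(labels)
    return 1.0 - sum((labels.count(c) / n) ** 2 for c in set(labels))

def best_split(values, labels):
    """Return (threshold, weighted_gini) of the best binary split."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    xs = [values[i] for i in order]
    ys = [labels[i] for i in order]
    best = (None, float("inf"))
    for i in range(1, len(xs)):
        if xs[i] == xs[i - 1]:
            continue                  # no threshold between equal values
        threshold = (xs[i] + xs[i - 1]) / 2
        left, right = ys[:i], ys[i:]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if score < best[1]:
            best = (threshold, score)
    return best

print(best_split([1.0, 2.0, 4.0, 6.0, 7.0], ["Y", "Y", "Y", "X", "X"]))
# -> (5.0, 0.0): the split at 5.0 separates the classes perfectly
```

This sort-then-scan has to be repeated for every candidate field at every node, which is where the computational expense comes from.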


    Practical Example 1:

[Decision tree diagrams for this example, including Figure 2, were not transcribed]

In the example in Figure 2, the value for 'new product, thorough development' is:

0.4 (probability good outcome) x $1,000,000 (value) = $400,000
0.4 (probability moderate outcome) x $50,000 (value) = $20,000
0.2 (probability poor outcome) x $2,000 (value) = $400

Total: $420,400
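The same expected-value computation as a short script, with the probabilities and payoffs taken directly from the slide:

```python
# Expected value for 'new product, thorough development':
# sum of probability x payoff over all outcomes.
outcomes = [
    ("good", 0.4, 1_000_000),
    ("moderate", 0.4, 50_000),
    ("poor", 0.2, 2_000),
]

expected_value = sum(p * value for _, p, value in outcomes)
print(f"${expected_value:,.0f}")  # -> $420,400
```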


    Practical Example 2:

Assume XYZ Corporation wishes to introduce one of two products to the market this year. The probabilities and present values (PV) of projected cash inflows follow:


Practical Example 2 (contd.)


Based on the expected net present value, the company should choose product A over product B.


Queries, Comments & Suggestions?


    References

    http://dms.irb.hr/tutorial/tut_dtrees.php

    http://www.mindtools.com/dectree.html

    http://en.wikipedia.org/wiki/Decision_tree_learning

http://www.answers.com/topic/decision-tree