1
Chapter 18 Learning With Decision Trees
  • Lecture By
  • Ben Ulfers

2
Why do we want to learn?
  • Unpredicted or Changing Environments
  • A learning agent may be able to deal with
    environments its creator did not predict
  • Efficiency
  • Reduce the amount of redundant work
  • Example
  • Encounter an event E
  • Explore the outcomes of multiple actions
  • Discover action B to be the best
  • Remember: if event E occurs, do action B
  • Quicker than going through all possible actions
    again

3
Measurements of Learning
  • Generalization
  • Can what we learn help us deal with a diverse set
    of events?
  • How well does it deal with future problems?
    We should assess issues such as training time
    and quality of performance
  • Speed Up
  • Does the speed of the operation increase with
    repetition?

4
Different Forms Of Learning
  • Category Learning
  • Contingency Learning
  • Association Learning
  • Language Learning
  • Skill Learning
  • Concept Learning
  • Theory Learning

5
Category Learning
  • Given a series of objects, determine what the
    concept or category is
  • Example: Can of pop, bottle of water, cup of
    coffee
  • Example: Chair, TV, keys, car, book, headphones,
    extension cord

6
Examples
  • To learn we must have data
  • Data must be related to the concept we are
    learning
  • The data takes the form of instances called
    examples
  • Two types of examples
  • Positive Examples
  • Members of the category
  • Negative Examples
  • Not Members of the category

7
Example Features
  • Examples have features
  • An example of an example: a Chair
  • Has 4 legs
  • Has arms
  • Has a back
  • Has Wheels
  • Is Blue

8
Types of Learning
  • Supervised Learning
  • Unsupervised Learning
  • Reinforcement Learning

9
Supervised Learning
  • Learning from examples which include both the
    input and output
  • Analogous to having a teacher
  • Has a training and testing/execution phase
  • Only the training phase is supervised

10
Unsupervised Learning
  • Learning from examples consisting only of input
  • No feedback from the environment

11
Reinforcement Learning
  • Examples still include only input, but produce
    reinforcement
  • Carrot and Stick
  • Put a cookie in my mouth -> Yum
  • Grab a red-hot iron -> Ouch!

12
Online vs Offline
  • Online
  • Uses environmental feedback
  • Active learning agents learn as they go
  • Outputs produce New Inputs
  • Offline
  • Batch learning
  • No environmental feedback
  • Online learning requires a real-time algorithm

13
Inductive Learning of functions
  • Given examples of a function f, in the form of
    (x, f(x)) pairs, find a function h that
    approximates f
  • h is our hypothesis; we must choose a hypothesis
    space HS, for example all polynomials of degree
    less than k
  • A consistent hypothesis agrees with all of the
    data
  • What about multiple consistent hypotheses?
  • Ockham's razor: prefer the simplest consistent
    hypothesis (see the sketch below)

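A minimal sketch of this selection principle in Python, assuming NumPy
and a tolerance-based notion of "consistent" (both are additions, not
from the slides):

    import numpy as np

    def simplest_consistent_poly(xs, ys, k=10, tol=1e-6):
        """Return the lowest-degree polynomial (degree < k) that agrees
        with every (x, f(x)) example, or None if HS is unrealizable."""
        for degree in range(k):                  # simplest hypotheses first
            coeffs = np.polyfit(xs, ys, degree)  # least-squares fit in HS
            if np.max(np.abs(np.polyval(coeffs, xs) - ys)) < tol:
                return coeffs                    # consistent: Ockham picks this
        return None                              # no consistent h in HS

    xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
    print(simplest_consistent_poly(xs, xs ** 2))  # ~[1, 0, 0], i.e. h(x) = x^2
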
14
Inductive Learning of functions
  • A learning problem is unrealizable if HS does not
    contain the true function
  • Ex. Sinusoidal functions in a finite polynomial
    space

15
Decision Trees
  • Takes a set of examples with features as input
  • Goal
  • Formulate a rule to classify the examples based
    on their features
  • Rule generates an answer
  • Rough Algorithm (a code sketch follows this list)
  • Decide on the category we want to learn
  • Select a set of relevant attributes from examples
  • Generate Decision Tree
  • Tree gives you rules to classify other examples

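A minimal sketch of this rough algorithm (the dict-based tree
representation and helper names are mine, not from the slides):

    from collections import Counter

    def build_tree(examples, features, label):
        """Recursively split examples on features, in the given order,
        until every leaf holds a single answer."""
        labels = [e[label] for e in examples]
        if len(set(labels)) == 1:              # pure: this branch is decided
            return labels[0]
        if not features:                       # out of features: majority vote
            return Counter(labels).most_common(1)[0][0]
        feature, rest = features[0], features[1:]
        node = {}
        for value in sorted({e[feature] for e in examples}):
            subset = [e for e in examples if e[feature] == value]
            node[(feature, value)] = build_tree(subset, rest, label)
        return node

Passing features as a fixed list reproduces the "arbitrary order" tried
on the later slides; a better learner chooses the most informative
feature at each node instead.
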
16
Recycling Bin Problem
  • Category: Rooms with recycling bins
  • Examples: Rooms
  • Attributes (encoded in the sketch after this
    list)
  • Floor (1st, 2nd, 3rd)
  • Status of occupant (student, faculty, staff)
  • Department (cs, ee)
  • Size (small, medium, large)
  • Recycling Bin (yes, no)

17
Data
(Table of the example rooms and their attribute values; shown as an
image in the original slides and not transcribed.)
18
Building A Decision Tree
  • Tree nodes correspond to features
  • E.g. floor, size, department, etc.
  • Edges correspond to values of features
  • E.g. Floor: 1, 2, 3
  • Which feature should be our root? Which feature
    should be next? (an information-gain criterion is
    sketched after this list)
  • First, try an arbitrary order:
  • 1) Size 2) Department 3) Status 4) Floor
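
The deck does not cover it, but a standard answer to the root question,
used by ID3-style learners, is to pick the feature with the highest
information gain; a sketch under that assumption:

    import math
    from collections import Counter

    def entropy(labels):
        """Shannon entropy of a list of class labels, in bits."""
        n = len(labels)
        return -sum(c / n * math.log2(c / n)
                    for c in Counter(labels).values())

    def information_gain(examples, feature, label):
        """Expected reduction in entropy from splitting on feature."""
        before = entropy([e[label] for e in examples])
        after = 0.0
        for value in {e[feature] for e in examples}:
            subset = [e[label] for e in examples if e[feature] == value]
            after += len(subset) / len(examples) * entropy(subset)
        return before - after

    # Root choice, with rooms as the example list from the Data slide:
    # max(ATTRIBUTES, key=lambda f: information_gain(rooms, f, LABEL))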

19
Solution
(Tree diagram: Size at the root; its three branches partition the rooms
into {307 No, 415 Yes, 517 Yes}, {309 No, 316 Yes}, and
{408 Yes, 509 No, 420 No}.)
20
Solution
(Tree diagram: a Dept node added under each Size branch; the resulting
groups are {309 No, 316 Yes}, {517 Yes}, {307 No, 415 Yes}, and
{408 Yes, 509 No, 420 No}.)
21
Solution
(Tree diagram: Status nodes added under the still-mixed Dept branches;
every leaf is now pure: 517 Yes; 316 Yes; 307 No; 415 Yes; 309 No;
408 Yes; 509 No and 420 No.)
22
Decision Tree Rules
  • What set of rules did this tree produce?
  • Should we create a Positive or Negative set of
    rules?
  • Positive Set (What is)
  • All Students in the EE department with Small
    offices have bins
  • All Faculty in CS with Medium offices have bins
  • All Students in EE with Large offices have bins
  • Everyone in CS with Large offices has bins
  • Negative Set (What isn't)
  • No Staff in EE with Small Offices get bins
  • No Staff in CS with Medium Offices get bins
  • No Faculty in EE with Large Offices get bins
  • Which is better?

23
Decision Tree Rules
  • It may be possible to come up with more compact
    rule sets
  • All Students have bins
  • All CS Faculty have bins
  • How do we generate simpler rules? (a
    rule-extraction sketch follows)
  • Related to tree size
  • Generally, we want more compact trees
  • Are simple rules always better?
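
A minimal sketch of reading rules off the tree, assuming the
nested-dict representation from the build_tree sketch earlier:

    def extract_rules(tree, conditions=()):
        """Yield one (conditions, answer) rule per root-to-leaf path."""
        if not isinstance(tree, dict):         # leaf: its path is a rule
            yield conditions, tree
            return
        for (feature, value), subtree in tree.items():
            yield from extract_rules(subtree,
                                     conditions + ((feature, value),))

    # Positive set: rules whose answer is "yes"; negative set: "no".

Shorter paths give simpler rules, which is why more compact trees
generally yield more compact rule sets.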