Data Mining Classification: Alternative Techniques
© Tan, Steinbach, Kumar, Introduction to Data Mining, 4/18/2004

Rule-Based Classifier

- Classify records by using a collection of "if...then..." rules.
- Rule: (Condition) → y
  - Condition is a conjunction of attribute tests and y is the class label.
  - LHS: rule antecedent or condition
  - RHS: rule consequent
- Examples of classification rules:
  - (Blood Type = Warm) ∧ (Lay Eggs = Yes) → Birds
  - (Taxable Income < 50K) ∧ (Refund = Yes) → Evade = No

Rule-Based Classifier (Example)

Name            Blood Type   Give Birth   Can Fly   Live in Water   Class
human           warm         yes          no        no              mammals
python          cold         no           no        no              reptiles
salmon          cold         no           no        yes             fishes
whale           warm         yes          no        yes             mammals
frog            cold         no           no        sometimes       amphibians
komodo          cold         no           no        no              reptiles
bat             warm         yes          yes       no              mammals
pigeon          warm         no           yes       no              birds
cat             warm         yes          no        no              mammals
leopard shark   cold         yes          no        yes             fishes
turtle          cold         no           no        sometimes       reptiles
penguin         warm         no           no        sometimes       birds
porcupine       warm         yes          no        no              mammals
eel             cold         no           no        yes             fishes
salamander      cold         no           no        sometimes       amphibians
gila monster    cold         no           no        no              reptiles
platypus        warm         no           no        no              mammals
owl             warm         no           yes       no              birds
dolphin         warm         yes          no        yes             mammals
eagle           warm         no           yes       no              birds

R1: (Give Birth = no) ∧ (Can Fly = yes) → Birds
R2: (Give Birth = no) ∧ (Live in Water = yes) → Fishes
R3: (Give Birth = yes) ∧ (Blood Type = warm) → Mammals
R4: (Give Birth = no) ∧ (Can Fly = no) → Reptiles
R5: (Live in Water = sometimes) → Amphibians

Application of Rule-Based Classifier

- A rule r covers an instance x if the attributes of the instance satisfy the condition of the rule.

Name           Blood Type   Give Birth   Can Fly   Live in Water   Class
hawk           warm         no           yes       no              ?
grizzly bear   warm         yes          no        no              ?

- Rule R1 covers the hawk, so it is classified as a bird.
- Rule R3 covers the grizzly bear, so it is classified as a mammal.

Rule Coverage and Accuracy

- Coverage of a rule: the fraction of all records that satisfy the antecedent of the rule.
- Accuracy of a rule: the fraction of the records satisfying the antecedent that also satisfy the consequent.

Tid   Refund   Marital Status   Taxable Income   Class
1     Yes      Single           125K             No
2     No       Married          100K             No
3     No       Single           70K              No
4     Yes      Married          120K             No
5     No       Divorced         95K              Yes
6     No       Married          60K              No
7     Yes      Divorced         220K             No
8     No       Single           85K              Yes
9     No       Married          75K              No
10    No       Single           90K              Yes

Rule: (Status = Single) → No
Coverage = 40%, Accuracy = 50%
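The coverage and accuracy definitions translate directly into code. The sketch below is a minimal illustration over the ten-record training set above; the dictionary-per-record encoding and the rule representation (a small antecedent function plus a consequent label) are illustrative choices, not something from the original slides.

```python
# Minimal sketch: rule coverage and accuracy for (Status = Single) -> No.
records = [
    {"Tid": 1,  "Refund": "Yes", "Status": "Single",   "Income": 125, "Class": "No"},
    {"Tid": 2,  "Refund": "No",  "Status": "Married",  "Income": 100, "Class": "No"},
    {"Tid": 3,  "Refund": "No",  "Status": "Single",   "Income": 70,  "Class": "No"},
    {"Tid": 4,  "Refund": "Yes", "Status": "Married",  "Income": 120, "Class": "No"},
    {"Tid": 5,  "Refund": "No",  "Status": "Divorced", "Income": 95,  "Class": "Yes"},
    {"Tid": 6,  "Refund": "No",  "Status": "Married",  "Income": 60,  "Class": "No"},
    {"Tid": 7,  "Refund": "Yes", "Status": "Divorced", "Income": 220, "Class": "No"},
    {"Tid": 8,  "Refund": "No",  "Status": "Single",   "Income": 85,  "Class": "Yes"},
    {"Tid": 9,  "Refund": "No",  "Status": "Married",  "Income": 75,  "Class": "No"},
    {"Tid": 10, "Refund": "No",  "Status": "Single",   "Income": 90,  "Class": "Yes"},
]

def antecedent(r):                 # rule antecedent: Status = Single
    return r["Status"] == "Single"

consequent = "No"                  # rule consequent

covered = [r for r in records if antecedent(r)]
coverage = len(covered) / len(records)                            # fraction satisfying the antecedent
accuracy = sum(r["Class"] == consequent for r in covered) / len(covered)

print(f"Coverage = {coverage:.0%}, Accuracy = {accuracy:.0%}")    # Coverage = 40%, Accuracy = 50%
```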
How Does a Rule-Based Classifier Work?

R1: (Give Birth = no) ∧ (Can Fly = yes) → Birds
R2: (Give Birth = no) ∧ (Live in Water = yes) → Fishes
R3: (Give Birth = yes) ∧ (Blood Type = warm) → Mammals
R4: (Give Birth = no) ∧ (Can Fly = no) → Reptiles
R5: (Live in Water = sometimes) → Amphibians

Name            Blood Type   Give Birth   Can Fly   Live in Water   Class
lemur           warm         yes          no        no              ?
turtle          cold         no           no        sometimes       ?
dogfish shark   cold         yes          no        yes             ?

- A lemur triggers rule R3, so it is classified as a mammal.
- A turtle triggers both R4 and R5.
- A dogfish shark triggers none of the rules.

Characteristics of a Rule-Based Classifier

- Mutually exclusive rules
  - The rules are independent of each other; every record is covered by at most one rule.
- Exhaustive rules
  - The rule set accounts for every possible combination of attribute values; every record is covered by at least one rule.

From Decision Trees to Rules

[Decision tree: split on Refund (Yes / No); if Refund = No, split on Marital Status ({Single, Divorced} / {Married}); if {Single, Divorced}, split on Taxable Income (< 80K / > 80K).]

Classification rules:
(Refund = Yes) → No
(Refund = No, Marital Status = {Single, Divorced}, Taxable Income < 80K) → No
(Refund = No, Marital Status = {Single, Divorced}, Taxable Income > 80K) → Yes
(Refund = No, Marital Status = {Married}) → No

- The rules are mutually exclusive and exhaustive.
- The rule set contains as much information as the tree.

Rules Can Be Simplified

Using the decision tree and training data above:

Initial rule:    (Refund = No) ∧ (Status = Married) → No
Simplified rule: (Status = Married) → No

Effect of Rule Simplification

- The rules are no longer mutually exclusive: a record may trigger more than one rule.
  - Solution: use an ordered rule set, or an unordered rule set with a voting scheme.
- The rules are no longer exhaustive: a record may not trigger any rule.
  - Solution: use a default class.

Ordered Rule Set

- Rules are rank-ordered according to their priority; an ordered rule set is also known as a decision list.
- When a test record is presented to the classifier:
  - It is assigned the class label of the highest-ranked rule it triggers.
  - If none of the rules fire, it is assigned the default class.

R1: (Give Birth = no) ∧ (Can Fly = yes) → Birds
R2: (Give Birth = no) ∧ (Live in Water = yes) → Fishes
R3: (Give Birth = yes) ∧ (Blood Type = warm) → Mammals
R4: (Give Birth = no) ∧ (Can Fly = no) → Reptiles
R5: (Live in Water = sometimes) → Amphibians

Name     Blood Type   Give Birth   Can Fly   Live in Water   Class
turtle   cold         no           no        sometimes       ?

With this ordering, the turtle is classified as a reptile: it triggers both R4 and R5, but R4 is the higher-ranked rule.
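To make the decision-list behaviour concrete, here is a minimal Python sketch of an ordered rule set. Rules R1–R5 come from the slides; the lambda-based condition encoding and the particular default class are illustrative assumptions.

```python
# Minimal sketch of an ordered rule set (decision list): each rule is a
# (condition, class) pair, rules are tried in rank order, and a default
# class is returned if no rule fires.
RULES = [
    (lambda x: x["Give Birth"] == "no"  and x["Can Fly"] == "yes",       "Birds"),       # R1
    (lambda x: x["Give Birth"] == "no"  and x["Live in Water"] == "yes", "Fishes"),      # R2
    (lambda x: x["Give Birth"] == "yes" and x["Blood Type"] == "warm",   "Mammals"),     # R3
    (lambda x: x["Give Birth"] == "no"  and x["Can Fly"] == "no",        "Reptiles"),    # R4
    (lambda x: x["Live in Water"] == "sometimes",                        "Amphibians"),  # R5
]

DEFAULT_CLASS = "Mammals"   # illustrative choice; the slides only require *some* default class

def classify(record, rules=RULES, default=DEFAULT_CLASS):
    """Return the class of the highest-ranked rule that covers the record."""
    for condition, label in rules:
        if condition(record):
            return label
    return default

turtle = {"Blood Type": "cold", "Give Birth": "no", "Can Fly": "no", "Live in Water": "sometimes"}
print(classify(turtle))   # Reptiles: R4 outranks R5, so it wins the conflict
```

Reordering RULES changes the outcome for conflicting records such as the turtle, which is exactly why rule ordering matters.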
Rule Ordering Schemes

- Rule-based ordering: individual rules are ranked according to their quality.
- Class-based ordering: rules that belong to the same class appear together.

Rule-based ordering:
(Refund = Yes) → No
(Refund = No, Marital Status = {Single, Divorced}, Taxable Income < 80K) → No
(Refund = No, Marital Status = {Single, Divorced}, Taxable Income > 80K) → Yes
(Refund = No, Marital Status = {Married}) → No

Class-based ordering (all rules with consequent No listed before the rule with consequent Yes):
(Refund = Yes) → No
(Refund = No, Marital Status = {Single, Divorced}, Taxable Income < 80K) → No
(Refund = No, Marital Status = {Married}) → No
(Refund = No, Marital Status = {Single, Divorced}, Taxable Income > 80K) → Yes

Building Classification Rules

- Direct method: extract rules directly from the data.
  - Examples: RIPPER, CN2, Holte's 1R
- Indirect method: extract rules from other classification models (e.g. decision trees, neural networks).
  - Example: C4.5rules

Direct Method: Sequential Covering

1. Start from an empty rule.
2. Grow a rule using the Learn-One-Rule function.
3. Remove the training records covered by the rule.
4. Repeat steps (2) and (3) until a stopping criterion is met.

Example of Sequential Covering

[Figure: sequential covering on a two-dimensional data set — (i) original data; (ii) step 1; (iii) step 2, after learning rule R1; (iv) step 3, after learning rules R1 and R2.]

Aspects of Sequential Covering

- Rule growing
- Instance elimination
- Rule evaluation
- Stopping criterion
- Rule pruning

Rule Growing

- Two common strategies:
  - General-to-specific: start from an empty rule and greedily add conjuncts (e.g. Refund = No, Status = Single, Status = Divorced, Status = Married, Income > 80K), keeping the one that most improves rule quality.
  - Specific-to-general: start from a rule describing a single positive instance (e.g. Refund = No, Status = Single, Income = 85K → Class = Yes) and drop conjuncts to generalize it.

[Figure: (a) general-to-specific growing from the empty rule (Yes: 3, No: 4) over candidate conjuncts; (b) specific-to-general growing from maximally specific rules such as Refund = No, Status = Single, Income = 90K → Class = Yes.]

Rule Growing (Examples)

- CN2 algorithm:
  - Start from an empty conjunct: {}
  - Add the conjunct that minimizes the entropy measure: {A}, {A, B}, ...
  - Determine the rule consequent by taking the majority class of the instances covered by the rule.
- RIPPER algorithm:
  - Start from an empty rule: {} → class
  - Add the conjunct that maximizes FOIL's information gain measure:
    R0: {} → class (initial rule)
    R1: {A} → class (rule after adding a conjunct)
    Gain(R0, R1) = t × [ log2( p1 / (p1 + n1) ) − log2( p0 / (p0 + n0) ) ]
    where
      t  = number of positive instances covered by both R0 and R1
      p0 = number of positive instances covered by R0
      n0 = number of negative instances covered by R0
      p1 = number of positive instances covered by R1
      n1 = number of negative instances covered by R1

Instance Elimination

- Why do we need to eliminate instances?
  - Otherwise, the next rule learned would be identical to the previous one.
- Why do we remove positive instances?
  - To ensure that the next rule is different.
- Why do we remove negative instances?
  - To prevent underestimating the accuracy of the next rule.
  - Compare rules R2 and R3 in the accompanying diagram.

[Figure: positive (+) and negative (−) training instances with three candidate rules R1, R2, R3.]
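Putting the last few slides together, the sketch below combines sequential covering with general-to-specific rule growing guided by FOIL's information gain, in the spirit of RIPPER; it is not the actual RIPPER implementation. The record format (dictionaries with a "Class" field), the function names, and the stopping conditions are assumptions made for illustration.

```python
# Sketch: sequential covering with FOIL-gain-guided rule growing (RIPPER-like).
import math

def counts(rule, records, target):
    """Positive/negative counts among the records covered by a conjunctive rule."""
    covered = [r for r in records if all(r[a] == v for a, v in rule)]
    p = sum(r["Class"] == target for r in covered)
    return p, len(covered) - p

def foil_gain(p0, n0, p1, n1):
    """Gain(R0, R1) = t * [log2(p1/(p1+n1)) - log2(p0/(p0+n0))].
    R1 refines R0, so t (positives covered by both rules) equals p1."""
    if p0 == 0 or p1 == 0:
        return float("-inf")
    return p1 * (math.log2(p1 / (p1 + n1)) - math.log2(p0 / (p0 + n0)))

def learn_one_rule(records, target, attributes):
    """Grow one rule general-to-specific, greedily adding the conjunct with
    the highest FOIL gain until the rule covers no negative instances."""
    rule = []                                     # empty antecedent: {} -> target
    while True:
        p0, n0 = counts(rule, records, target)
        if n0 == 0:                               # rule covers only positives
            break
        best, best_gain = None, 0.0
        for attr in attributes:
            for value in {r[attr] for r in records}:
                if (attr, value) in rule:
                    continue
                p1, n1 = counts(rule + [(attr, value)], records, target)
                gain = foil_gain(p0, n0, p1, n1)
                if gain > best_gain:
                    best, best_gain = (attr, value), gain
        if best is None:                          # no conjunct improves the rule
            break
        rule.append(best)
    return rule

def sequential_covering(records, target, attributes):
    """1. Start from an empty rule set.  2. Learn one rule.
    3. Remove the training records it covers.  4. Repeat."""
    remaining, rules = list(records), []
    while any(r["Class"] == target for r in remaining):
        rule = learn_one_rule(remaining, target, attributes)
        if not rule:                              # could not grow a useful rule
            break
        rules.append((rule, target))
        remaining = [r for r in remaining         # instance elimination
                     if not all(r[a] == v for a, v in rule)]
    return rules
```

On a dictionary encoding of the vertebrate table above, sequential_covering(records, "mammals", attributes) would grow one rule for the target class, eliminate the records it covers, and repeat, mirroring steps (1)–(4) of the sequential covering procedure.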