Summary

Dr. Chirag Shah, PhD, explains in detail how data is classified with decision trees to make predictions. He also presents the limitations of the model and the random forest method as a way to overcome them.

Chapters
Chapter 1: What is a Decision Tree?
Chapter 2: Parts of a Decision Tree
Chapter 3: Splitting the Data to Create a Decision Tree
Chapter 4: Entropy and Probability
Chapter 5: Calculating Entropy Corresponding to Probability
Chapter 6: Entropy and Information Gain
Chapter 7: Sample Problem: Weather Outlook and Playing Golf
Chapter 8: Using the ID3 Algorithm to Create a Decision Tree, Step 1: Determine the Entropy of the Target
Chapter 9: Using the ID3 Algorithm to Create a Decision Tree, Step 2: Determine the Information Gain for Each Attribute
Chapter 10: Using the ID3 Algorithm to Create a Decision Tree, Step 3: Determine the Attribute with the Greatest Information Gain
Chapter 11: Using the ID3 Algorithm to Create a Decision Tree, Step 4: Repeat the Process until Entropy is Zero
Chapter 12: Tree Structure Determined by Decision Rules
Chapter 13: Shortcomings of the Decision Tree Model: Overfitting the Data and Shortsightedness (Only Looking at the Next Possible Level)
Chapter 14: Addressing the Shortcomings: Pruning the Branches
Chapter 15: Addressing the Shortcomings: Random Forest—Building Trees from Randomly Selected Data Points
Chapter 16: Summary
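The entropy and information-gain calculations behind the ID3 steps in Chapters 8 through 10 can be sketched in a few lines of Python. This is a minimal sketch, not the video's own code; it assumes the classic 14-row "Play Golf" weather dataset from the sample problem in Chapter 7, reduced here to the outlook attribute for brevity.

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels (Chapters 4-5)."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, attr, target):
    """Entropy of the target minus the weighted entropy after
    splitting on attr (Chapters 6 and 9)."""
    n = len(rows)
    base = entropy([r[target] for r in rows])
    remainder = 0.0
    for value in set(r[attr] for r in rows):
        subset = [r[target] for r in rows if r[attr] == value]
        remainder += len(subset) / n * entropy(subset)
    return base - remainder

# The classic 14-row Play Golf dataset (outlook attribute only).
data = [
    {"outlook": o, "play": p}
    for o, p in [
        ("sunny", "no"), ("sunny", "no"), ("overcast", "yes"), ("rain", "yes"),
        ("rain", "yes"), ("rain", "no"), ("overcast", "yes"), ("sunny", "no"),
        ("sunny", "yes"), ("rain", "yes"), ("sunny", "yes"), ("overcast", "yes"),
        ("overcast", "yes"), ("rain", "no"),
    ]
]

# Step 1: entropy of the target (9 yes, 5 no) -> about 0.940
print(f"{entropy([r['play'] for r in data]):.3f}")
# Step 2: information gain for outlook -> about 0.247
print(f"{information_gain(data, 'outlook', 'play'):.3f}")
```

ID3 then picks the attribute with the greatest gain (Step 3) and repeats on each branch until entropy reaches zero (Step 4).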