
Classification and Regression trees (CART)

This presentation covers Classification and Regression Trees (CART): the CART decision tree methodology, classification trees, regression trees, the differences between them, when to use CART, the advantages and limitations of CART, and what a CART is in machine learning.

For more topics, stay tuned with Learnbay.


Presentation Transcript


  1. Classification and Regression trees (CART)

  2. CART decision tree methodology Decision trees are commonly used in data mining with the objective of creating a model that predicts the value of a target (or dependent) variable based on the values of several input (or independent) variables. In this post, we discuss the CART decision tree methodology. The CART, or Classification & Regression Trees, methodology was introduced in 1984 by Leo Breiman, Jerome Friedman, Richard Olshen and Charles Stone as an umbrella term for the following types of decision trees.

  3. Classification Trees A classification tree is an algorithm in which the target variable is fixed or categorical. The algorithm is used to identify the "class" into which the target variable would most likely fall.
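Below is a minimal sketch of a classification tree using scikit-learn's DecisionTreeClassifier, which implements a CART-style algorithm; the bundled iris dataset and the hyperparameter values are illustrative choices, not part of the original slide.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Categorical target: three iris species encoded as 0, 1, 2.
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# A shallow tree keeps the example readable; max_depth=3 is an arbitrary choice.
clf = DecisionTreeClassifier(criterion="gini", max_depth=3, random_state=0)
clf.fit(X_train, y_train)

# The tree assigns each sample to the "class" it most likely falls into.
print("Predicted class for first test sample:", clf.predict(X_test[:1])[0])
print("Test accuracy:", clf.score(X_test, y_test))
```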

  4. Regression Trees A regression tree refers to an algorithm in which the target variable is continuous and the algorithm is used to predict its value. As an example of a regression-type problem, you may want to predict the selling price of a residential house, which is a continuous dependent variable.
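Here is a similar sketch for a regression tree, using scikit-learn's DecisionTreeRegressor. The "house price" features and the synthetic data are entirely made up for illustration; only the idea of predicting a continuous selling price comes from the slide.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
n = 500
# Hypothetical features: [square_feet, num_bedrooms, age_years]
X = np.column_stack([
    rng.uniform(500, 3500, n),
    rng.integers(1, 6, n),
    rng.uniform(0, 50, n),
])
# Continuous target: a made-up price formula plus noise.
y = 100 * X[:, 0] + 20_000 * X[:, 1] - 1_000 * X[:, 2] + rng.normal(0, 10_000, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

reg = DecisionTreeRegressor(max_depth=4, random_state=0)
reg.fit(X_train, y_train)

# The tree predicts a numeric value rather than a class label.
print("Predicted selling price for first test house:", reg.predict(X_test[:1])[0])
print("Test R^2:", reg.score(X_test, y_test))
```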

  5. Differences in CART CART decision trees are easily understood, and there are several classification and regression tree presentations available to make things even simpler. However, it is important to understand that there are some fundamental differences between classification and regression trees.

  6. When to use CART? Classification trees are used when the dataset needs to be split into classes that belong to the response variable. In many cases, the classes are simply "Yes" or "No": there are only two of them and they are mutually exclusive. In some cases, there may be more than two classes, in which case a variant of the classification tree algorithm is used.

  7. Regression trees, on the other hand, are used when the response variable is continuous. For instance, if the response variable is something like the price of a property or the temperature of the day, a regression tree is used. In other words, regression trees are used for prediction-type problems while classification trees are used for classification-type problems.

  8. Advantages of CART
  - The results are simple to interpret.
  - Classification and regression trees are nonparametric and nonlinear.
  - Classification and regression trees implicitly perform feature selection.
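The implicit feature selection can be seen by inspecting a fitted tree's feature importances, as in this short sketch; scikit-learn and its bundled wine dataset are used purely for illustration.

```python
from sklearn.datasets import load_wine
from sklearn.tree import DecisionTreeClassifier

data = load_wine()
clf = DecisionTreeClassifier(random_state=0).fit(data.data, data.target)

# Features the tree never splits on get an importance of 0 -- they are
# effectively dropped without any separate feature-selection step.
for name, importance in zip(data.feature_names, clf.feature_importances_):
    print(f"{name:30s} {importance:.3f}")
```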

  9. Limitations of CART
  - Overfitting
  - High variance
  - Low bias
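A quick sketch of the overfitting problem and one common remedy: an unconstrained tree memorises the training data (low bias, high variance), while limiting depth or applying cost-complexity pruning usually generalises better. The dataset and parameter values below are illustrative assumptions, not prescriptions.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Unconstrained tree: fits the training set almost perfectly (overfitting).
full = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

# Constrained tree: limited depth plus cost-complexity pruning lowers variance.
pruned = DecisionTreeClassifier(max_depth=3, ccp_alpha=0.01,
                                random_state=0).fit(X_train, y_train)

print("full tree   train/test accuracy:",
      full.score(X_train, y_train), full.score(X_test, y_test))
print("pruned tree train/test accuracy:",
      pruned.score(X_train, y_train), pruned.score(X_test, y_test))
```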

  10. What is a CART in Machine Learning? A Classification and Regression Tree (CART) is a predictive algorithm used in machine learning. It explains how a target variable's values can be predicted from the values of other variables. It is a decision tree in which each fork is a split on a predictor variable and each leaf node holds a prediction for the target variable. The CART algorithm is an important decision tree algorithm that lies at the foundation of machine learning. Moreover, it is also the basis for other powerful machine learning algorithms like bagged decision trees, random forests, and boosted decision trees.
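Since the slide notes that CART underlies bagged trees, random forests and boosted trees, here is a hedged sketch comparing a single CART tree with those ensembles in scikit-learn; the dataset, estimator counts and cross-validation settings are arbitrary choices for illustration.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import (BaggingClassifier, GradientBoostingClassifier,
                              RandomForestClassifier)
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)

# Each ensemble below builds on CART-style trees as its base learner.
models = {
    "single CART tree": DecisionTreeClassifier(random_state=0),
    "bagged trees": BaggingClassifier(DecisionTreeClassifier(),
                                      n_estimators=100, random_state=0),
    "random forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "boosted trees": GradientBoostingClassifier(random_state=0),
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name:18s} mean CV accuracy = {scores.mean():.3f}")
```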

  11. Topics for the next post
  - Discriminant Analysis
  - Factor Analysis
  - Linear Regression
  Stay tuned with Learnbay.
