
Decision Tree with Gini Index as Impurity Measure

source link: https://www.geeksforgeeks.org/videos/decision-tree-with-ginni-index-as-impurity-measure/
04/05/2022

Machine learning is the study of algorithms that improve automatically through experience, using training data. Before a machine learning model is built from that data, the data should be cleaned and properly labeled. In this video, we will discuss the Decision Tree with the Gini Impurity measure.

A decision tree is a supervised learning method used for both classification and regression problems. Generally, we use a tree representation to solve a given problem: each leaf node corresponds to a class label, and attributes are represented at the internal nodes of the tree.

Many classical machine learning algorithms are based on decision trees, such as Random Forests, Bagging, and Boosted Decision Trees. Decision trees are now widely used in machine learning for predictive modeling, including both classification and regression.
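As a quick illustration of a decision tree in practice, here is a minimal sketch using scikit-learn's DecisionTreeClassifier with the Gini criterion; the dataset (iris) and hyperparameters (max_depth=3) are illustrative assumptions, not taken from the video.

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Illustrative dataset choice: scikit-learn's built-in iris data
X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# criterion="gini" makes the tree split nodes by Gini impurity
clf = DecisionTreeClassifier(criterion="gini", max_depth=3, random_state=42)
clf.fit(X_train, y_train)
print("Test accuracy:", clf.score(X_test, y_test))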

To estimate the impurity at a decision node, we have two common measures: one is Entropy and the second is Gini Impurity.

Entropy - It is a measure of randomness from information theory that quantifies the impurity in a group of observations: Entropy = -Σ p_i log2(p_i), where p_i is the proportion of observations belonging to class i. It also guides the decision tree in choosing how to split the data so we can reach pure leaf nodes. We will work through the formula with an example in the video.
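As a rough sketch of how that formula translates to code (the 9-versus-5 class counts below are an assumption for illustration, not taken from the video):

import numpy as np

def entropy(labels):
    # Shannon entropy: -sum(p_i * log2(p_i)) over the class proportions
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

# Hypothetical node with 9 samples of one class and 5 of the other
labels = np.array([1] * 9 + [0] * 5)
print(round(entropy(labels), 3))  # 0.94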

Information Gain - It helps determine the order of attributes at the nodes of a decision tree. Information gain measures the change in entropy caused by a split: the entropy of the parent node minus the size-weighted entropy of its children.

Gini Impurity - It is used when building decision trees to determine how the features of a dataset should split nodes to form the tree: Gini = 1 - Σ p_i^2, where p_i is again the proportion of class i. A node whose samples all belong to one class has a Gini impurity of 0.
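Continuing the sketch above, here is how both quantities can be computed for a node; the sample counts and the split point are again assumptions for illustration:

import numpy as np

def entropy(labels):
    # Shannon entropy: -sum(p_i * log2(p_i)) over the class proportions
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def gini(labels):
    # Gini impurity: 1 - sum(p_i^2) over the class proportions
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def information_gain(parent, left, right):
    # Entropy of the parent minus the size-weighted entropy of the children
    n = len(parent)
    weighted = (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)
    return entropy(parent) - weighted

parent = np.array([1] * 9 + [0] * 5)
left, right = parent[:8], parent[8:]  # hypothetical split of the node
print(round(gini(parent), 3))                           # 0.459
print(round(information_gain(parent, left, right), 3))  # 0.662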

