Decision Tree Flavors: Gini Index and Information Gain

Summary: The Gini Index is calculated by subtracting the sum of the squared probabilities of each class from one. It favors larger partitions. Information Gain is based on entropy, which sums, for each class, the probability of that class multiplied by the log (base 2) of that probability and then negates the result; the gain is the drop in entropy produced by a split. Information Gain favors smaller partitions with many distinct values. Ultimately, you have to experiment with your data […]
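
As a rough illustration (not code from the post), the two node-impurity measures the summary describes can be computed from class counts like this, assuming `labels` holds the class labels of the examples at a node:

```python
# Minimal sketch of the two impurity measures, assuming `labels`
# is a list of class labels for the examples at a tree node.
from collections import Counter
from math import log2

def gini(labels):
    """Gini impurity: 1 minus the sum of squared class probabilities."""
    n = len(labels)
    return 1.0 - sum((count / n) ** 2 for count in Counter(labels).values())

def entropy(labels):
    """Entropy: negative sum of p * log2(p) over the classes."""
    n = len(labels)
    return -sum((count / n) * log2(count / n) for count in Counter(labels).values())

print(gini(["a", "a", "b", "b"]))     # 0.5  (two classes, evenly mixed)
print(entropy(["a", "a", "b", "b"]))  # 1.0
```

Both measures are zero for a pure node and largest when the classes are evenly mixed.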

Information Gain would select the Number of Images variable, while the Gini Index would select the more compact Average Token Length.
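
To make that comparison concrete, a hypothetical split-evaluation helper (again, not from the post) can reuse the `gini` and `entropy` sketches above: a variable is chosen by whichever candidate split gives the largest weighted drop in impurity, where that impurity is entropy for Information Gain or the Gini Index for the Gini criterion.

```python
# Hypothetical helper: impurity at the parent minus the size-weighted
# impurity of the child partitions produced by a candidate split.
def impurity_decrease(parent_labels, child_partitions, impurity):
    n = len(parent_labels)
    weighted_children = sum(
        len(child) / n * impurity(child) for child in child_partitions
    )
    return impurity(parent_labels) - weighted_children

parent = ["a", "a", "a", "b", "b", "b"]
split = [["a", "a", "a"], ["b", "b", "b"]]        # a perfectly separating split
print(impurity_decrease(parent, split, entropy))  # 1.0 bit of Information Gain
print(impurity_decrease(parent, split, gini))     # 0.5 drop in Gini impurity
```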