In information theory, entropy is a measure of the impurity, uncertainty, or randomness in a dataset. In a dataset with binary classes, where the target variable can take only two possible values, the entropy lies between 0 and 1, inclusive. The higher the entropy, the more impure the dataset.
\(E(X) = - \sum_{i=1}^{n} p_i \log_2(p_i)\)

Where:

- \(p_i\) is the proportion (probability) of instances belonging to class \(i\)
- \(n\) is the number of classes
*Figure: the binary entropy function.*
Entropy calculator example: for a dataset split evenly between two classes, the entropy reaches its maximum value of 1:

| Quantity | Value |
|---|---|
| p(Class 1) | 0.50 |
| p(Class 2) | 0.50 |
| E(Attribute) | 1.00 |
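The formula above can be sketched in a few lines of Python (a minimal helper written for this example, not taken from the source; the function name `entropy` is an assumption):

```python
import math

def entropy(counts):
    """Shannon entropy in bits: E(X) = -sum(p_i * log2(p_i)),
    where p_i is the fraction of instances in class i.
    Written as log2(total/c) = -log2(c/total) to keep the result non-negative."""
    total = sum(counts)
    return sum((c / total) * math.log2(total / c) for c in counts if c > 0)

# Two classes with equal counts -> p = 0.50 each -> E = 1.00 (maximum impurity)
print(entropy([10, 10]))  # 1.0

# A pure dataset (all instances in one class) -> E = 0
print(entropy([20]))  # 0.0
```

Classes with zero instances are skipped, since \(p \log_2(p) \to 0\) as \(p \to 0\) and `log2(0)` would otherwise raise an error.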