[Interactive demo: sliders control the number of trees and each tree's accuracy.]

How majority voting and well-placed randomness can enhance the decision tree model.



A Smart Idea from 1785

In 1785, the Marquis de Condorcet, a French mathematician and philosopher, had an idea about groups making decisions. He argued that if each voter is right more than half the time, the group's majority choice is likely to be correct. This idea helps in machine learning too. Let's see: imagine one decision tree that's 60% accurate.

This is called Condorcet's Jury Theorem. It says that if each voter is right more than 50% of the time and the voters decide independently, adding more voters makes the group's decision better. So, if we add two more trees (each 60% accurate) and let them vote, the group's accuracy rises to about 65%!
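Where does that 65% come from? With three independent trees, the majority is correct whenever at least two of them are. A minimal sketch of that calculation (the 60% per-tree accuracy is the running example's assumption):

```python
# Probability that a majority of 3 independent trees is correct,
# given each tree alone is right with probability p = 0.6.
p = 0.6

# Majority means 2 of 3 or 3 of 3 correct:
#  - exactly two right: 3 ways, each with probability p^2 * (1 - p)
#  - all three right:   probability p^3
majority_acc = 3 * p**2 * (1 - p) + p**3
print(round(majority_acc, 3))  # → 0.648, i.e. about 65%
```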

The theorem shows that adding more models can make the group's prediction better. With 11 models, accuracy climbs to about 75%! But watch out: if all the models make the same mistakes, the group will still be wrong. So we need diverse models, ones that err on different inputs.
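The same counting argument generalizes to any odd number of independent voters via the binomial distribution. A sketch using only the standard library (the function name is mine):

```python
from math import comb

def majority_accuracy(n, p):
    """Probability that a majority of n independent voters,
    each correct with probability p, votes correctly (n odd)."""
    need = n // 2 + 1  # smallest number of correct votes that wins
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(need, n + 1))

for n in (1, 3, 11):
    print(n, round(majority_accuracy(n, 0.6), 3))
# 1 tree: 0.6, 3 trees: ~0.648, 11 trees: ~0.753
```

Note that this formula assumes the voters' errors are independent; that independence is exactly the "diverse models" requirement.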

We can keep adding more models. Try the sliders! The top one changes the number of trees. The bottom one changes each tree's accuracy. In machine learning, when many models team up to make a prediction, it's called ensemble learning. This is how random forests and other ensemble methods work.
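The diversity caveat can be illustrated with a small simulation: independent trees that make different mistakes beat clones that all fail on the same inputs. This is a sketch of my own, using the article's running numbers (60% per-tree accuracy, 11 trees):

```python
import random

random.seed(0)
p, n_trees, trials = 0.6, 11, 100_000

independent = correlated = 0
for _ in range(trials):
    # Diverse ensemble: each tree errs on its own random inputs,
    # so the majority vote cancels out individual mistakes.
    votes = sum(random.random() < p for _ in range(n_trees))
    independent += votes > n_trees // 2

    # Clone ensemble: all trees share the same mistakes,
    # so the vote is no better than a single tree.
    correlated += random.random() < p

print(f"independent: {independent / trials:.3f}")  # close to 0.75
print(f"clones:      {correlated / trials:.3f}")   # stuck near 0.60
```

Random forests fight this correlation deliberately: each tree sees a different bootstrap sample of the data and a random subset of features, which is the "well-placed randomness" in the subtitle.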