The random forest algorithm, like gradient boosted regression trees, belongs
to the ensemble learning methods. This classifier uses an ensemble of weak
regression trees that have low predictive accuracy when considered in isolation.
The quality of the prediction can be improved significantly when many trees
are trained with different parameters or samples. The results of the individual
trees are aggregated into an overall result, which then enables a more balanced
and higher-quality prediction. This so-called bagging revived interest in
traditional regression trees. For the aggregation, either a majority vote or a
probability function is chosen (Fig. 5.5).
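The effect of majority-vote aggregation can be illustrated with a minimal sketch. It does not train actual trees; instead it makes the idealising assumption that each weak learner answers correctly with 70% probability, independently of the others, and compares a single learner against a majority vote over 101 of them:

```python
import random

random.seed(0)

N_SAMPLES = 1000   # number of hypothetical test cases
N_TREES = 101      # odd count of weak learners, so the majority is unambiguous
P_CORRECT = 0.7    # assumed accuracy of each weak tree in isolation


def weak_vote():
    # Each weak learner votes correctly with probability P_CORRECT,
    # independently of the others (an idealising assumption).
    return random.random() < P_CORRECT


single_correct = 0
ensemble_correct = 0
for _ in range(N_SAMPLES):
    votes = [weak_vote() for _ in range(N_TREES)]
    single_correct += votes[0]                     # one tree on its own
    ensemble_correct += sum(votes) > N_TREES // 2  # majority vote of all trees

print(f"single tree accuracy: {single_correct / N_SAMPLES:.2f}")
print(f"ensemble accuracy:    {ensemble_correct / N_SAMPLES:.2f}")
```

Under these assumptions the single tree lands near 70% accuracy while the majority vote is correct almost every time, which is the intuition behind bagging: individually weak, weakly correlated trees combine into a strong predictor.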
Lead prediction generates high-conversion leads because
• The entire spectrum of information available about a company is inte-
grated into the decision-making;
• The data is up to date and free of bias;
• The random forest is capable of abstracting complex correlations in the
data; and
• The method learns iteratively from the interaction with the sales team.
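The probability-function aggregation mentioned above can also be used to rank leads by their predicted conversion chance. The following sketch is purely illustrative: the company names are hypothetical, and the per-tree probabilities are assumed to come from an already trained forest.

```python
# Hypothetical conversion probabilities produced by three trees per lead
tree_scores = {
    "Acme GmbH": [0.82, 0.75, 0.90],
    "Beta AG":   [0.35, 0.40, 0.30],
    "Gamma SE":  [0.60, 0.55, 0.70],
}

# Soft-voting aggregation: average the trees' probabilities per lead
lead_scores = {lead: sum(p) / len(p) for lead, p in tree_scores.items()}

# Prioritise leads by predicted conversion probability, highest first
ranked = sorted(lead_scores, key=lead_scores.get, reverse=True)
print(ranked)
```

A sales team would then address the top of this ranking first; feedback on which leads actually converted can flow back into the next training round.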
The choice of leads is the first step in the sales process; the second one is to
find the ideal point in time for addressing them.