Improved Threshold Selection by Using Calibrated Probabilities for Random Forest Classifiers

Florian Baumann, Jinghui Chen, Karsten Vogt, Bodo Rosenhahn

Research output: Chapter in Book/Report/Conference proceeding (Conference contribution)



Random Forest is a well-known ensemble learning method that achieves high recognition accuracy while preserving a fast training procedure. A Random Forest classifier is constructed from several decision trees arranged in a forest, and a majority vote over the trees yields the final decision. To split a node of a decision tree into two children, several candidate variables are selected at random and a splitting criterion is computed for each of them. From this pool of candidate splits, the Random Forest algorithm selects the best variable according to the splitting criterion. This split is often unreliable, which reduces recognition accuracy. In this paper, we propose an additional condition for selecting the best variable that improves recognition accuracy, especially for a smaller number of trees. We enhance the standard threshold selection with a quality estimate computed by a probability calibration method. The proposed method is evaluated on machine learning datasets as well as object recognition datasets.
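The split-selection step described above can be sketched as follows. This is a minimal illustration of the standard impurity-based split search that the paper builds on, not the authors' method: a random subset of features is drawn at each node and the (feature, threshold) pair minimizing the weighted child Gini impurity is chosen. The abstract does not specify the calibrated-probability quality condition, so a comment only marks where such an extra check would apply; the names `gini` and `best_split` are illustrative.

```python
import numpy as np

def gini(labels):
    """Gini impurity of an array of class labels."""
    if len(labels) == 0:
        return 0.0
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return 1.0 - np.sum(p ** 2)

def best_split(X, y, n_candidates=3, rng=None):
    """Standard Random Forest node split: among a random subset of
    features, pick the (feature, threshold) pair that minimizes the
    weighted Gini impurity of the two children."""
    rng = np.random.default_rng(rng)
    n_features = X.shape[1]
    features = rng.choice(n_features, size=min(n_candidates, n_features),
                          replace=False)
    best, best_score = None, np.inf
    for f in features:
        for t in np.unique(X[:, f]):
            left = y[X[:, f] <= t]
            right = y[X[:, f] > t]
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
            # The paper proposes an additional condition here: a quality
            # estimate of the split derived from calibrated probabilities
            # would also have to be satisfied before accepting it.
            if score < best_score:
                best_score, best = score, (f, t)
    return best, best_score
```

On a linearly separable toy set, e.g. `X = [[0],[1],[2],[10],[11],[12]]` with labels `[0,0,0,1,1,1]`, the search returns the threshold between the two classes with impurity 0.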

Original language: English (US)
Title of host publication: Proceedings - 2015 12th Conference on Computer and Robot Vision, CRV 2015
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 6
ISBN (Electronic): 9781479919864
State: Published - Jul 14 2015
Event: 12th Conference on Computer and Robot Vision, CRV 2015 - Halifax, Canada
Duration: Jun 3 2015 - Jun 5 2015

Publication series

Name: Proceedings - 2015 12th Conference on Computer and Robot Vision, CRV 2015


Conference: 12th Conference on Computer and Robot Vision, CRV 2015

All Science Journal Classification (ASJC) codes

  • Computer Vision and Pattern Recognition
  • Computer Science Applications
