Two different boosting algorithms are investigated: AdaBoost [3] and AdaPlusBoost, a novel hybrid of AdaBoost and RealBoost [2]. RealBoost differs from AdaBoost in two distinct ways. First, its weak classifiers no longer return a binary classification but instead a likelihood ratio estimated from the training examples. Second, weak classifiers are selected not on their performance in isolation but on the performance of the strong classifier that would be created were they added to the current solution. While RealBoost offers reduced classifier sizes with fewer redundant weak classifiers, it requires a more complete data set than AdaBoost in order to build up the probability distributions, which are represented as histograms. This can be partially overcome using Parzen windowing; however, that is only a solution for a near-complete data set. AdaPlusBoost was devised to combine some of the advantages offered by RealBoost with AdaBoost's robustness to a sparse training set. It uses the same weak classifiers as AdaBoost but takes from RealBoost the idea that the best weak classifier to choose is the best in combination rather than the best in isolation. This results in a stronger classifier that can nevertheless be trained on relatively small data sets.
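The key difference in AdaPlusBoost's selection step can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the decision-stump weak classifiers, the candidate parameterisation, and all function names are assumptions. Weak classifiers and weight updates follow standard AdaBoost; only the selection criterion follows the "best in combination" idea, scoring each candidate by the training error of the strong classifier that would result from adding it.

```python
import numpy as np

def stump_predict(X, feature, threshold, polarity):
    # Decision stump: +/-1 depending on which side of the threshold
    # the chosen feature falls (polarity flips the sign convention).
    return polarity * np.where(X[:, feature] <= threshold, 1, -1)

def adaplusboost(X, y, candidates, rounds=5):
    """Simplified AdaPlusBoost-style training loop (illustrative sketch).

    X: (n, d) feature matrix; y: labels in {-1, +1};
    candidates: iterable of (feature, threshold, polarity) stumps.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)   # AdaBoost-style example weights
    F = np.zeros(n)           # running strong-classifier margin
    ensemble = []
    for _ in range(rounds):
        best = None
        for feature, threshold, polarity in candidates:
            h = stump_predict(X, feature, threshold, polarity)
            # Weighted error and vote weight, as in plain AdaBoost.
            err = np.clip(np.sum(w[h != y]), 1e-10, 1 - 1e-10)
            alpha = 0.5 * np.log((1 - err) / err)
            # RealBoost-inspired selection "in combination": score the
            # candidate by the error of the strong classifier that
            # would result from adding it, not by err alone.
            strong_err = np.mean(np.sign(F + alpha * h) != y)
            if best is None or strong_err < best[0]:
                best = (strong_err, alpha, h, (feature, threshold, polarity))
        _, alpha, h, params = best
        F += alpha * h
        ensemble.append((alpha, params))
        # Standard AdaBoost reweighting toward misclassified examples.
        w *= np.exp(-alpha * y * h)
        w /= w.sum()
    return ensemble, np.sign(F)
```

Because each candidate is scored against the current ensemble, redundant stumps that repeat what the strong classifier already captures are passed over, which is the mechanism behind the smaller classifiers claimed above.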