Study of ensemble learning with AdaBoost and NEAT
Abstract
Neural networks, ensemble algorithms, neuroevolution, and genetic algorithms have all shown the ability to solve classification problems across many areas of application. One problem in training neural networks is the significant amount of time training can take. AdaBoost (i.e., Adaptive Boosting) is an algorithm that has been combined with other neural network training algorithms to form ensemble methods, with the goal of reducing training times while retaining the accuracy of the networks' outputs. Another concern with neural networks is that it can be difficult to determine an effective topology for a particular problem. Neuroevolution addresses this issue by using genetic algorithms to evolve characteristics of a neural network, one of which is its topology. The focus of this study is to investigate whether AdaBoost can be combined with the genetic algorithms of neuroevolution to decrease the time needed to evolve neural networks, and what effect this combination has on the accuracy of the results.
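To make the boosting idea in the abstract concrete, the following is a minimal, self-contained sketch of the standard AdaBoost procedure using one-dimensional decision stumps as weak learners. The toy data, function names, and round count are illustrative and are not taken from the study; in the study the weak learners would instead be evolved neural networks.

```python
import math

def stump_predict(threshold, polarity, x):
    # A decision stump: predicts +polarity when x >= threshold, else -polarity.
    return polarity if x >= threshold else -polarity

def train_adaboost(X, y, rounds=5):
    # X: list of scalar inputs; y: labels in {-1, +1}.
    n = len(X)
    w = [1.0 / n] * n          # start with uniform example weights
    ensemble = []
    for _ in range(rounds):
        # Pick the stump with the lowest weighted error over
        # candidate thresholds (observed values) and both polarities.
        best = None
        for t in X:
            for pol in (1, -1):
                err = sum(w[i] for i in range(n)
                          if stump_predict(t, pol, X[i]) != y[i])
                if best is None or err < best[0]:
                    best = (err, t, pol)
        err, t, pol = best
        err = max(err, 1e-10)  # avoid division by zero for a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)
        # Reweight: misclassified examples gain weight, correct ones lose it.
        for i in range(n):
            margin = y[i] * stump_predict(t, pol, X[i])
            w[i] *= math.exp(-alpha * margin)
        total = sum(w)
        w = [wi / total for wi in w]
        ensemble.append((alpha, t, pol))
    return ensemble

def predict(ensemble, x):
    # Final prediction: sign of the alpha-weighted vote of all stumps.
    score = sum(a * stump_predict(t, pol, x) for a, t, pol in ensemble)
    return 1 if score >= 0 else -1
```

For example, on the linearly separable toy set `X = [1, 2, 3, 4, 5, 6]` with labels `[-1, -1, -1, 1, 1, 1]`, a few rounds of boosting recover a classifier that labels every point correctly. The same weight-update loop is what an AdaBoost/neuroevolution hybrid would apply, substituting evolved networks for the stumps.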
Collections
- OSU Dissertations [11222]