Advisor: Hougen, Dean
Author: Morgan, Brandon
Date accessioned: 2022-07-29
Date available: 2022-07-29
Date issued: 2022-08
URI: https://hdl.handle.net/11244/336288

Abstract: NASNet and AmoebaNet are state-of-the-art neural architecture search systems that achieved better accuracy than state-of-the-art human-designed convolutional neural networks. Despite the innovation of the NASNet search space, it cannot optimize non-convolutional operation layers, such as batch normalization, activation, and dropout. These layers are hand-designed by the architect before optimization, narrowing the search space and limiting the architectures that can be explored. In addition, the NASNet search space cannot be expressed as a fixed-length, floating-point, multidimensional array, which prevents many non-classical optimization techniques from being applied. Lastly, both NASNet and AmoebaNet require extensive computation, each evaluating 20,000 models during optimization and consuming 2,000 GPU hours. This work addresses these limitations by, first, extending the NASNet search space to include optimization of non-convolutional operation layers through a new building block that allows the order and inclusion of these layers to be optimized; second, proposing a fixed-length, floating-point, multidimensional array representation that allows other non-classical optimization techniques, such as particle swarm optimization, to be applied; and third, proposing an efficient genetic algorithm that uses state-of-the-art techniques to reduce training complexity. After evaluating only 1,300 models, consuming 190 GPU hours, while evolving on the CIFAR-10 benchmark dataset, the best model configuration yielded a test accuracy of 94.6% with only 1.3 million parameters and a test accuracy of 95.09% with only 5.17 million parameters, outperforming both ResNet110 and WideResNet. When transferred to the CIFAR-100 benchmark dataset, the best model configuration yielded a test accuracy of 71.1% with only 1.3 million parameters and a test accuracy of 76.53% with only 5.17 million parameters.

License: Attribution-NonCommercial-ShareAlike 4.0 International
Keywords: Neural Architecture Search; Genetic Algorithms; Particle Swarm Optimization
Title: Efficient Neural Architecture Search using Genetic Algorithm
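Note: The abstract's central representational claim is that encoding an architecture as a fixed-length, floating-point, multidimensional array lets standard genetic operators (and particle swarm updates) act on it directly. The following is a minimal illustrative sketch of that idea, not the thesis implementation; the block count, gene layout, decoding scheme, and operator parameters are all hypothetical assumptions chosen for clarity.

import numpy as np

rng = np.random.default_rng(0)

N_BLOCKS = 5   # assumed number of building blocks per cell
N_GENES = 4    # assumed genes per block: operation, input, layer order, inclusion

def random_genome():
    """A genome is a fixed-length (N_BLOCKS, N_GENES) float array in [0, 1]."""
    return rng.random((N_BLOCKS, N_GENES))

def decode(genome, n_ops=6):
    """Map continuous genes to discrete architectural choices."""
    ops = (genome[:, 0] * n_ops).astype(int)        # operation used by each block
    inputs = (genome[:, 1] * N_BLOCKS).astype(int)  # which earlier block feeds it
    order = np.argsort(genome[:, 2])                # order of non-convolutional layers
    include = genome[:, 3] > 0.5                    # whether optional layers are kept
    return ops, inputs, order, include

def uniform_crossover(a, b):
    """Child takes each gene from either parent with equal probability."""
    mask = rng.random(a.shape) < 0.5
    return np.where(mask, a, b)

def mutate(genome, rate=0.1, scale=0.2):
    """Perturb a fraction of genes with Gaussian noise, clipped back to [0, 1]."""
    mask = rng.random(genome.shape) < rate
    noise = rng.normal(0.0, scale, genome.shape)
    return np.clip(genome + mask * noise, 0.0, 1.0)

parent_a, parent_b = random_genome(), random_genome()
child = mutate(uniform_crossover(parent_a, parent_b))
print(decode(child))

Because every genome has the same shape, the same array can also serve as a particle position in particle swarm optimization, which is precisely the flexibility the NASNet search space lacks.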