Neural Network Capacity with Delta Learning and Linear Thresholds
Abstract
This thesis examines the memory capacity of generalized neural networks. Hopfield networks trained with Hebbian learning, as well as networks further tuned with Delta learning, are investigated in their binary forms and extended with linear thresholding. These models are studied to better understand the capacity of the human brain and the capabilities of artificial intelligence; greater artificial neural network capacity allows more computation with less storage space. New methods are proposed to increase Hopfield network capacity, and the scalability of these methods with respect to network size is examined. The ability to recall entire patterns from stimulation of a single neuron is also examined for the increased-capacity networks.
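The two training schemes named in the abstract can be sketched briefly. The following is a minimal illustration (not the thesis's actual implementation) of a binary Hopfield network: weights are first set by the Hebbian outer-product rule, then refined with perceptron-style Delta updates that push each stored pattern toward being a fixed point of the sign-threshold dynamics. Function names, the learning rate, and epoch count are illustrative assumptions.

```python
import numpy as np

def hebbian_weights(patterns):
    """Hebbian rule: W = (1/N) * sum_p x_p x_p^T, with zeroed diagonal.

    patterns: array of shape (P, N) with entries in {-1, +1}.
    """
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def delta_refine(W, patterns, lr=0.1, epochs=100):
    """Delta (perceptron-style) correction on top of the Hebbian weights.

    For each stored pattern, neurons whose thresholded output disagrees
    with the target receive an error-driven update; correct neurons are
    left unchanged (x - y is zero there).
    """
    for _ in range(epochs):
        for x in patterns:
            y = np.sign(W @ x)
            y[y == 0] = 1          # break ties toward +1
            W += lr * np.outer(x - y, x)
            np.fill_diagonal(W, 0.0)  # keep self-connections at zero
    return W

def recall(W, x, steps=10):
    """Synchronous binary-threshold recall dynamics."""
    for _ in range(steps):
        x = np.sign(W @ x)
        x[x == 0] = 1
    return x
```

At low memory load (few patterns relative to the number of neurons), the Delta refinement drives every stored pattern to a fixed point of the recall dynamics, which is one route to higher usable capacity than the Hebbian rule alone.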
Collections
- OSU Theses