dc.contributor.author | Stowe, Matthew David | |
dc.date.accessioned | 2014-09-24T14:18:40Z | |
dc.date.available | 2014-09-24T14:18:40Z | |
dc.date.issued | 2013-07-01 | |
dc.identifier.uri | https://hdl.handle.net/11244/11190 | |
dc.description.abstract | This thesis examines the memory capacities of generalized neural networks. Hopfield networks trained with Hebbian learning, as well as networks additionally adjusted with Delta learning, are investigated in their binary forms and extended with linear thresholding. These networks are examined to better understand the capacity of the human brain and the capabilities of artificial intelligence. Greater artificial neural network capacity allows for more computation with less storage space. New methods are proposed to increase Hopfield network capacities, and the scalability of these methods is examined with respect to the size of the network. The ability to recall entire patterns from stimulation of a single neuron is also examined for the increased-capacity networks. | |
dc.format | application/pdf | |
dc.language | en_US | |
dc.publisher | Oklahoma State University | |
dc.rights | Copyright is held by the author who has granted the Oklahoma State University Library the non-exclusive right to share this material in its institutional repository. Contact Digital Library Services at lib-dls@okstate.edu or 405-744-9161 for the permission policy on the use, reproduction or distribution of this material. | |
dc.title | Neural Network Capacity with Delta Learning and Linear Thresholds | |
dc.type | text | |
osu.filename | Stowe_okstate_0664M_12858.pdf | |
osu.accesstype | Open Access | |
dc.description.department | Computer Science | |
dc.type.genre | Thesis | |