
dc.contributor.author: Stowe, Matthew David
dc.date.accessioned: 2014-09-24T14:18:40Z
dc.date.available: 2014-09-24T14:18:40Z
dc.date.issued: 2013-07-01
dc.identifier.uri: https://hdl.handle.net/11244/11190
dc.description.abstract: This thesis examines the memory capacities of generalized neural networks. Hopfield networks trained with Hebbian learning, as well as networks additionally adjusted with Delta learning, are investigated in their binary forms and extended with linear thresholding. These are examined to better understand the capacity of the human brain and the capabilities of artificial intelligence. Greater artificial neural network capacity allows for more computation with less storage space. New methods are proposed to increase Hopfield network capacities, and the scalability of these methods is also examined with respect to the size of the network. The ability to recall entire patterns from stimulation of a single neuron is also examined for the increased-capacity networks.
dc.format: application/pdf
dc.language: en_US
dc.publisher: Oklahoma State University
dc.rights: Copyright is held by the author, who has granted the Oklahoma State University Library the non-exclusive right to share this material in its institutional repository. Contact Digital Library Services at lib-dls@okstate.edu or 405-744-9161 for the permission policy on the use, reproduction, or distribution of this material.
dc.title: Neural Network Capacity with Delta Learning and Linear Thresholds
dc.type: text
osu.filename: Stowe_okstate_0664M_12858.pdf
osu.accesstype: Open Access
dc.description.department: Computer Science
dc.type.genre: Thesis
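The Hebbian storage rule for binary Hopfield networks mentioned in the abstract can be sketched as follows. This is a minimal illustration only, not the author's implementation; the function names, the ±1 neuron convention, and the synchronous sign-threshold update are all assumptions for the sketch:

```python
import numpy as np

def train_hebbian(patterns):
    # Hebbian (outer-product) rule: W = (1/n) * sum_p p p^T, with zero diagonal
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)
    return W / n

def recall(W, state, steps=10):
    # Synchronous updates of binary (+1/-1) neurons with a sign threshold,
    # iterated until the state stops changing or the step budget runs out.
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1  # break ties toward +1
        if np.array_equal(new, state):
            break
        state = new
    return state

# Store two orthogonal 8-bit patterns, corrupt one bit, and recover.
p1 = np.array([1, -1, 1, -1, 1, -1, 1, -1])
p2 = np.array([1, 1, -1, -1, 1, 1, -1, -1])
W = train_hebbian(np.stack([p1, p2]))

noisy = p1.copy()
noisy[0] = -noisy[0]       # flip one neuron
restored = recall(W, noisy)  # converges back to p1
```

The Delta-learning adjustment studied in the thesis would instead correct `W` iteratively from recall errors; the outer-product rule above is only the Hebbian baseline.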

