
Date

2020

Creative Commons
Except where otherwise noted, this item's license is described as Attribution-NonCommercial-NoDerivatives 4.0 International

Spiking neural networks (SNNs) attempt to computationally model biological neurons. While similar to artificial neural networks (ANNs), SNNs preserve the temporal and binary aspects of neurons. Computational evolution is also a biologically inspired computing method, and it has been used to evolve neural networks. NeuroEvolution of Augmenting Topologies (NEAT) is a method that simultaneously evolves the structure and weights of ANNs. In this work, I apply the NEAT algorithm to SNNs. I compare the performance of ANNs evolved with NEAT and SNNs evolved with NEAT on XOR, a cosine function, and the single pole balancing problem. Multiple values are used for the compatibility threshold (3 options), compatibility weight coefficient (2 options), compatibility disjoint coefficient (2 options), and spiking threshold (2 options). On the XOR problem, 15 SNN parameter combinations found solutions on all five test repetitions, while only two ANN parameter combinations did. On the cosine problem, only one SNN parameter combination found a solution on every repetition, but all ANN parameter combinations did. However, the successful SNNs appeared to capture more of the nonlinearity of the cosine curve than the ANNs. On the single pole balancing problem, no SNNs found any solution, while many ANNs found solutions on multiple repetitions. The results indicate that SNNs evolved with NEAT can solve some problems and perform comparably to ANNs evolved with NEAT on them.
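For readers unfamiliar with the parameters named above, the sketch below illustrates, under stated assumptions, what each one controls: the compatibility disjoint and weight coefficients enter NEAT's compatibility distance, the compatibility threshold decides whether two genomes fall into the same species, and the spiking threshold is the membrane-potential level at which a spiking neuron emits a binary spike. This is a minimal illustrative sketch, not the thesis code: the genome encoding, the leaky integrate-and-fire model, and all default values are assumptions made for the example.

def compatibility_distance(genome_a, genome_b,
                           disjoint_coeff=1.0, weight_coeff=0.5):
    """NEAT-style speciation distance: genes present in only one genome are
    weighted by disjoint_coeff, and the mean weight difference of shared
    genes is weighted by weight_coeff. Genomes are dicts {innovation: weight}."""
    shared = genome_a.keys() & genome_b.keys()
    unmatched = len(genome_a.keys() ^ genome_b.keys())
    n = max(len(genome_a), len(genome_b), 1)
    weight_diff = (sum(abs(genome_a[i] - genome_b[i]) for i in shared) / len(shared)
                   if shared else 0.0)
    return disjoint_coeff * unmatched / n + weight_coeff * weight_diff

def same_species(genome_a, genome_b, compatibility_threshold=3.0, **coeffs):
    """Two genomes belong to the same species when their distance is below
    the compatibility threshold."""
    return compatibility_distance(genome_a, genome_b, **coeffs) < compatibility_threshold

def step_spiking_neuron(potential, weighted_input, spiking_threshold=1.0, decay=0.9):
    """One time step of a leaky integrate-and-fire neuron (an assumed model):
    accumulate weighted input, emit a binary spike when the membrane potential
    crosses the spiking threshold, then reset. Returns (new_potential, spike)."""
    potential = decay * potential + weighted_input
    if potential >= spiking_threshold:
        return 0.0, 1          # fire and reset
    return potential, 0        # stay silent

if __name__ == "__main__":
    a = {1: 0.5, 2: -1.2, 4: 0.3}   # innovation number -> connection weight
    b = {1: 0.4, 3: 0.9, 4: -0.1}
    print(compatibility_distance(a, b))   # ~0.79 with the placeholder coefficients
    print(same_species(a, b))             # True: distance is under the threshold
    v, spike = 0.0, 0
    for x in [0.4, 0.4, 0.4]:             # repeated sub-threshold input eventually fires
        v, spike = step_spiking_neuron(v, x)
    print(v, spike)                       # 0.0 1: the neuron spiked and reset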

Keywords

spiking neural networks, neuroevolution, NEAT
