Noise, variability and memory formation in spiking neural networks

To understand the mechanisms that allow spiking neural networks (SNNs) to recognize signals in time series, we evolve very small SNNs to recognize signals arriving in a particular order, in the presence of noise on the membrane potential and of variation in the silent intervals between signals. For the recognition of 3 signals in a random input stream, the SNNs consist of 3 interneurons and a single output neuron of the adaptive exponential integrate-and-fire type. In addition, the network has 3 dedicated input channels, one for each signal. We use a genetic algorithm whose fitness function rewards spiking after an occurrence of the 3 signals in the intended order and penalizes spikes elsewhere. We have devised a way to map evolved SNNs onto finite state transducers (FSTs), a general model of computation on time series. Furthermore, we demonstrate that SNNs evolved in the presence of noise and variation are not only robust to perturbations of neuronal parameters but also develop a form of memory (thanks to self-excitatory loops, autapses) that allows the network to remember its previous state indefinitely. Finally, we show that evolution may overproduce synaptic connections, which can be pruned without impairing the performance of the network; these excess connections are neither important for recognition nor involved in maintaining the state of the network.
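For reference, the output neuron presumably follows the standard adaptive exponential integrate-and-fire (AdEx) model of Brette and Gerstner; the abstract does not give the equations or parameter values used, so the following is the textbook form:

$$C \frac{dV}{dt} = -g_L (V - E_L) + g_L \Delta_T \exp\!\left(\frac{V - V_T}{\Delta_T}\right) - w + I(t), \qquad \tau_w \frac{dw}{dt} = a (V - E_L) - w,$$

with the reset $V \to V_r$, $w \to w + b$ whenever $V$ crosses the spike threshold.

The fitness computation can likewise be sketched in a few lines of Python. Everything below (the scoring window, reward and penalty values, and the signal labels "A", "B", "C") is an illustrative assumption; the abstract does not specify how correct spikes are scored.

```python
def fitness(output_spikes, signal_stream, order=("A", "B", "C"),
            window=5.0, reward=1.0, penalty=1.0):
    """Illustrative fitness: reward output spikes that follow a completed
    occurrence of the three signals in the intended order; penalize all
    other output spikes.  (Hypothetical scoring, not the authors' code.)"""
    # Find the times at which the target order completes in the input stream.
    completions, pos = [], 0
    for t, label in signal_stream:
        if label == order[pos]:
            pos += 1
            if pos == len(order):          # full pattern seen
                completions.append(t)
                pos = 0
        else:
            # A wrong signal breaks the partial match (signals are distinct).
            pos = 1 if label == order[0] else 0
    # Score each output spike against the completion times.
    score = 0.0
    for s in output_spikes:
        if any(c <= s <= c + window for c in completions):
            score += reward                # spike shortly after a correct triple
        else:
            score -= penalty               # spurious spike
    return score

# Example: the pattern A, B, C completes at t = 30; a spike at t = 32 is
# rewarded and a spike at t = 50 is penalized, giving a fitness of 0.0.
print(fitness([32.0, 50.0], [(10.0, "A"), (20.0, "B"), (30.0, "C")]))
```

In this sketch a spike counts as correct only if it falls within a short window after the third signal of a correct sequence; the criterion actually used in the evolutionary runs may differ.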

Timetable

Day: Fri, 26.04.2019
Time: 16:30 - 17:00
Location: Panorama

Moderators

Borys Wrobel
Muhammad Yaqoob

Members

Karla Burelo
Matteo Cartiglia
Erika Covi
Giulia D'Angelo
Álvaro González
Qinghai Guo
Daniel Gutierrez-Galan
Sepp Kollmorgen
Renate Krause
Alexander Kugele
Dylan Muir
Mattias Nilsson
Adam Perrett
Alpha Renner
Beck Strohmer
Gemma Taverni
Pau Vilimelis Aceituno
Annika Weisse