Evolving very small SNNs for simple computational tasks, robust to noise and damage

We are interested in using artificial evolution (which can change both the topology and the weights) to obtain very small spiking neural networks. Very small means 3-10 adaptive exponential (AdEx) or leaky integrate-and-fire neurons. Simple computational tasks may include temporal pattern recognition, controlling an animat (a simulated robot), or a multiplicative operation. When such networks are evolved with noise on state variables (membrane voltage), they seem to be robust to changes of neural parameters (AdEx parameters) and of the weights in the network. We would also like to test the evolved networks on neuromorphic hardware, to check whether they are robust to the variability inherent in the hardware. An illustrative sketch of the approach follows below.
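The sketch below is a minimal, hypothetical illustration of this kind of setup, not the group's actual code: a handful of leaky integrate-and-fire neurons simulated with Gaussian noise added to the membrane voltage, and a simple mutation-only evolutionary loop over the recurrent weights on a toy discrimination task. The network size, neuron parameters, noise level, and fitness function are all assumptions chosen for illustration.

# Minimal sketch (illustrative assumptions throughout): a tiny LIF network
# with noise on the membrane voltage, plus a toy evolutionary loop over weights.
import numpy as np

rng = np.random.default_rng(0)

def simulate_lif(weights, input_spikes, noise_std=0.5, v_thresh=1.0,
                 v_reset=0.0, tau=20.0, dt=1.0):
    """Simulate a small recurrent LIF network; return per-neuron spike counts."""
    n = weights.shape[0]
    v = np.zeros(n)
    spikes = np.zeros(n)
    spike_counts = np.zeros(n)
    for t in range(input_spikes.shape[0]):
        # Leaky integration of recurrent and external input, with voltage noise
        dv = (-v + weights @ spikes + input_spikes[t]) / tau
        v = v + dt * dv + noise_std * np.sqrt(dt) * rng.normal(size=n)
        spikes = (v >= v_thresh).astype(float)
        v = np.where(spikes > 0, v_reset, v)  # reset neurons that spiked
        spike_counts += spikes
    return spike_counts

def fitness(weights, n_trials=5):
    """Toy task: neuron 0 should spike more for dense input than for sparse input."""
    score = 0.0
    for _ in range(n_trials):
        dense = (rng.random((200, weights.shape[0])) < 0.2).astype(float)
        sparse = (rng.random((200, weights.shape[0])) < 0.02).astype(float)
        score += simulate_lif(weights, dense)[0] - simulate_lif(weights, sparse)[0]
    return score / n_trials

# Simple hill climbing over the weights of a 5-neuron network;
# because evaluation is noisy, surviving networks tend to be robust to it.
n_neurons = 5
best = rng.normal(scale=0.1, size=(n_neurons, n_neurons))
best_fit = fitness(best)
for generation in range(50):
    child = best + rng.normal(scale=0.05, size=best.shape)  # mutate weights
    child_fit = fitness(child)
    if child_fit >= best_fit:
        best, best_fit = child, child_fit
print("best fitness:", best_fit)

In the same spirit, robustness of an evolved network could be probed afterwards by perturbing the weights or neuron parameters and re-running the fitness evaluation.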


Timetable

Day Time Location
Wed, 25.04.2018 15:00 - 16:00 Main lecture room
Thu, 26.04.2018 11:00 - 12:00 Panorama
Fri, 27.04.2018 14:00 - 15:00 Panorama

Moderator

Borys Wrobel

Members

Chama Bensmail
Ismael Tito Freire González
Álvaro González
Julio Guillen
Jacques Kaiser
Aamir Khan
Jamie Knight
Sepp Kollmorgen
Brent Komer
Shih-Chii Liu
Junwen Luo
Mattias Nilsson
James O'Keeffe
Vivek Parmar
Andrew Rowley
Korbinian Schreiber
Juan Camilo Vasquez Tieck
Jayawan Wijekoon
Borys Wrobel
Jingyue Zhao