Meta-Learning with SNNs on SpiNNaker2 using exact gradients (EventProp)

The ability of learning models to adapt when deployed in new and challenging conditions is a long-standing aspiration of modern AI. Models computing at the edge should be capable of both adaptation and knowledge transfer, leveraging extensive (offline) pre-training while remaining nimble and avoiding catastrophic forgetting.



In this workshop we want to explore the potential of meta-initialized SNNs to adapt rapidly and efficiently to out-of-distribution datasets when deployed on neuromorphic hardware. Moreover, we wish to investigate, both empirically and theoretically, whether exact gradients outperform surrogate-gradient approaches when learning online under scarce-data conditions (batch size 1, few samples).
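To make the contrast concrete: surrogate-gradient training keeps the non-differentiable spike function in the forward pass but replaces its derivative with a smooth stand-in during backpropagation, whereas EventProp computes exact gradients of the true spiking dynamics. The snippet below is a minimal NumPy illustration of the surrogate side only; the threshold value and the SuperSpike-style surrogate shape are illustrative choices, not the project's actual configuration.

```python
import numpy as np

def heaviside(v, theta=1.0):
    """Forward pass: the non-differentiable spike function (spike iff v >= theta)."""
    return (v >= theta).astype(float)

def surrogate_grad(v, theta=1.0, beta=1.0):
    """Backward pass stand-in: a SuperSpike-style surrogate derivative,
    used in place of the Heaviside derivative (zero almost everywhere)."""
    return 1.0 / (1.0 + beta * np.abs(v - theta)) ** 2

v = np.array([0.5, 1.0, 1.5])   # membrane potentials
spikes = heaviside(v)            # -> [0., 1., 1.]
grads = surrogate_grad(v)        # smooth, peaks at the threshold
```

The surrogate is a smooth approximation whose shape (here controlled by `beta`) is a design choice; the hypothesis above is that avoiding this approximation entirely, as EventProp does, matters most when gradient noise cannot be averaged away over large batches.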



To do so, we aim to extend our existing implementation of the EventProp algorithm [1] on SpiNNaker2 to leverage the model-agnostic meta-learning (MAML) approach described in [2].
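The MAML idea can be sketched in a few lines: each task gets a short inner-loop adaptation from shared meta-parameters, and the meta-parameters are then updated so that this adaptation works well across tasks. The toy NumPy sketch below uses the first-order approximation (FOMAML) on a hypothetical family of quadratic tasks, purely to show the two-loop structure; the real project would apply this to SNN weights trained with EventProp, not to a scalar.

```python
import numpy as np

def task_loss_grad(theta, a):
    # Toy task family: L(theta) = 0.5 * (theta - a)^2,
    # each task is defined by its own optimum a.
    return theta - a  # dL/dtheta

def fomaml(theta, tasks, inner_lr=0.1, outer_lr=0.01, steps=1000):
    """First-order MAML: one inner SGD step per task, then apply the
    post-adaptation gradient directly to the meta-parameters
    (ignoring second-order terms)."""
    for _ in range(steps):
        meta_grad = 0.0
        for a in tasks:
            theta_prime = theta - inner_lr * task_loss_grad(theta, a)  # inner adaptation
            meta_grad += task_loss_grad(theta_prime, a)                # first-order outer grad
        theta -= outer_lr * meta_grad / len(tasks)                     # meta-update
    return theta

tasks = [-1.0, 0.0, 1.0]      # per-task optima
theta0 = fomaml(5.0, tasks)   # meta-initialisation ends near the task mean (0.0)
```

The meta-initialisation converges toward a point from which one inner step reaches any task's optimum cheaply, which is exactly the property we want from a pre-trained SNN deployed with batch-1 online adaptation.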


Available resources: 



  • Both on-chip and off-chip implementations of EventProp are ready to use and straightforward to set up with PyTorch/snnTorch.

  • The meta-learning environment (needs a double check) and the augmented meta-dataset are also ready to use.


References: 
[1]: Event-Based Backpropagation can compute Exact Gradients for Spiking Neural Networks (https://arxiv.org/abs/2009.08378)
[2]: Meta-learning Spiking Neural Networks with Surrogate Gradient Descent (https://arxiv.org/abs/2201.10777) 


Timetable

Day              Time           Location
Wed, 01.05.2024  14:00 - 17:00  Disco Room

Moderators

Mahmoud Akl
Sirine Arfa
Gabriel Béna

Members

Mahmoud Akl
Gabriel Béna
Jan Finkbeiner
Patrick Govoni
Ismael Jaras
Ton Juny Pina
James Knight
Melissa Lober
Antony N'dri
Eleni Nisioti
Thomas Nowotny
Andrea Ortone
Thomas Shoesmith
Alexandru Vasilache