Deploying learning models in new and challenging conditions is a long-standing aspiration of modern AI. Models running at the edge should be capable of both adaptation and knowledge transfer, leveraging extensive (offline) pre-training while remaining nimble and avoiding catastrophic forgetting.
In this workshop we want to explore the potential of meta-initialized spiking neural networks (SNNs) to adapt rapidly and efficiently to out-of-distribution datasets when deployed on neuromorphic hardware. Moreover, we wish to investigate, empirically and theoretically, whether exact gradients outperform surrogate-gradient approaches when learning online under scarce-data conditions (batch size 1, small number of samples).
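To make the contrast concrete: the spike nonlinearity is non-differentiable, so surrogate-gradient approaches replace its derivative with a smooth stand-in during backpropagation, whereas EventProp computes exact gradients from spike times. Below is a minimal sketch of the surrogate idea; the SuperSpike-style surrogate shape, threshold, and `beta` value are illustrative assumptions, not the specific form used in [2].

```python
# Minimal sketch of the surrogate-gradient idea for the spike
# nonlinearity. The surrogate shape (SuperSpike-style) and the
# parameter values are illustrative assumptions.

def spike(v, v_th=1.0):
    # Forward pass: Heaviside step -- a spike is emitted when the
    # membrane potential v crosses the threshold v_th.
    return 1.0 if v >= v_th else 0.0

def spike_surrogate_grad(v, v_th=1.0, beta=10.0):
    # Backward pass: the true derivative of the step is zero almost
    # everywhere, so a smooth surrogate is substituted instead.
    return 1.0 / (1.0 + beta * abs(v - v_th)) ** 2

# The surrogate peaks at the threshold and decays away from it:
print(spike_surrogate_grad(1.0))  # 1.0 (at threshold)
print(spike_surrogate_grad(0.5))  # much smaller, far from threshold
```

The workshop's question is whether avoiding this approximation entirely (via exact, event-based gradients) pays off when data is scarce.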
To do so, we aim to extend our existing implementation of the EventProp algorithm [1] on SpiNNaker2 to leverage the model-agnostic meta-learning (MAML) approach described in [2].
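For context, the MAML-style bi-level update of [2] can be sketched on a toy problem: learn an initialization that adapts to any task in one inner gradient step. Everything below (the scalar quadratic loss, the learning rates) is an illustrative assumption standing in for the actual SNN training loop, not the workshop's implementation.

```python
# Toy sketch of model-agnostic meta-learning (MAML): find a
# meta-initialization theta that adapts well after a single inner
# gradient step. The scalar quadratic per-task loss and learning
# rates are illustrative assumptions only.

def grad(theta, task):
    # Gradient of the toy per-task loss 0.5 * (theta - task) ** 2.
    return theta - task

def maml_step(theta, tasks, inner_lr=0.1, outer_lr=0.01):
    # One meta-update: adapt to each task (inner loop), then move the
    # meta-initialization against the post-adaptation gradients
    # (outer loop; the second-order factor is (1 - inner_lr) for this
    # quadratic loss).
    meta_grad = 0.0
    for task in tasks:
        theta_adapted = theta - inner_lr * grad(theta, task)
        meta_grad += grad(theta_adapted, task) * (1.0 - inner_lr)
    return theta - outer_lr * meta_grad / len(tasks)

theta = 0.0
tasks = [1.0, 2.0, 3.0]
for _ in range(2000):
    theta = maml_step(theta, tasks)
# For this quadratic toy problem the meta-optimum is the task mean.
```

In the workshop setting, the inner loop would be an on-chip adaptation phase on SpiNNaker2, with gradients supplied either exactly (EventProp [1]) or via surrogates (as in [2]).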
Available resources:
References:
[1]: Event-Based Backpropagation can compute Exact Gradients for Spiking Neural Networks (https://arxiv.org/abs/2009.08378)
[2]: Meta-learning Spiking Neural Networks with Surrogate Gradient Descent (https://arxiv.org/abs/2201.10777)
| Day | Time | Location |
|---|---|---|
| Wed, 01.05.2024 | 14:00 - 17:00 | Disco Room |