Are long time-constants necessary, or even useful, for computation?

Long time-constants are expensive and difficult to implement in hardware. Can long-term memory be formed without them? In other words, are long time-constants necessary? 

While perhaps not necessary, bio-inspired mechanisms that add long time constants are emerging in SNNs to improve their computational power, such as adaptation [1] and delays [2]. So maybe long time-constants are useful after all.
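As a concrete illustration of how adaptation introduces a long time constant, here is a minimal sketch of an adaptive LIF (ALIF) neuron in plain NumPy. All parameter values are hypothetical, chosen only to show the effect: the membrane integrates on a fast timescale, while a slowly decaying adaptive threshold retains a trace of recent spiking far beyond the membrane time constant.

```python
import numpy as np

# Hypothetical parameters for a sketch of an adaptive LIF (ALIF) neuron.
dt = 1e-3          # 1 ms simulation step
tau_m = 20e-3      # membrane time constant (short)
tau_a = 200e-3     # adaptation time constant (long)
beta = 0.5         # threshold increase per spike
v_th0 = 1.0        # baseline firing threshold

def simulate(inputs):
    """Simulate the neuron on a 1-D input-current array; return the spike train."""
    v, a = 0.0, 0.0
    spikes = []
    for i_t in inputs:
        v += dt / tau_m * (-v + i_t)   # fast leaky membrane integration
        a += dt / tau_a * (-a)         # slow decay of the adaptation variable
        if v > v_th0 + beta * a:       # adaptive threshold
            v = 0.0                    # reset membrane potential
            a += 1.0                   # raise the threshold after each spike
            spikes.append(1.0)
        else:
            spikes.append(0.0)
    return np.array(spikes)

# Under constant drive, the firing rate drops over time as adaptation
# accumulates: the neuron "remembers" its recent activity on the slow
# timescale tau_a, even though the membrane itself forgets in ~tau_m.
spikes = simulate(np.full(1000, 2.0))
early_rate = spikes[:200].sum()
late_rate = spikes[-200:].sum()
```

The same slow variable is what gives ALIF networks their longer effective memory in tasks like speech command recognition [1]; delays [2] achieve a related effect by shifting spikes in time rather than by filtering them.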

Let's discuss algorithms and neuroscience to find an answer!

[1] A Surrogate Gradient Spiking Baseline for Speech Command Recognition

[2] Learning Delays in SNNs using Dilated Convolutions with Learnable Spacings



Day Time Location
Thu, 09.05.2024 21:30 - 22:30 Sala Panorama


Filippo Moro