1-bit embeddings

It seems as if mainstream ANN and LLM models and techniques are moving inexorably towards sparser, lower-resolution encodings and representations. In doing so they are converging on sparse 1-bit representations and learning mechanisms that have historically been inspired by computational neuroscience and that have at least a 35-year history, with milestones including spike timing codes, sparse distributed memories, rank order codes, sparse N-of-M codes and, more recently, hyperdimensional computing/VSAs and the BitBrain mechanism. We suggest that, rather than entering this very interesting space indirectly via an expensive and circuitous route through the back door, there is great value in developing and exploring the more direct route and entering through the front door!
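
To make the idea concrete, here is a minimal sketch of one such representation: a dense embedding reduced to a sparse 1-bit, N-of-M code that can be compared with a simple overlap count. The vector length, the value of N and the top-N binarisation rule are illustrative assumptions, not a description of any particular mechanism from the references.

```python
import numpy as np

def n_of_m_encode(x: np.ndarray, n: int) -> np.ndarray:
    """Binary code with exactly n ones, placed at the n largest entries of x."""
    code = np.zeros_like(x, dtype=np.uint8)
    code[np.argsort(x)[-n:]] = 1
    return code

def overlap(a: np.ndarray, b: np.ndarray) -> int:
    """Number of shared active bits: a simple similarity for sparse binary codes."""
    return int(np.sum(a & b))

rng = np.random.default_rng(0)
x = rng.normal(size=256)             # stand-in for a dense embedding (M = 256)
y = x + 0.1 * rng.normal(size=256)   # the same embedding, slightly perturbed

cx, cy = n_of_m_encode(x, 16), n_of_m_encode(y, 16)
print(f"{overlap(cx, cy)} of 16 active bits shared")
```

Because only N of the M positions carry a 1, storage and comparison reduce to a handful of bit operations, which is where much of the practical appeal of 1-bit embeddings lies.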


As well as the usually quoted engineering benefits, such as low latency, low energy and a natural match to event-based sensors and neuromorphic compute, we believe there are others that are discussed less often: strong robustness in the presence of internal or external noise and errors, opportunities for continuous learning, and ways of dealing with 'the binding problem'.
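
As a toy illustration of the last two points, the sketch below uses random binary hypervectors in the style of hyperdimensional computing/VSAs; the dimensionality D = 8192 and the XOR binding operator are assumptions chosen for illustration, not a specific mechanism from the references. It shows that similarity between 1-bit vectors degrades gracefully under random bit flips, and that XOR binding and unbinding give one simple handle on role-filler binding.

```python
import numpy as np

D = 8192                      # hypervector dimensionality (assumed for illustration)
rng = np.random.default_rng(1)

def rand_hv() -> np.ndarray:
    """Random dense binary hypervector of length D."""
    return rng.integers(0, 2, D, dtype=np.uint8)

def hamming_sim(a: np.ndarray, b: np.ndarray) -> float:
    """1.0 means identical; about 0.5 means unrelated random vectors."""
    return 1.0 - float(np.mean(a ^ b))

# 1) Robustness: flip 10% of the bits and the vector is still clearly recognisable.
v = rand_hv()
noisy = v.copy()
flipped = rng.choice(D, size=D // 10, replace=False)
noisy[flipped] ^= 1
print("noisy vs original :", hamming_sim(v, noisy))      # about 0.90
print("unrelated vectors :", hamming_sim(v, rand_hv()))  # about 0.50

# 2) Binding: XOR a role with a filler; XOR-ing with the role again recovers the filler.
role, filler = rand_hv(), rand_hv()
bound = role ^ filler
print("unbound vs filler :", hamming_sim(bound ^ role, filler))  # exactly 1.0
```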


References


https://en.wikipedia.org/wiki/Sparse_distributed_memory


https://www.researchgate.net/publication/11686294_Spike-based_strategies_for_rapid_processing


https://www.sciencedirect.com/science/article/abs/pii/S0893608004001443?via%3Dihub


https://pubmed.ncbi.nlm.nih.gov/17526333/


https://www.frontiersin.org/journals/neuroscience/articles/10.3389/fnins.2022.971937/full


https://www.hd-computing.com/


https://www.frontiersin.org/articles/10.3389/fninf.2023.1125844/full


 


Timetable

Day Time Location
Wed, 01.05.2024 15:00 - 16:00 Sala panorama
Thu, 09.05.2024 15:00 - 16:00 Sala panorama

Moderator

Jakub Fil
Michael Hopkins

Members

Santiago Díaz Romero
Christian Fernandez Lorden
Jan Finkbeiner
Michael Hopkins
Edward Jones
Melissa Lober
Filippo Moro
Eleni Nisioti
Naresh Ravichandran