BitBrain and other methods for single-pass and continuous learning

Many contemporary machine-learning/AI methods require extensive computation for training, consuming a great deal of time and energy. Their inference mechanisms also tend to have a large energy-latency product. We will discuss methods that can learn in a single pass. Single-pass learning usually also promotes an ability to learn continuously and, if necessary, to forget earlier information when the problem is thought to be changing over time.
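To make the single-pass idea concrete, here is a minimal illustrative sketch (not the BitBrain algorithm itself, whose details are covered in the NICE 2022 talk): a per-class evidence table that incorporates each training example exactly once, with an optional decay factor so that older evidence can be forgotten when the problem drifts. All names here (`SinglePassCounter`, `decay`) are invented for illustration.

```python
# Illustrative sketch only: single-pass learning with optional forgetting.
# NOT the BitBrain algorithm; it shows the general idea that each example
# is incorporated exactly once, and that stored evidence can be decayed
# to forget older information when the problem changes over time.
from collections import defaultdict

class SinglePassCounter:
    """Per-class feature counts updated in one pass; decay enables forgetting."""

    def __init__(self, decay=1.0):
        self.decay = decay  # 1.0 = never forget; < 1.0 = gradual forgetting
        self.counts = defaultdict(lambda: defaultdict(float))

    def learn(self, features, label):
        # Single-pass update: one O(|features|) pass per example, no epochs.
        for cls in self.counts:               # decay all stored evidence first
            for f in self.counts[cls]:
                self.counts[cls][f] *= self.decay
        for f in features:
            self.counts[label][f] += 1.0

    def predict(self, features):
        # Score each class by its accumulated evidence for the active features.
        scores = {cls: sum(tbl.get(f, 0.0) for f in features)
                  for cls, tbl in self.counts.items()}
        return max(scores, key=scores.get) if scores else None

m = SinglePassCounter(decay=0.9)
m.learn({"a", "b"}, "x")
m.learn({"c", "d"}, "y")
print(m.predict({"a", "b"}))  # "x"
```

With `decay` below 1.0, evidence from early examples shrinks geometrically as new ones arrive, so the model tracks a drifting problem at the cost of eventually forgetting stable old patterns.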

It might be a good idea for participants to watch the NICE 2022 video as background to BitBrain, so that we can start from a shared foundation for the ideas.



Day              Time           Location
Wed, 04.05.2022  14:00 - 15:00  Lecture room
Mon, 09.05.2022  15:00 - 16:00  Lecture room


Organiser:
Michael Hopkins

Participants:
Matteo Cartiglia
Jakub Fil
Michael Hopkins
Joscha Ilmberger
Edward Jones
Willian Soares Girao
Simon Thorpe