Speaker
Description
Many machine learning (ML) algorithms require complete knowledge of the hardware on which they run, as well as detailed control of its physical properties.
Reservoir Computing (RC) is an ML algorithm that can be implemented with minimal control over the physical properties of the hardware. In RC, a nonlinear dynamical system acts as a generator of an infinite dictionary of features, which is fed into a linear regression layer. The latter is the only component that needs to be tuned for learning. This algorithm excels at processing information generated by dynamical systems from observed time-series data. It requires very small training data sets and relies on linear optimization, and thus needs minimal computing resources.
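As an illustration of how little needs to be trained, the following is a minimal sketch of one common digital realization of RC, an echo state network. The reservoir sizes, scalings, and the toy sine-prediction task are illustrative assumptions, not details from the talk; the key point is that the random reservoir weights are fixed and only the linear readout is fitted.

```python
import numpy as np

rng = np.random.default_rng(42)

n_inputs, n_reservoir = 1, 300       # illustrative sizes
spectral_radius, leak = 0.9, 0.3     # illustrative reservoir settings

# Fixed random reservoir: these weights are never trained.
W_in = rng.uniform(-0.5, 0.5, (n_reservoir, n_inputs))
W = rng.uniform(-0.5, 0.5, (n_reservoir, n_reservoir))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive the nonlinear dynamical system with input u and collect its states."""
    x = np.zeros(n_reservoir)
    states = np.empty((len(u), n_reservoir))
    for t, u_t in enumerate(u):
        x = (1 - leak) * x + leak * np.tanh(W_in @ np.atleast_1d(u_t) + W @ x)
        states[t] = x
    return states

# Toy task (assumed for this sketch): one-step-ahead prediction of a sine wave.
time = np.arange(2000) * 0.05
signal = np.sin(time)
u, y = signal[:-1], signal[1:]

X = run_reservoir(u)
washout = 100  # discard the initial transient of the reservoir

# Only the linear readout is trained, via ridge regression (a linear problem).
ridge = 1e-6
A = X[washout:]
W_out = np.linalg.solve(A.T @ A + ridge * np.eye(n_reservoir), A.T @ y[washout:])

pred = X @ W_out
print("RMSE:", np.sqrt(np.mean((pred[washout:] - y[washout:]) ** 2)))
```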
The relevance of this type of algorithm is motivated by the reemergence of analog hardware as an alternative for specialized ML applications. In particular, neuromorphic hardware, combining analog and digital elements, is becoming increasingly competitive in ML applications, offering high-speed, low-footprint, and low-power solutions.
In this talk, I will introduce Reservoir Computing, showcase its implementation on current digital computers, and then discuss the advantages of embedding the algorithm in specialized hardware.
References
Carbajal et al. (2015). Memristor models for machine learning. Neural Computation, 27(3). https://doi.org/10.1162/NECO_a_00694
Caravelli & Carbajal (2018). Memristors for the Curious Outsiders. Technologies, 6(4), 118. https://doi.org/10.3390/technologies6040118
Gauthier et al. (2021). Next generation reservoir computing. Nature Communications, 12(1), 5564. https://doi.org/10.1038/s41467-021-25801-2