Publication Month: January 2022, Page(s): A1 – A1
Nicolas Pajot, Mounir Boukadoum
J. Engg. Res. & Sci. 2(11), 1-14 (2023)
Solving classification problems with Liquid State Machines (LSMs) usually ignores the influence of the liquid state representation on performance, leaving that role to the readout circuit. In most studies, the internally generated neural states are decoded from spike rate-based vector representations. This approach obscures the interspike timing, a central aspect of biological neural coding, with potentially detrimental consequences for LSM performance. In this work, we propose a model of liquid state representation that builds the feature vectors from temporal information extracted from the spike trains, hence using spike synchrony instead of rate. Using pairs of Poisson-distributed spike trains in noisy conditions, we show that such a model outperforms a rate-only model in distinguishing two spike trains, regardless of the sampling frequency of the liquid states or the noise level. In the same vein, we suggest a synchrony-based measure of the separation property (SP), a core determinant of LSM classification performance, for a more robust and biologically plausible interpretation.
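The contrast between rate-based and synchrony-based decoding of spike trains can be illustrated with a minimal sketch. The code below is not the authors' implementation; it assumes a Bernoulli discretization of Poisson spike trains, a binned spike-count vector as the rate feature, and a simple coincidence-counting statistic (hypothetical helper names) as a stand-in for a synchrony measure.

```python
import numpy as np

rng = np.random.default_rng(0)

def poisson_spike_train(rate_hz, duration_s, dt=1e-3):
    """Bernoulli approximation of a Poisson spike train as a binary vector."""
    return (rng.random(int(duration_s / dt)) < rate_hz * dt).astype(int)

def rate_feature(spikes, n_bins=10):
    """Rate-based readout: spike counts per time bin (ignores fine timing)."""
    return np.array([b.sum() for b in np.array_split(spikes, n_bins)])

def synchrony_feature(a, b, window=5):
    """Toy synchrony statistic: fraction of spikes in `a` that have a
    coincident spike in `b` within +/- `window` time steps."""
    idx_a, idx_b = np.flatnonzero(a), np.flatnonzero(b)
    if idx_a.size == 0:
        return 0.0
    hits = sum(np.any(np.abs(idx_b - t) <= window) for t in idx_a)
    return hits / idx_a.size

a = poisson_spike_train(20, 1.0)   # 20 Hz train over 1 s
b = poisson_spike_train(20, 1.0)   # independent train with the same rate
print(rate_feature(a), rate_feature(b), synchrony_feature(a, b))
```

Two independent trains with identical firing rates yield similar rate features but a low synchrony score, whereas a train compared with itself scores 1.0; this is the kind of timing information a rate-only representation discards.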
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.