Neural Synchrony-Based State Representation in Liquid State Machines, an Exploratory Study


by Nicolas Pajot, Mounir Boukadoum*

 Department of Computer Science, University of Quebec at Montreal, Quebec, Canada

* Author to whom correspondence should be addressed.

Journal of Engineering Research and Sciences, Volume 2, Issue 11, Page # 1-14, 2023; DOI: 10.55708/js0211001

Keywords: Liquid state machine, state representation, temporal decoding, separation property, classification

Received: 06 September 2023, Revised: 25 October 2023, Accepted: 26 November 2023, Published Online: 30 November 2023

APA Style

Pajot, N., & Boukadoum, M. (2023). Neural Synchrony-Based State Representation in Liquid State Machines, an Exploratory Study. Journal of Engineering Research and Sciences, 2(11), 1–14. https://doi.org/10.55708/js0211001


Solving classification problems with Liquid State Machines (LSMs) usually ignores the influence of the liquid state representation on performance, leaving that role to the readout circuit. In most studies, the internally generated neural states are decoded from spike-rate-based vector representations. This approach obscures interspike timing, a central aspect of biological neural coding, with potentially detrimental consequences for LSM performance. In this work, we propose a model of liquid state representation that builds the feature vectors from temporal information extracted from the spike trains, thus using spike synchrony instead of rate. Using pairs of Poisson-distributed spike trains in noisy conditions, we show that such a model outperforms a rate-only model in distinguishing two spike trains, regardless of the sampling frequency of the liquid states or the noise level. In the same vein, we suggest a synchrony-based measure of the separation property (SP), a core determinant of LSM classification performance, for a more robust and biologically plausible interpretation.
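The contrast the abstract draws between rate-based and synchrony-based state readouts can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' model: the Bernoulli approximation of a Poisson spike train, the function names, and the coincidence-window synchrony measure are all assumptions made for the example. It shows how two spike trains with nearly identical firing rates (indistinguishable to a rate-only feature) can still differ sharply in pairwise coincidence.

```python
import numpy as np

rng = np.random.default_rng(0)

def poisson_spike_train(rate_hz, duration_s, dt=0.001):
    """Binary spike train on a dt grid (Bernoulli approximation of a Poisson process)."""
    return rng.random(int(duration_s / dt)) < rate_hz * dt

def rate_feature(train, dt=0.001):
    """Rate-based state component: mean firing rate over the window, in Hz."""
    return train.sum() / (len(train) * dt)

def synchrony_feature(train_a, train_b, window=2):
    """Crude synchrony measure: fraction of spikes in train_a that have a
    coincident spike in train_b within +/- `window` time bins."""
    idx_a = np.flatnonzero(train_a)
    idx_b = np.flatnonzero(train_b)
    if idx_a.size == 0 or idx_b.size == 0:
        return 0.0
    hits = sum(np.any(np.abs(idx_b - i) <= window) for i in idx_a)
    return hits / idx_a.size

# Two independent trains at the same nominal rate: rates match, synchrony is low.
a = poisson_spike_train(40, 1.0)
b = poisson_spike_train(40, 1.0)
print(rate_feature(a), rate_feature(b))   # similar values near 40 Hz
print(synchrony_feature(a, b))            # low coincidence fraction
print(synchrony_feature(a, a))            # identical trains: perfect synchrony
```

A rate readout collapses each train to a single scalar, while the coincidence measure retains relative spike timing; a separation-property estimate built on the latter is sensitive to temporal structure that rate vectors discard.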