Volume 2, Issue 11 - 1 Article

This issue covers recent work at the intersection of computational neuroscience and machine learning, with a focus on Liquid State Machines (LSMs), brain-inspired recurrent neural networks suited to processing time-varying inputs. The featured research paper examines how the representation of the liquid state affects LSM performance and proposes a model that encodes that state with spike-timing patterns rather than spike counts alone. This approach is closer to how biological neurons encode information and could improve LSM performance on classification problems, since the timing of spikes, not just their number, carries information that rate-based readouts discard.
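To make the distinction concrete, the short sketch below (an illustration by the editors, not code from the article; all names, window sizes, and spike times are hypothetical) contrasts a classic rate-based liquid state, which only counts spikes per neuron in a readout window, with a timing-based state that keeps the spike times themselves.

```python
import numpy as np

# Hypothetical spike trains from a small "liquid" of 4 neurons, recorded over a
# 100 ms readout window. Each array holds the spike times (in ms) of one neuron.
spike_trains = [
    np.array([12.0, 35.5, 36.0, 80.2]),
    np.array([35.8, 36.1, 79.9]),
    np.array([5.0, 50.0]),
    np.array([]),
]
window_ms = 100.0

def rate_state(trains, window_ms):
    """Rate-based liquid state: one spike count per neuron, scaled by the window."""
    return np.array([len(t) / window_ms for t in trains])

def timing_state(trains, n_spikes=3, window_ms=100.0):
    """Timing-based liquid state: keep the last n_spikes spike times per neuron,
    normalized to [0, 1]; slots with no spike are padded with -1."""
    state = -np.ones((len(trains), n_spikes))
    for i, t in enumerate(trains):
        last = t[-n_spikes:] / window_ms
        state[i, :len(last)] = last
    return state.ravel()

print("rate-based state:  ", rate_state(spike_trains, window_ms))
print("timing-based state:", timing_state(spike_trains))
```

In this toy example, neurons 0 and 1 produce nearly identical rate-based entries, while the timing-based vector preserves the near-coincident spikes around 36 ms and 80 ms that a pure count would hide.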
Front Cover
Journal of Engineering Research and Sciences, Volume 2, Issue 11, Page # i–i, 2023
Editorial Board
Journal of Engineering Research and Sciences, Volume 2, Issue 11, Page # ii–ii, 2023
Editorial
by Paul Andrew
Journal of Engineering Research and Sciences, Volume 2, Issue 11, Page # iii–iii, 2023
Table of Contents
Journal of Engineering Research and Sciences, Volume 2, Issue 11, Page # iv–iv, 2023
Neural Synchrony-Based State Representation in Liquid State Machines, an Exploratory Study
by Nicolas Pajot and Mounir Boukadoum
Journal of Engineering Research and Sciences, Volume 2, Issue 11, Page # 1–14, 2023; DOI: 10.55708/js0211001
Abstract: Solving classification problems with Liquid State Machines (LSMs) usually ignores the influence of the liquid state representation on performance, leaving that role to the readout circuit. In most studies, the decoding of the internally generated neural states is performed on spike rate-based vector representations. This approach obscures the interspike timing, a central aspect of biological…
(This article belongs to the Special Issue SP4: Computing, Engineering and Sciences 2023–24, and to the Neurosciences (NES) section.)
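For readers unfamiliar with the synchrony-based representation named in the title, the following sketch (an assumption of the editors, not taken from the paper) computes one simple variant: two liquid neurons are counted as synchronous when they spike within a small coincidence window of each other, and the upper triangle of the resulting pairwise matrix serves as the state vector passed to the readout.

```python
import numpy as np
from itertools import combinations

def synchrony_state(spike_trains, coincidence_ms=2.0):
    """Toy synchrony-based liquid state: for every pair of liquid neurons,
    count spike coincidences closer than coincidence_ms, and return the
    flattened upper triangle of the pairwise coincidence matrix."""
    n = len(spike_trains)
    sync = np.zeros((n, n))
    for i, j in combinations(range(n), 2):
        a, b = spike_trains[i], spike_trains[j]
        if len(a) == 0 or len(b) == 0:
            continue
        # |t_a - t_b| <= coincidence_ms for every pair of spike times
        diffs = np.abs(a[:, None] - b[None, :])
        sync[i, j] = np.count_nonzero(diffs <= coincidence_ms)
    return sync[np.triu_indices(n, k=1)]

# Hypothetical spike trains (times in ms) for a 4-neuron liquid.
spike_trains = [
    np.array([12.0, 35.5, 36.0, 80.2]),
    np.array([35.8, 36.1, 79.9]),
    np.array([5.0, 50.0]),
    np.array([]),
]
print("synchrony-based state:", synchrony_state(spike_trains))
```

Here only the first pair of neurons shows coincident firing, so the state vector is dominated by that entry; a rate-based vector over the same window would not register this relationship at all.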