Post-processing methods for delay embedding and feature scaling of reservoir computers
Main Authors:
Format: Article
Language: English
Published: Nature Portfolio, 2025-01-01
Series: Communications Engineering
Online Access: https://doi.org/10.1038/s44172-024-00330-0
Summary: Reservoir computing is a machine learning method that is well-suited for complex time series prediction tasks. Both delay embedding and the projection of input data into a higher-dimensional space play important roles in enabling accurate predictions. We establish simple post-processing methods that train on past node states at uniformly or randomly delayed timeshifts. These methods improve reservoir computer prediction performance through increased feature dimension and/or better delay embedding. Here we introduce the multi-random-timeshifting method, which randomly recalls previous states of reservoir nodes. The use of multi-random-timeshifting allows for smaller reservoirs while maintaining large feature dimensions, is computationally cheap to optimise, and is our preferred post-processing method. For experimentalists, all our post-processing methods can be translated to readout data sampled from physical reservoirs, which we demonstrate using readout data from an experimentally realised laser reservoir system. (A code sketch of the timeshifting idea follows this record.)
ISSN: 2731-3395
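
The abstract describes augmenting the readout training data with past reservoir node states recalled at random timeshifts. The following is a minimal, hypothetical sketch of that idea, assuming a toy random surrogate for the reservoir states, an illustrative delay range and number of shifted features, and a ridge-regression readout; none of these specific choices or function names are taken from the paper itself.

```python
# Minimal sketch of a multi-random-timeshifting post-processing step:
# append randomly time-delayed copies of node states to the feature matrix,
# then train a linear readout. All parameter values here are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def multi_random_timeshift(states, n_shifted, max_delay, rng):
    """Append n_shifted randomly time-delayed node states as extra features.

    states : (T, N) array of reservoir node states over time.
    Returns a (T - max_delay, N + n_shifted) feature matrix, where each extra
    column is a past state of a randomly chosen node at a random delay.
    """
    T, N = states.shape
    nodes = rng.integers(0, N, size=n_shifted)                # node to recall
    delays = rng.integers(1, max_delay + 1, size=n_shifted)   # how far back
    base = states[max_delay:]                                 # time-aligned current states
    shifted = np.stack(
        [states[max_delay - d:T - d, n] for n, d in zip(nodes, delays)], axis=1
    )
    return np.hstack([base, shifted])

# Toy usage: a random surrogate "reservoir" state sequence and a scalar target.
T, N = 2000, 50
states = np.tanh(rng.standard_normal((T, N)).cumsum(axis=0) * 0.01)
target = np.sin(np.linspace(0, 40 * np.pi, T))

features = multi_random_timeshift(states, n_shifted=150, max_delay=20, rng=rng)
y = target[20:]

# Ridge-regression readout (regularised least squares), a common choice for
# training reservoir computer output weights.
lam = 1e-6
W_out = np.linalg.solve(
    features.T @ features + lam * np.eye(features.shape[1]), features.T @ y
)
prediction = features @ W_out
print("training NRMSE:", np.sqrt(np.mean((prediction - y) ** 2)) / np.std(y))
```

In this sketch, increasing `n_shifted` grows the feature dimension seen by the readout without enlarging the reservoir itself, which reflects the trade-off the abstract highlights: smaller reservoirs can retain large feature dimensions through recalled past states.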