Efficient optimisation of physical reservoir computers using only a delayed input
Main Authors: , , ,
Format: Article
Language: English
Published: Nature Portfolio, 2025-01-01
Series: Communications Engineering
Online Access: https://doi.org/10.1038/s44172-025-00340-6
Summary: Reservoir computing is a machine learning algorithm for processing time-dependent data which is well suited for experimental implementation. Tuning the hyperparameters of the reservoir is a time-consuming task that limits its applicability. Here we present an experimental validation of a recently proposed optimisation technique in which the reservoir receives both the input signal and a delayed version of the input signal. This augments the memory of the reservoir and improves its performance. It also simplifies the time-consuming task of hyperparameter tuning. The experimental system is an optoelectronic setup based on a fiber delay loop and a single nonlinear node. It is tested on several benchmark tasks and reservoir operating conditions. Our results demonstrate the effectiveness of the delayed input method for experimental implementation of reservoir computing systems.
ISSN: 2731-3395
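The delayed-input idea described in the summary can be sketched in a toy simulation: the driving signal u(t) and a delayed copy u(t − d) are both injected into a recurrent nonlinear reservoir, and a linear readout is trained by ridge regression. The sketch below is illustrative only, not the paper's optoelectronic delay-loop setup; all parameter values (reservoir size, delay, spectral radius, benchmark task) are assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters (not taken from the paper).
N = 100                    # number of reservoir nodes
delay = 5                  # input delay d, in time steps
T_train, T_test = 1000, 200
washout = 100              # initial states discarded before training

# Random input-coupling masks for u(t) and the delayed copy u(t - d),
# plus a recurrent weight matrix scaled to spectral radius 0.9.
W_in = rng.uniform(-1, 1, N)
W_in_delayed = rng.uniform(-1, 1, N)
W = rng.normal(0, 1, (N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(u):
    """Drive a tanh reservoir with both u(t) and u(t - delay)."""
    x = np.zeros(N)
    states = np.zeros((len(u), N))
    for t in range(len(u)):
        u_d = u[t - delay] if t >= delay else 0.0
        x = np.tanh(W @ x + W_in * u[t] + W_in_delayed * u_d)
        states[t] = x
    return states

# Benchmark: reproduce the input from k steps ago (a linear memory task).
k = 3
u = rng.uniform(-1, 1, T_train + T_test)
y = np.roll(u, k)          # target y(t) = u(t - k)

X = run_reservoir(u)

# Ridge-regression readout fitted on the washed-out training states.
A, b = X[washout:T_train], y[washout:T_train]
W_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(N), A.T @ b)

pred = X[T_train:] @ W_out
nmse = np.mean((pred - y[T_train:]) ** 2) / np.var(y[T_train:])
print(f"NMSE on {k}-step memory task: {nmse:.3f}")
```

The delayed copy gives the reservoir direct access to past input values, which is the mechanism the summary credits with augmenting memory and easing hyperparameter tuning.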