Perspectives on Soft Actor–Critic (SAC)-Aided Operational Control Strategies for Modern Power Systems with Growing Stochastics and Dynamics


Bibliographic Details
Main Authors: Jinbo Liu, Qinglai Guo, Jing Zhang, Ruisheng Diao, Guangjun Xu
Format: Article
Language: English
Published: MDPI AG 2025-01-01
Series: Applied Sciences
Subjects:
Online Access: https://www.mdpi.com/2076-3417/15/2/900
Description
Summary: The ever-growing penetration of renewable energy, with its substantial uncertainty and stochastic behavior, significantly affects the secure and economical operation of the modern power grid. Nevertheless, coordinating various types of resources to derive effective online control decisions for a large-scale power network remains a major challenge. To overcome the limitations of existing control approaches, which require full-system models with accurate parameters and rely on extensive real-time sensitivity-based analyses to handle the growing uncertainty, this paper presents a novel data-driven control framework that uses reinforcement learning (RL) to train robust agents on high-fidelity grid simulations, so that they can provide immediate and effective controls in a real-time environment. A two-stage method, consisting of offline training and periodic updates, is proposed to train agents for robust control of voltage profiles, transmission losses, and line flows using a state-of-the-art RL algorithm, soft actor–critic (SAC). The effectiveness of the proposed RL-based control framework is validated via comprehensive case studies conducted on the East China power system with actual operation scenarios.
ISSN: 2076-3417
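The two-stage workflow described in the summary (offline training on simulated scenarios, followed by periodic updates as operating conditions change) can be sketched with a deliberately simplified example. The paper's actual agent uses SAC with continuous actions and neural networks on a high-fidelity grid simulator; the toy below is a hypothetical stand-in that uses tabular soft (entropy-regularized) Q-learning, the objective SAC builds on, to regulate a one-bus "voltage" by moving a transformer tap. All names, state/action definitions, and parameter values are illustrative assumptions, not the authors' setup.

```python
import math
import random

# Hypothetical toy: states are voltage buckets 0..10 (roughly 0.95..1.05 pu);
# the three actions lower, hold, or raise a transformer tap by one bucket.
N_STATES = 11
ACTIONS = (-1, 0, +1)
TARGET = 5                            # bucket nearest 1.00 pu
GAMMA, ALPHA, TAU = 0.9, 0.2, 0.5     # discount, step size, entropy temperature

def step(state, tap, drift=0):
    """One control interval: tap move plus a random load disturbance."""
    nxt = min(N_STATES - 1, max(0, state + tap + random.choice((-1, 0, 1)) + drift))
    return nxt, -abs(nxt - TARGET)    # reward = negative voltage deviation

def soft_value(q_row):
    """Entropy-regularized value V(s) = tau * log sum_a exp(Q(s,a)/tau)."""
    m = max(q_row)                    # max-shift for numerical stability
    return m + TAU * math.log(sum(math.exp((q - m) / TAU) for q in q_row))

def sample_action(q_row):
    """Sample from the softmax policy pi(a|s) proportional to exp(Q(s,a)/tau)."""
    m = max(q_row)
    weights = [math.exp((q - m) / TAU) for q in q_row]
    r = random.uniform(0.0, sum(weights))
    for i, w in enumerate(weights):
        r -= w
        if r <= 0:
            return i
    return len(weights) - 1

def train(q, episodes, drift=0):
    """Soft Q-learning: TD update toward r + gamma * V_soft(s')."""
    for _ in range(episodes):
        s = random.randrange(N_STATES)
        for _ in range(20):           # 20 control intervals per episode
            a = sample_action(q[s])
            s2, r = step(s, ACTIONS[a], drift)
            q[s][a] += ALPHA * (r + GAMMA * soft_value(q[s2]) - q[s][a])
            s = s2

random.seed(0)
Q = [[0.0] * len(ACTIONS) for _ in range(N_STATES)]

# Stage 1: offline training on simulated operating scenarios.
train(Q, episodes=500)

# Stage 2: periodic update after conditions drift (here, a load increase).
train(Q, episodes=100, drift=1)

# Greedy tap choice per state: low voltages raised, high voltages lowered.
greedy = [max(range(len(ACTIONS)), key=lambda a: Q[s][a]) for s in range(N_STATES)]
print(greedy)
```

The periodic-update stage simply resumes training from the offline Q-table under the drifted dynamics, which is the essence of the paper's two-stage idea: the agent is not retrained from scratch when the grid's operating point shifts.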