Abstract:
The rising penetration of renewable energy sources exacerbates the inherent volatility and stochasticity of power systems, posing serious challenges to their secure and economic operation. To address this challenge, this paper presents an improved generative adversarial imitation learning (GAIL) algorithm tailored to real-time security-constrained economic dispatch. First, the security-constrained economic dispatch problem of renewable-integrated power systems is formulated as a Markov decision process (MDP). Second, given the limitations of conventional deep reinforcement learning algorithms, notably long training times and pronounced subjectivity in reward design, GAIL is employed to solve this MDP. Furthermore, an improved GAIL algorithm is proposed in which a dual-buffer mechanism makes GAIL compatible with various off-policy deep reinforcement learning algorithms; in the proposed algorithm, the combination with the soft actor-critic (SAC) algorithm significantly enhances training performance. Simulation results show that, compared with traditional algorithms, the proposed algorithm not only markedly accelerates convergence during offline training but also improves the economy and security of online decision-making while ensuring millisecond-level decision speed.
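To make the dual-buffer idea concrete, the sketch below illustrates one plausible reading of the mechanism described above: an expert buffer holds demonstration transitions while an agent buffer holds transitions from the current policy, a discriminator is trained on both, and the imitation reward it produces can then be consumed by any off-policy learner (such as SAC) sampling from the agent buffer. All names, the toy linear discriminator, and the synthetic data are our own illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(0)

# Hypothetical dual-buffer sketch (names are ours, not from the paper):
# the expert buffer stores demonstration transitions, the agent buffer
# stores transitions generated by the current policy. An off-policy
# learner (e.g. SAC) would sample from the agent buffer, while the
# discriminator is trained on samples from both buffers.
expert_buffer = deque(maxlen=10_000)
agent_buffer = deque(maxlen=10_000)

DIM = 4          # toy (state, action) feature dimension
w = np.zeros(DIM)  # logistic-regression discriminator weights

def disc_prob(x):
    """D(s,a): probability that the transition came from the expert."""
    return 1.0 / (1.0 + np.exp(-x @ w))

def imitation_reward(x):
    """Surrogate reward handed to the off-policy learner: -log(1 - D)."""
    return -np.log(1.0 - disc_prob(x) + 1e-8)

def disc_update(lr=0.1, batch=32):
    """One gradient step: push D toward 1 on expert data, 0 on agent data."""
    global w
    for i in rng.choice(len(expert_buffer), batch):
        x = expert_buffer[i]
        w += lr * (1.0 - disc_prob(x)) * x
    for i in rng.choice(len(agent_buffer), batch):
        x = agent_buffer[i]
        w -= lr * disc_prob(x) * x

# Fill both buffers with separable toy features standing in for (s, a) pairs.
for _ in range(500):
    expert_buffer.append(rng.normal(+1.0, 1.0, DIM))
    agent_buffer.append(rng.normal(-1.0, 1.0, DIM))

for _ in range(200):
    disc_update()

# After training, expert-like transitions should earn a higher imitation
# reward, which is the learning signal the off-policy learner maximizes.
r_exp = np.mean([imitation_reward(x) for x in expert_buffer])
r_agt = np.mean([imitation_reward(x) for x in agent_buffer])
print(r_exp > r_agt)
```

Because the imitation reward is attached to stored transitions rather than computed on-policy, any replay-buffer-based learner can reuse them, which is the compatibility property the dual-buffer mechanism is meant to provide.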