FENG Rui, WANG Tong, QI Hong-zhi, DAI Yue, LI Bin, PENG San-san. Power Load Forecasting Method Based on Dual Attention Mechanism Time Series Forecasting RNN Network[J]. Power Systems and Big Data, 2022, 25(7): 1-9. DOI: 10.19317/j.cnki.1008-083x.2022.07.006

Power Load Forecasting Method Based on Dual Attention Mechanism Time Series Forecasting RNN Network


    Abstract: Power load is a key indicator of the stable operation of a power system, yet its nonlinearity and temporal structure make accurate prediction difficult. Because an ordinary nonlinear autoregressive exogenous (NARX) model can rarely capture long-term temporal dependencies appropriately or select the relevant driving series for prediction, this paper builds a recurrent LSTM neural network based on a dual attention mechanism (DA-RNN). In the encoder stage, a long short-term memory (LSTM) structure serves as the nonlinear function that maps the input data to hidden states, capturing the long-term dependencies of those hidden states. At the same time, an input attention mechanism is introduced to extract input features adaptively, that is, to compute attention weights over the relevant driving series when predicting the target series. In the decoder stage, before decoding with the LSTM structure, a temporal attention mechanism is added to adaptively weight the relevant encoder hidden states across time when predicting the target series. With this two-stage attention RNN, the proposed model not only forecasts power load effectively but is also easy to interpret and reasonably robust.
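The two attention stages summarized above can be sketched numerically. The following is a minimal illustration, not the paper's implementation: the LSTM recursions are omitted, all weight matrices are random stand-ins, and the dimensions (`n` driving series, `T` time steps, `m` hidden units) are arbitrary. It only shows how the input-attention weights over driving series and the temporal-attention weights over encoder hidden states would be computed and normalized.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    z = z - z.max()          # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

# Illustrative dimensions: n driving series, T time steps, m hidden units
n, T, m = 5, 10, 8
X = rng.standard_normal((n, T))      # driving series (e.g. load-related features)

# --- Stage 1: input attention (encoder side) ---
# Score each driving series k against the previous encoder state [h; s].
W_e = rng.standard_normal((T, 2 * m))
U_e = rng.standard_normal((T, T))
v_e = rng.standard_normal(T)
h_prev, s_prev = np.zeros(m), np.zeros(m)
hs = np.concatenate([h_prev, s_prev])

e_scores = np.array([v_e @ np.tanh(W_e @ hs + U_e @ X[k]) for k in range(n)])
alpha = softmax(e_scores)            # one attention weight per driving series
x_tilde = alpha * X[:, 0]            # reweighted encoder input at time step 0

# --- Stage 2: temporal attention (decoder side) ---
# Score each encoder hidden state h_i against the previous decoder state [d; s'].
H = rng.standard_normal((T, m))      # stand-in for encoder hidden states h_1..h_T
W_d = rng.standard_normal((m, 2 * m))
U_d = rng.standard_normal((m, m))
v_d = rng.standard_normal(m)
d_prev, sp_prev = np.zeros(m), np.zeros(m)
ds = np.concatenate([d_prev, sp_prev])

l_scores = np.array([v_d @ np.tanh(W_d @ ds + U_d @ H[i]) for i in range(T)])
beta = softmax(l_scores)             # one attention weight per encoder time step
context = beta @ H                   # context vector fed to the decoder LSTM

# Both weight vectors are proper distributions: each sums to 1
# (up to floating-point error), which is what makes them interpretable
# as the relevance of each driving series and each time step.
```

In the full model these scores would be recomputed at every time step with the current LSTM states in place of the zero vectors used here, which is what lets the network re-select driving series and past time steps adaptively.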
