Short-term water level prediction based on Seq2Seq model

CLC number: TV124

Fund project: National Key R&D Program of China (2018YFC0407902)

Abstract:

To effectively predict continuous water levels over a given future period, a short-term water level prediction model based on the sequence-to-sequence (Seq2Seq) architecture was proposed. The model uses one long short-term memory (LSTM) network as the encoder, compressing the historical water level sequence into a context vector, and another LSTM network as the decoder, which decodes the context vector to predict the target water level sequence. Taking the Liuxi River as the study area, water level prediction models were built for different prediction horizons and compared with an LSTM model and an artificial neural network (ANN) model. The results show that the highest Nash-Sutcliffe efficiency coefficients of the Seq2Seq model for continuous 6 h, 12 h and 24 h predictions are 0.93, 0.90 and 0.85, respectively. When the prediction horizon is 6 h, the LSTM and Seq2Seq models perform similarly, while the ANN model has the lowest accuracy. When the prediction horizon is 12 h or 24 h, the Seq2Seq model achieves better accuracy and faster convergence than both the LSTM and ANN models.
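The encoder-decoder structure described in the abstract (one LSTM folding the historical water levels into a context vector, a second LSTM decoding that vector into the forecast sequence) can be sketched as below. This is a minimal NumPy illustration, not the paper's implementation: the hidden size, gate layout, scalar input/output, and random untrained weights are all assumptions for demonstration only.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step; gates (input, forget, cell, output) stacked row-wise."""
    H = h.size
    z = W @ x + U @ h + b
    i = sigmoid(z[:H])            # input gate
    f = sigmoid(z[H:2 * H])       # forget gate
    g = np.tanh(z[2 * H:3 * H])   # candidate cell state
    o = sigmoid(z[3 * H:])        # output gate
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

def seq2seq_forecast(history, horizon, params):
    """Encode a scalar water-level history, then decode `horizon` future values."""
    (We, Ue, be), (Wd, Ud, bd), (w_out, b_out) = params
    H = be.size // 4
    h, c = np.zeros(H), np.zeros(H)
    for x in history:                  # encoder: fold history into the context (h, c)
        h, c = lstm_step(np.array([x]), h, c, We, Ue, be)
    preds, y = [], history[-1]         # decoder: feeds back its own previous output
    for _ in range(horizon):
        h, c = lstm_step(np.array([y]), h, c, Wd, Ud, bd)
        y = float(w_out @ h + b_out)
        preds.append(y)
    return preds

# Toy usage with random (untrained) weights and an assumed hidden size of 8
rng = np.random.default_rng(0)
H = 8
def make_lstm_params(n_in):
    return (rng.normal(0, 0.1, (4 * H, n_in)),  # W: input weights
            rng.normal(0, 0.1, (4 * H, H)),     # U: recurrent weights
            np.zeros(4 * H))                     # b: biases
params = (make_lstm_params(1), make_lstm_params(1),
          (rng.normal(0, 0.1, H), 0.0))          # scalar output layer
print(len(seq2seq_forecast([1.2, 1.3, 1.5, 1.4], horizon=6, params=params)))  # 6
```

With trained weights, the same forward pass would emit the 6 h, 12 h, or 24 h forecast sequences compared in the paper; only `horizon` changes between the models.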

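The Nash-Sutcliffe efficiency coefficient used above to score the forecasts is one minus the ratio of the squared model error to the variance of the observations about their mean; a minimal implementation:

```python
def nash_sutcliffe(observed, simulated):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2).
    1.0 is a perfect fit; 0.0 means no better than predicting the mean."""
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

print(nash_sutcliffe([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # 1.0
```

On this scale, the paper's best scores of 0.93 (6 h), 0.90 (12 h) and 0.85 (24 h) indicate that most of the observed water-level variance is explained at every horizon, degrading as the forecast window lengthens.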
Cite this article:

LIU Yan, ZHANG Ting, KANG Aiqing, et al. Short-term water level prediction based on Seq2Seq model[J]. Advances in Science and Technology of Water Resources, 2022, 42(3): 57-63. (in Chinese)

History
  • Online publication date: 2022-05-16