A lightweight power quality disturbance recognition model based on CNN and Transformer
Authors: 张彼德, 邱杰, 娄广鑫, 周灿, 罗蜻清, 李天倩

CLC number: TM743

Funding: Sichuan Science and Technology Program (2023YFG0191)




    Abstract:

    A lightweight power quality disturbances (PQDs) recognition model integrating a convolutional neural network (CNN) and a Transformer (CaT) is proposed to address the large parameter counts and high computational complexity of existing deep learning-based models. Depthwise separable convolutions are first employed to extract local features from the disturbance signals. An efficient soft-threshold block is then introduced to suppress noise and redundant features without significantly increasing the model's parameter count or computational complexity. A Transformer is used to capture global features of the disturbance signals. Finally, pooling layers, fully connected layers, and Softmax are applied to complete PQDs recognition. Simulation experiments demonstrate that the CaT model recognizes PQDs effectively with fewer parameters and floating-point operations, achieving high accuracy and strong noise robustness. Its lightweight, end-to-end design also yields shorter inference times than other deep learning models.
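    To make the pipeline concrete, the following is a minimal PyTorch sketch of the four stages the abstract describes: depthwise separable convolution for local features, a soft-threshold block for noise suppression, a Transformer encoder for global features, and pooling plus a linear classifier. The shrinkage-style threshold y = sign(x) * max(|x| - tau, 0) with a learned channel-wise tau, the layer widths, the two encoder layers, and the 17-class output are illustrative assumptions, not the paper's exact configuration.

# A minimal sketch of the CaT pipeline described in the abstract.
# Layer sizes, the soft-threshold design, and the Transformer
# hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn


class DepthwiseSeparableConv1d(nn.Module):
    """Depthwise convolution followed by a pointwise (1x1) convolution."""
    def __init__(self, in_ch, out_ch, kernel_size=7, stride=2):
        super().__init__()
        self.depthwise = nn.Conv1d(in_ch, in_ch, kernel_size, stride=stride,
                                   padding=kernel_size // 2, groups=in_ch)
        self.pointwise = nn.Conv1d(in_ch, out_ch, 1)
        self.norm = nn.BatchNorm1d(out_ch)
        self.act = nn.ReLU()

    def forward(self, x):  # x: (batch, channels, length)
        return self.act(self.norm(self.pointwise(self.depthwise(x))))


class SoftThresholdBlock(nn.Module):
    """Shrinkage-style soft thresholding with a channel-wise learned
    threshold tau (an assumed design, in the spirit of deep residual
    shrinkage networks): y = sign(x) * max(|x| - tau, 0)."""
    def __init__(self, channels):
        super().__init__()
        self.gate = nn.Sequential(nn.Linear(channels, channels), nn.Sigmoid())

    def forward(self, x):                  # x: (B, C, L)
        avg = x.abs().mean(dim=-1)         # (B, C) per-channel statistic
        tau = (avg * self.gate(avg)).unsqueeze(-1)  # per-channel threshold
        return torch.sign(x) * torch.relu(x.abs() - tau)


class CaT(nn.Module):
    def __init__(self, num_classes=17, dim=64, depth=2, heads=4):
        super().__init__()
        self.local = nn.Sequential(        # local feature extraction
            DepthwiseSeparableConv1d(1, dim // 2),
            DepthwiseSeparableConv1d(dim // 2, dim),
        )
        self.shrink = SoftThresholdBlock(dim)  # noise/redundancy suppression
        layer = nn.TransformerEncoderLayer(
            d_model=dim, nhead=heads, dim_feedforward=2 * dim,
            batch_first=True)
        self.globalenc = nn.TransformerEncoder(layer, depth)  # global features
        self.head = nn.Linear(dim, num_classes)

    def forward(self, x):                      # x: (B, 1, L) raw waveform
        x = self.shrink(self.local(x))         # (B, dim, L')
        x = self.globalenc(x.transpose(1, 2))  # (B, L', dim)
        x = x.mean(dim=1)                      # global average pooling
        return self.head(x)                    # logits; Softmax at inference


if __name__ == "__main__":
    model = CaT()
    signal = torch.randn(8, 1, 640)  # e.g. ten 50 Hz cycles at 3.2 kHz
    print(model(signal).shape)       # torch.Size([8, 17])

    Running the script prints torch.Size([8, 17]), one logit vector per input waveform; applying Softmax over the last dimension yields class probabilities at inference time.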

Cite this article:

张彼德, 邱杰, 娄广鑫, 周灿, 罗蜻清, 李天倩. A lightweight power quality disturbance recognition model based on CNN and Transformer[J]. 电力工程技术, 2025, 44(1): 69-78.

History
  • Received: 2024-09-07
  • Revised: 2024-11-24
  • Accepted: 2024-11-25
  • Published online: 2025-01-23
  • Published: 2025-01-28