Optimization of Activation Function in Neural Network Based on ArcReLU Function
Authors: Xu Yunjie, Xu Feifei

Affiliation: College of Computer Science and Technology, Shanghai University of Electric Power, Shanghai 200090, China

Funding:

Supported by the National Natural Science Foundation of China (61272437, 61305094) and by the "Chenguang Program" of the Shanghai Education Development Foundation and Shanghai Municipal Education Commission (13CG58).

Abstract:

Deep learning has developed rapidly in recent years. Because the concept of deep learning originates from neural networks, and the activation function is an indispensable part of a neural network model for learning non-linear functions, this paper studies and compares the commonly used activation functions. Addressing their slow convergence, local minima, and vanishing gradients in back-propagation neural networks, the Sigmoid-family and ReLU-family activation functions are compared, their performance is discussed, and the advantages and shortcomings of several common activation functions are analyzed in detail. By studying the feasibility of applying the Arctan function in neural networks and combining it with the ReLU function, a new activation function, ArcReLU, is proposed. Experiments show that ArcReLU not only significantly accelerates the training of back-propagation neural networks but also effectively reduces the training error and avoids the vanishing-gradient problem.
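To make the construction concrete, below is a minimal sketch in Python/NumPy of one plausible way to splice an Arctan branch onto ReLU. The abstract does not give the exact ArcReLU formula, so the piecewise definition, the function names arc_relu and arc_relu_grad, and the absence of any scaling constants are illustrative assumptions, not the authors' published definition.

```python
import numpy as np

def arc_relu(x):
    # Hypothetical ArcReLU sketch (assumed form, not the paper's formula):
    # identity on the positive side, as in ReLU; arctan on the
    # non-positive side, so the output saturates gently instead of
    # being clamped to zero.
    return np.where(x > 0, x, np.arctan(x))

def arc_relu_grad(x):
    # Derivative of the sketch above: 1 for x > 0 and 1 / (1 + x^2)
    # otherwise. It is strictly positive everywhere, which is the
    # property that avoids the vanishing-gradient / "dying ReLU"
    # behaviour the abstract describes.
    return np.where(x > 0, 1.0, 1.0 / (1.0 + x ** 2))

x = np.array([-3.0, -0.5, 0.0, 0.5, 3.0])
print(arc_relu(x))       # arctan branch for x <= 0, identity for x > 0
print(arc_relu_grad(x))  # gradient stays nonzero even for very negative x
```

Because the negative-side gradient of this sketch decays only quadratically, back-propagated errors are attenuated rather than zeroed out, which is consistent with the faster convergence and lower training error reported in the abstract.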

Cite this article:

Xu Yunjie, Xu Feifei. Optimization of Activation Function in Neural Network Based on ArcReLU Function[J]. Journal of Data Acquisition and Processing, 2019, 34(3): 517-529.

History
  • Received: 2018-05-23
  • Revised: 2019-04-08
  • Published online: 2019-06-12