Neurocomputing (ISI), Volume 461, October 2021, Pages 86–98

Title: Improving the backpropagation algorithm with consequentialism weight updates over mini-batches

Authors: Naeem Paeedeh, Kamaledin Ghiasi Shirazi

Access to the full text is not permitted by the authors.

Citation: BibTeX | EndNote

Abstract

Normalized least mean squares (NLMS) and the affine projection algorithm (APA) are two successful algorithms that improve the stability of least mean squares (LMS) by reducing the need to change the learning rate during training. In this paper, we extend them to multi-layer neural networks. We first prove that a multi-layer neural network can be regarded as a stack of adaptive filters, which opens the door to bringing successful algorithms from adaptive filtering to neural networks. We additionally introduce, for a single fully-connected (FC) layer, an interpretation that is more comprehensible than the complicated geometric interpretation of APA and that generalizes easily, for instance, to convolutional neural networks and mini-batch training. With this new viewpoint, we introduce a more robust algorithm that predicts and then amends the adverse consequences of some of the actions that take place in mini-batch backpropagation (BP), even before they happen. The proposed method is a modification of BP that can be used alongside stochastic gradient descent (SGD) and its momentum variants such as Adam and Nesterov. Our experiments show the usefulness of the proposed method in the training of deep neural networks: it is less sensitive to hyper-parameters and needs less intervention during the training process. In addition, it usually converges more smoothly and in fewer iterations. Such predictable behavior makes it easier to tune, keeps it resilient during training, and reduces or eliminates its reliance on other techniques such as momentum.
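For context on the adaptive-filtering algorithms named above, the following minimal NumPy sketch illustrates the standard LMS and NLMS update rules for a single adaptive filter (this is background only, not the paper's proposed consequentialism update). The point it shows is that NLMS normalizes the LMS step by the input energy, which is what reduces the sensitivity to the learning rate and the input scale.

```python
import numpy as np

def lms_update(w, x, d, mu=0.01):
    """One LMS step: w <- w + mu * e * x, where e = d - w.x is the output error."""
    e = d - w @ x
    return w + mu * e * x

def nlms_update(w, x, d, mu=0.5, eps=1e-8):
    """One NLMS step: the step size is normalized by the input energy ||x||^2,
    making the update far less sensitive to the scale of x."""
    e = d - w @ x
    return w + (mu / (eps + x @ x)) * e * x

# Toy usage: identify a noiseless linear target with NLMS.
rng = np.random.default_rng(0)
w_true = rng.normal(size=4)
w = np.zeros(4)
for _ in range(200):
    x = rng.normal(size=4)
    d = w_true @ x
    w = nlms_update(w, x, d)
print(np.allclose(w, w_true, atol=1e-2))  # should print True: w converges to w_true
```

APA generalizes the NLMS step by solving the update over a small block of recent input vectors instead of a single one, which is roughly analogous to the mini-batch setting discussed in the abstract.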

Keywords

Adaptive filters, LMS, NLMS, APA, Backpropagation, SGD

@article{paperid:1085794,
author = {Paeedeh, Naeem and Ghiasi Shirazi, Kamaledin},
title = {Improving the backpropagation algorithm with consequentialism weight updates over mini-batches},
journal = {Neurocomputing},
year = {2021},
volume = {461},
month = {October},
issn = {0925-2312},
pages = {86--98},
numpages = {12},
keywords = {Adaptive filters; LMS; NLMS; APA; Backpropagation; SGD},
}


%0 Journal Article
%T Improving the backpropagation algorithm with consequentialism weight updates over mini-batches
%A Paeedeh, Naeem
%A Ghiasi Shirazi, Kamaledin
%J Neurocomputing
%@ 0925-2312
%V 461
%P 86-98
%D 2021
