Two-Point Step Size Gradient Method for Solving a Deep Learning Problem



Abstract

This paper analyzes the rate of deep belief learning by multilayer neural networks. In designing neural networks, many authors have applied the mean field approximation (MFA) to establish that the state of the neurons in the hidden layers is active. To study the convergence of the MFAs, we transform the original problem into a minimization problem. The object of investigation is the Barzilai–Borwein method for solving the resulting optimization problem. The essence of this two-point step size gradient method is its variable steplength, which depends on the objective functional. New steplengths are obtained and compared with the classical steplength. Sufficient conditions for the existence and uniqueness of the weak solution are established, and a rigorous proof of the convergence theorem is presented. Numerical tests with different kinds of weight matrices are discussed.
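Since the abstract centers on the variable steplength of the two-point method, a minimal sketch of the classical Barzilai–Borwein iteration may help orient the reader. With s_{k-1} = x_k − x_{k−1} and y_{k-1} = ∇f(x_k) − ∇f(x_{k−1}), the classical (BB1) steplength is α_k = (s_{k-1}ᵀ s_{k-1}) / (s_{k-1}ᵀ y_{k-1}). The sketch below implements only this classical rule, not the paper's new steplengths; the quadratic test objective and all parameter values are illustrative assumptions.

```python
import numpy as np

def barzilai_borwein(grad, x0, alpha0=1e-3, tol=1e-8, max_iter=500):
    """Classical two-point step size (Barzilai-Borwein) gradient descent.

    grad   : callable returning the gradient of the objective at x
    x0     : starting point
    alpha0 : steplength for the very first step (no previous point yet)
    """
    x_prev = np.asarray(x0, dtype=float)
    g_prev = grad(x_prev)
    x = x_prev - alpha0 * g_prev          # plain gradient step to get two points
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        s = x - x_prev                    # difference of successive iterates
        y = g - g_prev                    # difference of successive gradients
        sy = s @ y
        # BB1 steplength alpha = (s^T s)/(s^T y); fall back if curvature is non-positive
        alpha = (s @ s) / sy if sy > 0 else alpha0
        x_prev, g_prev = x, g
        x = x - alpha * g
    return x

# Hypothetical usage on a convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose gradient is A x - b (not a problem taken from the paper).
A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
x_star = barzilai_borwein(lambda x: A @ x - b, x0=np.zeros(3))
```

The alternative BB2 steplength, α_k = (s_{k-1}ᵀ y_{k-1}) / (y_{k-1}ᵀ y_{k-1}), would drop into the same loop in place of the BB1 formula.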

About the authors

T. Todorov

Department of Mathematics and Informatics, Technical University

Principal contact for editorial correspondence.
Email: t.todorov@yahoo.com
Gabrovo, Bulgaria

G. Tsanev

Department of Computer Systems and Technology, Technical University

Email: t.todorov@yahoo.com
Gabrovo, Bulgaria


Copyright © Springer Science+Business Media, LLC, part of Springer Nature, 2019