Multilayer Artificial Neural Networks with S-Parabola Activation Function and their Applications
- Authors: Khachumov M.V. (1,2,3), Emelyanova Y.G. (1)
- Affiliations:
  1. A. K. Ailamazyan Program Systems Institute of Russian Academy of Sciences
  2. Computer Science and Control Federal Research Center of Russian Academy of Sciences
  3. RUDN University
- Issue: No 3 (2024)
- Pages: 42-53
- Section: Machine Learning, Neural Networks
- URL: https://journal-vniispk.ru/2071-8594/article/view/265358
- DOI: https://doi.org/10.14357/20718594240304
- EDN: https://elibrary.ru/KINYVP
- ID: 265358
Abstract
A review of recent work on building fast neurons and neural networks is presented. An algorithm is given for training a multilayer feedforward neural network with an activation function of the "s-parabola" type. Training is based on the backpropagation method, adapted to this new function. Examples of using the s-parabola in artificial neural networks for time-series recognition and prediction problems are considered. Recognition was demonstrated on typical domestic aircraft, where the objects' overall dimensions and the invariant moments of their profiles were used as features. For time-series prediction, readings from one of the sensors of a small spacecraft were used. The quality of solutions obtained with the proposed approach was compared against that of neural networks with the traditional sigmoid. The s-parabola's advantage in training speed and in the subsequent solution of the applied problems is shown.
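The abstract does not reproduce the s-parabola's exact formula, so the sketch below is purely illustrative: it assumes a hypothetical s-shaped activation stitched from two parabola arcs on [0, 1], together with the piecewise derivative that a backpropagation rule adapted to such a function would need. Both the function and its parameterization are assumptions for illustration, not the authors' definition.

```python
import numpy as np

def s_parabola(x):
    """Hypothetical s-shaped activation built from two parabola arcs,
    saturating at 0 and 1. NOT necessarily the paper's exact definition:
    rises as 2*x^2 on [0, 0.5] and as 1 - 2*(1-x)^2 on [0.5, 1]."""
    x = np.clip(x, 0.0, 1.0)
    return np.where(x < 0.5, 2.0 * x**2, 1.0 - 2.0 * (1.0 - x)**2)

def s_parabola_grad(x):
    """Piecewise-linear derivative of the arcs above; zero in the
    saturated regions outside (0, 1), as backpropagation would use it."""
    inside = (x > 0.0) & (x < 1.0)
    g = np.where(x < 0.5, 4.0 * x, 4.0 * (1.0 - x))
    return np.where(inside, g, 0.0)

# One illustrative backpropagation step for a single neuron under
# squared-error loss: dL/dz = (y - t) * f'(z).
z = np.array([0.3])            # pre-activation
y = s_parabola(z)              # neuron output
t = np.array([0.9])            # target
grad_z = (y - t) * s_parabola_grad(z)
```

Because both branches are cheap polynomials, evaluating this activation and its derivative avoids the exponentials of the sigmoid, which is consistent with the training-speed advantage the abstract reports.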
About the authors
Mikhail V. Khachumov
A. K. Ailamazyan Program Systems Institute of Russian Academy of Sciences; Computer Science and Control Federal Research Center of Russian Academy of Sciences; RUDN University
Author for correspondence.
Email: khmike@inbox.ru
Candidate of Physical and Mathematical Sciences, Senior Researcher; Senior Researcher; Associate Professor
Russian Federation, Pereslavl-Zalessky; Moscow; Moscow
Yulia G. Emelyanova
A. K. Ailamazyan Program Systems Institute of Russian Academy of Sciences
Email: yuliya.emelyanowa2015@yandex.ru
Candidate of Technical Sciences, Senior Researcher
Russian Federation, Pereslavl-Zalessky
References
- Lavin A., Gray S. Fast Algorithms for Convolutional Neural Networks // IEEE Conference on Computer Vision and Pattern Recognition (CVPR). 2016. P. 4013-4021. doi: 10.1109/CVPR.2016.435.
- Valueva M.V., Lyahov P.A., Nagornov N.N., Valuev G.V. Vysokoproizvoditel'nye arhitektury cifrovoj fil'tracii izobrazhenij v sisteme ostatochnyh klassov na osnove metoda Vinograda [High-performance digital image filtering architectures in the residue number system based on the Winograd method] // Komp'yuternaya optika [Computer optics]. 2022. V. 46. No 5. P. 752-762. doi: 10.18287/2412-6179-CO-933.
- Lebedev V. Algorithms for speeding up convolutional neural networks: Doctoral Thesis. Skolkovo Institute of Science and Technology. Doctoral program in computational and data science and engineering. Moscow, 2018. P. 106.
- Arce F., Zamora E., Humberto S., Barrón R. Differential evolution training algorithm for dendrite morphological neural networks // Applied Soft Computing. 2018. V. 68. P. 303-313.
- Dimitriadis N., Maragos P. Advances in the training, pruning and enforcement of shape constraints of morphological neural networks using tropical algebra // IEEE International Conference On Acoustics, Speech And Signal Processing. 2021. P. 3825-3829.
- Limonova E.E., Nikolaev D.P., Alfonso D.M., Arlazarov V.V. Bipolar morphological neural networks: gate-efficient architecture for computer vision // IEEE Access. 2021. V. 9. P. 97569-97581.
- Dubey Sh.R., Singh S.K., Chaudhuri B.B. Activation Functions in Deep Learning: A Comprehensive Survey and Benchmark // Neurocomputing. 2022. V. 503. No 11. P. 1-18. doi: 10.1016/j.neucom.2022.06.111.
- Feng J., Lu Sh. Performance Analysis of Various Activation Functions in Artificial Neural Networks // Journal of Physics Conference Series. 2019. P. 1-7. doi: 10.1088/1742-6596/1237/2/02203.
- Khachumov M.V., Emelyanova Yu.G. Parabola kak funktsiya aktivatsii iskusstvennykh neyronnykh setey [Parabola as an activation function of artificial neural networks] // Iskusstvenniy intellekt i prinyatie resheniy [Artificial Intelligence and Decision Making]. 2023. No 2. P. 89-97. doi: 10.14357/20718594230207 (RSCI).
- Khachumov M.V., Emelyanova Yu.G., Emelyanov M.A., Khachumov V.M. Logicheskiy basis na neyronakh s parabolicheskoy funktsiey aktivatsii [Logical basis on neurons with a parabolic activation function] // Materialy Dvadtsat' tret'ey Mezhdunarodnoy konferentsii po Vychislitel'noy mekhanike i sovremennym prikladnym programmnym sistemam (VMSPPS'2023) [Materials of the Twenty-third International Conference on Computational Mechanics and Modern Application Software Systems (CMMASS'2023)]. 2023. P. 144-146.
- Sosnin A.S., Suslova I.A. Neural network activation functions: sigmoid, linear, step, relu, tanh // Science. Information support. Technology. Education. The Proceedings of XII international research and practice conference. 2019. P. 237-246.
- Voennaya aviatsiya Rossii: obzor boevykh samolyotov [Russian Military Aviation: Combat Aircraft Review] // Electronic resource. URL: https://pilotgid.ru/samolety/voennye-samolety-rossii.html (accessed 05.10.2023).
- Nguen D.Th. Invarianty v zadachah raspoznavaniya graficheskih obrazov [Invariants in graphic pattern recognition problems] // Vestnik RUDN. Seriya Matematika. Informatika. Fizika [RUDN Journal of Mathematics, Information Sciences and Physics]. 2016. No 1. P. 76-85.
