Parabola as an Activation Function of Artificial Neural Networks
- Authors: Khachumov M.V. (1, 2, 3), Emelyanova Y.G. (1)
- Affiliations:
  1. Ailamazyan Program Systems Institute of the Russian Academy of Sciences
  2. Computer Science and Control Federal Research Center of the Russian Academy of Sciences
  3. Peoples' Friendship University of Russia
- Issue: No 2 (2023)
- Pages: 89-97
- Section: Machine Learning, Neural Networks
- URL: https://journal-vniispk.ru/2071-8594/article/view/269436
- DOI: https://doi.org/10.14357/20718594230207
- ID: 269436
Abstract
The use of a parabola and its branches as a nonlinearity that expands the logical capabilities of artificial neurons is considered. In particular, the applicability of parabola branches to constructing an S-shaped function suitable for training a neural network by error backpropagation is established. The implementation of the XOR function on two and three neurons using the proposed approach is demonstrated. The main advantage of the parabola over the sigmoid is its simpler implementation, which speeds up the operation of artificial neural networks.
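The abstract does not reproduce the paper's formulas, so the Python sketch below only illustrates the idea under stated assumptions: an S-shaped activation assembled from two parabola branches (a hypothetical construction, chosen so the curve is continuously differentiable and therefore usable with error backpropagation), and a single parabola-activated neuron that computes XOR exactly on Boolean inputs. The function names and coefficients are illustrative, not taken from the paper.

```python
import numpy as np

def parabolic_sigmoid(x):
    """S-shaped function built from two parabola branches.

    Assumed construction: rising branch 0.5*(x+1)^2 on [-1, 0],
    falling branch 1 - 0.5*(x-1)^2 on [0, 1], clamped to 0 and 1
    outside [-1, 1]. The branches meet at (0, 0.5) with equal slope,
    so the curve is C^1 -- sufficient for backpropagation -- while
    avoiding the exponential of the classical sigmoid.
    """
    x = np.clip(x, -1.0, 1.0)
    return np.where(x < 0.0,
                    0.5 * (x + 1.0) ** 2,
                    1.0 - 0.5 * (x - 1.0) ** 2)

def parabolic_sigmoid_grad(x):
    """Derivative of the piecewise-parabolic curve: a cheap 'tent' 1 - |x|."""
    return np.where(np.abs(x) < 1.0, 1.0 - np.abs(x), 0.0)

def xor_parabola_neuron(x1, x2):
    """XOR on a single neuron with a full-parabola nonlinearity.

    The weighted sum s = x1 + x2 - 1 takes the values -1, 0, 0, 1 on
    the four Boolean input pairs, so 1 - s^2 reproduces XOR exactly:
    a non-monotone parabola separates the middle cases by itself,
    which a monotone activation cannot do on one neuron.
    """
    s = x1 + x2 - 1.0
    return 1.0 - s ** 2

if __name__ == "__main__":
    for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
        print(f"XOR({a}, {b}) = {xor_parabola_neuron(a, b):.0f}")
```

The paper reports XOR on two and three neurons; the single-neuron form above only shows the core effect, namely that a parabolic nonlinearity resolves the linear inseparability of XOR by itself, and the paper's trainable networks presumably embed the same idea in weighted neurons with an S-shaped activation.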
About the authors
Mikhail V. Khachumov
Ailamazyan Program Systems Institute of the Russian Academy of Sciences; Computer Science and Control Federal Research Center of the Russian Academy of Sciences; Peoples' Friendship University of Russia
Author for correspondence.
Email: khmike@inbox.ru
Candidate of Physical and Mathematical Sciences, Senior Researcher; Senior Researcher; Associate Professor
Russian Federation, Veskovo; Moscow; Moscow
Yulia G. Emelyanova
Ailamazyan Program Systems Institute of the Russian Academy of Sciences
Email: yuliya.emelyanowa2015@yandex.ru
Candidate of Technical Sciences, Researcher
Russian Federation, Veskovo
