
An Overview of Artificial Neural Networks: Part 3 — Activation Functions

R. B. Dhumale

Abstract


This paper presents the concept of activation functions in Artificial Neural Networks (ANNs). The activation function is essential in an ANN to map the relationship between input and output; it is the nonlinear activation functions that introduce nonlinearity into this mapping. Threshold, linear, and nonlinear activation functions can all be used in an ANN. This paper explains the nonlinear activation functions, i.e., the sigmoidal activation function and the hyperbolic tangent activation function, and discusses how to select an activation function for an ANN. The effect of the type of activation function and of its slope on the Mean Square Error is analyzed with the help of nonlinear input-output data. The algorithm and MATLAB code are also included to help researchers.


Keywords


Activation Functions, Threshold, Linear, Sigmoid, Hyperbolic Tangent.
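The two nonlinear activation functions the abstract names can be sketched briefly. A minimal illustration follows, written in Python for self-containment (the paper itself provides MATLAB code); the slope parameter `a` is the quantity whose effect on the Mean Square Error the paper analyzes:

```python
import math

def sigmoid(x, a=1.0):
    # Logistic (sigmoidal) activation with slope parameter a; output in (0, 1)
    return 1.0 / (1.0 + math.exp(-a * x))

def tanh_act(x, a=1.0):
    # Hyperbolic tangent activation with slope parameter a; output in (-1, 1)
    return math.tanh(a * x)

# A steeper slope drives the output toward the saturation limits faster,
# making the unit behave more like a hard threshold.
for a in (0.5, 1.0, 2.0):
    print(f"a={a}: sigmoid(1)={sigmoid(1.0, a):.4f}, tanh(1)={tanh_act(1.0, a):.4f}")
```

Note that the two functions are related by `tanh(a*x) = 2*sigmoid(2*a*x) - 1`, so the choice between them mainly shifts the output range from (0, 1) to (-1, 1).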


References


Helmut A. Mayer, "A Taxonomy of the Evolution of Artificial Neural Systems", Department of Scientific Computing, University of Salzburg, Austria, pp. 1-15.

T. Kavzoglu, P. M. Mather, "The use of backpropagating artificial neural networks in land cover classification", International Journal of Remote Sensing, Volume 24, Issue 23, 2003, pp. 4907-4938.

David R. Dowling, Zoran Filipi, "Sample records for small dimensionless parameter", U.S. Department of Energy, 1996, Volume 1, Issue 01.

Tarek Sobh, Khaled Elleithy, "Innovations in Computing Sciences and Software Engineering", Springer Science and Business Media B.V., 2010.

Viv Bewick, Liz Cheek, Jonathan Ball, "Statistics review 7: Correlation and regression", BioMed Central, Volume 7, Issue 6, 5 Nov 2003, pp. 451-459.

Jurgen Schmidhuber, "Deep Learning in Neural Networks: An Overview", arXiv, 8 Oct 2014.

Peter Tino, Lubica Benuskova, Alessandro Sperduti, "Artificial Neural Network Models", Springer Handbook of Computational Intelligence, 2015, pp. 455-471.

Wolfgang Maass, "Energy-efficient neural network chips approach human recognition capabilities", Proc Natl Acad Sci USA, Volume 113(41), pp. 11387-11389, 11 Oct 2016.

Miguel Tome, “Artificial Computation in Biology and Medicine: Part 1”.

Wolfgang Kliemann, S. Namachchivaya, "Nonlinear Dynamics and Stochastic Mechanics".

F. Thompson, Rachel A. Kuske, Adam H. Monahan, "Stochastic averaging of dynamical systems with multiple time scales forced with α-stable noise", arXiv, 8 Oct 2014.

K. Vijayarekha, "Activation Functions", School of Electrical and Electronics Engineering, SASTRA University, pp. 1-3.

R. B. Dhumale, M. P. Ghatule, N. D. Thombare, P. M. Bangare, "An Overview of Artificial Neural Networks: Part 1", CiiT International Journal of Artificial Intelligent Systems and Machine Learning, Feb 2018, Vol. 10, No. 2 (Accepted).

R. B. Dhumale, S. D. Lokhande, "Neural Network Fault Diagnosis of Voltage Source Inverter under Variable Load Conditions at Different Frequencies", Measurement, Vol. 91, pp. 565-575, Sept. 2016.

Parveen Sehgal, Sangeeta Gupta, Dharminder Kumar, "Minimization of Error in Training a Neural Network Using Gradient Descent Method", International Journal of Technical Research (IJTR), Vol. 1, Issue 1, Mar-Apr 2012, pp. 1-3.

