
Survey on Neural Networks: Algorithms, Types, and Applications

M. Anusha, C.B. Selvalakshmi, N. Nandini

Abstract


A neural network is a system of programs and data structures that approximates the operation of the human brain. It usually involves a large number of processors operating in parallel, each with its own small sphere of knowledge and access to the data in its local memory. A neural network is initially trained by being fed large amounts of data together with rules about the relationships within that data. Neural networks draw on several principles, such as gradient-based training, fuzzy logic, genetic algorithms, and Bayesian methods, and they are described in terms of layers of knowledge, with more complex networks having deeper layers. Because of their ability to “learn” from data, their nonparametric nature, and their ability to generalize, neural networks are widely used in forecasting and business classification applications. This paper presents a comparative study of neural network algorithms.
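
As a rough illustration of the layered, gradient-based training described above (a minimal sketch assuming NumPy, not code from any of the surveyed algorithms), the example below trains a one-hidden-layer network with plain gradient descent on the XOR problem. The layer sizes, learning rate, and loss function are illustrative assumptions.

import numpy as np

# Minimal sketch: a one-hidden-layer network trained by gradient descent on XOR.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)   # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)               # XOR targets

W1 = rng.normal(scale=1.0, size=(2, 8)); b1 = np.zeros((1, 8))  # hidden layer
W2 = rng.normal(scale=1.0, size=(8, 1)); b2 = np.zeros((1, 1))  # output layer

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for step in range(5000):
    # Forward pass through the two layers.
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)

    # Gradient of the binary cross-entropy loss at the output pre-activation.
    g2 = p - y
    gW2 = h.T @ g2
    gb2 = g2.sum(axis=0, keepdims=True)

    # Backpropagate the error into the hidden layer.
    g1 = (g2 @ W2.T) * h * (1 - h)
    gW1 = X.T @ g1
    gb1 = g1.sum(axis=0, keepdims=True)

    # Plain gradient-descent updates.
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

print(np.round(p, 2))   # predictions should approach [0, 1, 1, 0]

The same forward/backward pattern extends to deeper networks; each additional layer simply adds one more matrix multiplication in the forward pass and one more backpropagation step for its gradients.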


Keywords


Artificial Neural Network, Neurons, Networks, Perceptron






DOI: http://dx.doi.org/10.36039/AA022014002.



This work is licensed under a Creative Commons Attribution 3.0 License.