THE DEEP NEURAL NETWORK-A REVIEW

  • Eman Jawad, Assistant Lecturer, Al-Furat Al-Awsat Technical University, Iraq.
Keywords: Neural Networks, Deep Neural Networks, ReLU Function, Optimization

Abstract

Deep neural networks are considered the backbone of artificial intelligence. This article presents a review of the importance of neural networks and their role in other sciences, their characteristics, network architectures, types, the mathematical definition of deep neural networks, and their applications.
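As background for the topics listed in the abstract, the following is a standard mathematical definition of a feed-forward deep neural network with ReLU activation; the notation (L layers, weights W_l, biases b_l) is illustrative and not necessarily the notation used in the full article.

\[
f(x) \;=\; W_L\,\sigma\bigl(W_{L-1}\,\sigma(\cdots\,\sigma(W_1 x + b_1)\,\cdots) + b_{L-1}\bigr) + b_L,
\qquad \sigma(t) = \max(0,\, t),
\]

where \(x \in \mathbb{R}^{n_0}\) is the input, \(W_l \in \mathbb{R}^{n_l \times n_{l-1}}\) and \(b_l \in \mathbb{R}^{n_l}\) are the weights and biases of layer \(l\), and the ReLU function \(\sigma\) is applied componentwise. Optimization, in this context, refers to fitting the weights and biases by minimizing a loss function over training data.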


Published
2023-09-23
How to Cite
Jawad, E. (2023). THE DEEP NEURAL NETWORK-A REVIEW. IJRDO -JOURNAL OF MATHEMATICS, 9(9), 1-5. https://doi.org/10.53555/m.v9i9.5842