RECOGNITION OF TEXT PHRASES DISTORTED BY INTERFERENCE BY BACK PROPAGATION NEURAL NETWORK

Authors

  • D. P. Kucherov National Aviation University, Kyiv
  • V. G. Tkachenko National Aviation University, Kyiv
  • I.-F. F. Kashkevych National Aviation University, Kyiv
  • A. O. Androshchuk National Aviation University, Kyiv
  • S. O. Perepelitsyn Institute of Netcentric LLC, Kyiv

DOI:

https://doi.org/10.18372/1990-5548.65.14987

Keywords:

Back propagation neural network, text recognition, recognition probability

Abstract

The paper considers risk-bearing systems, which are used not only in nuclear facilities but also in other plants that pose frequent risks to human life, such as mines. Such facilities employ information systems in which text messages are exchanged over free space. The main problem of radio reception is the growing number of emitting devices, which raises the noise level at the receiver. As an additional means of processing distorted textual information, it is proposed to use a neural network, which must be pre-configured. The back-propagation neural network was selected for analysis. Tuning is carried out by an algorithm that assumes double differentiation of the error function, which ensures a high convergence rate of the network. Learning is stopped according to a total criterion for the deviation of the output signal from the reference. The paper formulates conditions for the quadratic convergence of the back-propagation network with a new tuning procedure, and also offers examples of constructing a neural network for recognizing a text message under various reception conditions. The input fed to the neural network is a sequence of letters of the English alphabet. A feature of the network structure that provides correct recognition is the use of fully nonlinear neurons. Variants of the network structure are compared, when recognizing text phrases, by the probability of recognition, the error, and the training time. The established properties of the neural network are useful in the design of efficient information systems.
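The tuning scheme summarized above can be sketched in code. The following is a hypothetical illustration, not the authors' implementation: a tiny two-layer back-propagation network with fully nonlinear (sigmoid) neurons, whose weights are adjusted by a Levenberg-Marquardt-style step that approximates the second derivative (Hessian) of the error function by J^T J, and whose training stops by a total criterion on the deviation of the output from the reference. The four-letter "alphabet", layer sizes, noise level, and damping schedule are all arbitrary placeholders chosen for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy "alphabet": 4 letters encoded as one-hot vectors; the network must
# reproduce the clean code from an interference-distorted copy.
letters = np.eye(4)
X = letters + 0.1 * rng.standard_normal(letters.shape)  # distorted input
T = letters                                             # reference output

n_in, n_hid, n_out = 4, 8, 4
w = 0.5 * rng.standard_normal(n_in * n_hid + n_hid * n_out)  # flat weights

def unpack(w):
    W1 = w[:n_in * n_hid].reshape(n_in, n_hid)
    W2 = w[n_in * n_hid:].reshape(n_hid, n_out)
    return W1, W2

def residuals(w):
    W1, W2 = unpack(w)
    H = sigmoid(X @ W1)      # hidden layer: fully nonlinear neurons
    Y = sigmoid(H @ W2)      # output layer: fully nonlinear neurons
    return (Y - T).ravel()   # deviation of output from reference

def jacobian(w, eps=1e-6):
    # Finite-difference Jacobian of the residuals w.r.t. the weights.
    r0 = residuals(w)
    J = np.empty((r0.size, w.size))
    for j in range(w.size):
        wp = w.copy()
        wp[j] += eps
        J[:, j] = (residuals(wp) - r0) / eps
    return J

mu = 1e-2                    # damping factor of the LM step
E = 0.5 * float(residuals(w) @ residuals(w))  # total squared deviation
for epoch in range(100):
    if E < 1e-3:             # stop by the total-deviation criterion
        break
    r = residuals(w)
    J = jacobian(w)
    # Second-order step: (J^T J + mu*I) dw = -J^T r, where J^T J
    # approximates the doubly differentiated error function.
    dw = np.linalg.solve(J.T @ J + mu * np.eye(w.size), -J.T @ r)
    r_new = residuals(w + dw)
    E_new = 0.5 * float(r_new @ r_new)
    if E_new < E:            # accept the step, relax the damping
        w, E, mu = w + dw, E_new, mu / 10
    else:                    # reject the step, increase the damping
        mu *= 10

Y_final = residuals(w).reshape(T.shape) + T
print("total error:", E, "all letters recognized:",
      bool((Y_final.argmax(1) == T.argmax(1)).all()))
```

The damped normal-equation step gives the quadratic-type convergence that a plain gradient back-propagation update lacks, at the cost of building the Jacobian at every iteration; this trade-off is what the paper's comparison by training time examines on a larger scale.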

Author Biographies

D. P. Kucherov, National Aviation University, Kyiv

Department of Computerized Control Systems

Doctor of Engineering Science. Senior Research Fellow

orcid.org/0000-0002-4334-4175

V. G. Tkachenko, National Aviation University, Kyiv

Department of Computerized Control Systems

Candidate of Science (Engineering). Associate Professor

orcid.org/0000-0002-1759-7269

I.-F. F. Kashkevych, National Aviation University, Kyiv

Department of Computerized Control Systems

Post-graduate Student

orcid.org/0000-0003-3135-0751

A. O. Androshchuk, National Aviation University, Kyiv

Student

orcid.org/0000-0002-0156-9691

S. O. Perepelitsyn, Institute of Netcentric LLC, Kyiv

Head of Research Institution

orcid.org/0000-0002-8435-2729

References

Y. Shi, J. Chen, J. Hao, J. Bi, M. Qi and X. Wang, “Statistical Analysis of Coal Mine Accidents of China in 2018,” in Proc. 2019 Prognostics and System Health Management Conference (PHM-Qingdao), 2019, pp. 1–6. DOI: 10.1109/PHM-Qingdao46334.2019.8942991

R. K. Ur and G. Heo, “Risk Informed Optimization of Instrumentation and Control (I&C) Architecture,” in Proc. 2017 International Conference on Innovations in Electrical Engineering and Computational Technologies (ICIEECT), 2017, pp. 1–5. DOI: 10.1109/ICIEECT.2017.7916585

J. Shi and G. Wang, “Risk-Informed Periodic Surveillance Testing Interval of Digital Safety Systems with Self-Diagnosis Capacity,” in Proc. 2014 10th International Conference on Reliability, Maintainability and Safety (ICRMS), 2014, pp. 1156–1160. DOI: 10.1109/ICRMS.2014.7107385

C. Yao, X. Bai, B. Shi and W. Liu, “Strokelets: A Learned Multi-Scale Representation for Scene Text Recognition,” in Proc. Conference on Computer Vision and Pattern Recognition, 2014, pp. 4042–4049. DOI: 10.1109/CVPR.2014.515

Q. Yang, H. Jin, J. Huang and W. Lin, (2020, Mar). SwapText: Image Based Texts Transfer in Scenes. Presented at CVPR 2020. [Online]. Available: http://arxiv.org/abs/2003.08152v1

A. Oord, N. Kalchbrenner and K. Kavukcuoglu, (2016, Aug). Pixel Recurrent Neural Networks. [Online]. Available: https://arxiv.org/abs/1601.06759

S. Lapuschkin, A. Binder, G. Montavon, K.-R. Müller and W. Samek, (2016, Jun). “The LRP Toolbox for Artificial Neural Networks,” Journal of Machine Learning Research. [Online]. 17(114), pp. 1–5. Available: http://publica.fraunhofer.de/documents/N-421912.html [Accessed: Mar. 2, 2020]

J. Schmidhuber, “Deep Learning in Neural Networks: An Overview,” Neural Networks, vol. 61, pp. 85–117, Jan 2015. DOI: 10.1016/J.NEUNET.2014.09.003

R. Vargas and L. Ruiz, (2017, Nov). Deep Learning: Previous and Present Applications. Journal of Awareness. [Online]. 2(3), pp. 11–20. Available: https://journals.gen.tr/index.php/joa/article/view/306/258

E. Cengil, A. Çınar and E. Özbay, “Image Classification with Caffe Deep Learning Framework,” in Proc. International Conference on Computer Science and Engineering, 2017, pp. 440–445. DOI: 10.1109/UBMK.2017.8093433

G. E. Hinton, S. Osindero and Y.-W. Teh, (2006, Aug). A Fast Learning Algorithm for Deep Belief Nets. Neural Computation. 18(7), pp. 1527–1554. DOI: 10.1162/neco.2006.18.7.1527

L. Deng and D. Yu, “Deep Learning: Methods and Applications,” Foundations and Trends in Signal Processing, vol. 7, no. 3–4, pp. 197–387, Jun. 2014. DOI: 10.1561/2000000039

D. P. Kucherov, I. V. Ohirko, O. I. Ohirko and T. I. Golenkovskaya, “Neural Network Technologies for Recognition Characters,” Electronics and Control Systems, vol. 4, no. 46, pp. 65–71, Dec. 2015. DOI: 10.18372/1990-5548.46.9967

D. P. Kucherov, V. N. Dikhtyarenko and A. M. Kozub, (2016, Feb). “An Algorithm of Setting the Weights of a Neural Network Controller,” Journal of Automation and Control Engineering, 4(1), pp. 1–7. DOI: 10.12720/joace.4.1.1-7

H. P. Gavin, (2019, Aug.) The Levenberg-Marquardt Algorithm for Nonlinear Least Squares Curve-Fitting Problems. Dep. of Civil and Envir. Engin., Duke University, Durham. [Online]. Available: http://people.duke.edu/~hpgavin/ce281/lm.pdf

M. K. Transtrum and J. P. Sethna, (2012, Jan.) Improvements to the Levenberg-Marquardt Algorithm for Nonlinear Least-Squares Minimization. Lab. of Atomic and Solid State Physics, Cornell University, Ithaca. [Online]. Available: https://arxiv.org/pdf/1201.5885.pdf

W. Zhang, “An Extended Adaline Neural Network Trained by Levenberg-Marquardt Method for System Identification of Linear Systems,” in Proc. 25th Chinese Control and Decision Conference (CCDC), 2013, pp. 2453–2458. DOI: 10.1109/CCDC.2013.6561351

X. Li, S. Liu, H. Li, J. Wang and T. Zhang, “Research on Prediction Model of Bending Force Based on BP Neural Network with LM Algorithm,” in Proc. 25th Chinese Control and Decision Conference (CCDC), 2013, pp. 832–836. DOI: 10.1109/CCDC.2013.6561037

J. Bilski, B. Kowalczyk and K. Grzanek, “The Parallel Modification to the Levenberg-Marquardt Algorithm,” in Proc. International Conference on Artificial Intelligence and Soft Computing (ICAISC-2018), 2018, pp. 15–24. DOI: 10.1007/978-3-319-91253-0

S. Haykin, Neural Networks: A Comprehensive Foundation. Singapore: Pearson Education, 2005. ISBN-13: 978-0-13-147139-9; ISBN-10: 0-13-147139-2

B. T. Polyak, “Newton's Method and its Role in Optimization and Computational Mathematics,” European Journal of Operational Research, vol. 181, no. 3, pp. 1086–1096, Sep. 2007. DOI: 10.1016/j.ejor.2005.06.076

P. E. Gill, W. Murray and M. H. Wright, Practical Optimization. London, UK: Academic Press, 1981.

Section

AUTOMATION AND COMPUTER-INTEGRATED TECHNOLOGIES