Adaptation of general software testing concepts to neural networks

 
PII: S013234740001214-0-1
DOI: 10.31857/S013234740001214-0
Publication type: Article
Status: Published
Authors
Affiliation: OOO Luxoft Professional
Address: Russian Federation
Affiliation: Federal Research Center "Computer Science and Control" of the Russian Academy of Sciences
Address: Russian Federation
Affiliation: Ivannikov Institute for System Programming of the Russian Academy of Sciences
Address: Russian Federation
Journal name: Programmirovanie
Edition: Issue 5
Pages: 43–56
Abstract

Keywords
Acknowledgment: This work was supported by the Russian Foundation for Basic Research, grant number 18-07-00697_a.
Received: 26.10.2018
Publication date: 28.10.2018
Number of characters: 456

References

1. Ciresan, D., Meier, U., Masci, J., and Schmidhuber, J. Multi-column deep neural network for traffic sign classification, Neural Networks (Selected Papers from IJCNN 2011), August 2012, vol. 32, pp. 333–338.

2. Talbot, D. CES 2015: Nvidia Demos a Car Computer Trained with “Deep Learning”. A commercial device uses powerful image and information processing to let cars interpret 360° camera views. MIT Technology Review, January 6, 2015.

3. Schmidt, U. and Roth, S. Shrinkage Fields for Effective Image Restoration, IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2014.

4. Deng, L. and Yu, D. Deep Learning: Methods and Applications, Foundations and Trends in Signal Processing, 2014, vol. 7, no. 3–4, pp. 1–19.

5. Gao, J., He, X., Yih, W.-t., and Deng, L. Learning Continuous Phrase Representations for Translation Modeling, Proc. of ACL 2014, Microsoft Research. www.aclweb.org/anthology/P14-1066

6. Chicco, D., Sadowski, P., and Baldi, P. Deep autoencoder neural networks for gene ontology annotation predictions, Proceedings of the 5th ACM Conference on Bioinformatics, Computational Biology, and Health Informatics, 2014, pp. 533–540.

7. Sathyanarayana, A., Joty, S., Fernandez-Luque, L., Ofli, F., Srivastava, J., Elmagarmid, A., Arora, T., and Taheri, S. Sleep Quality Prediction From Wearable Data Using Deep Learning, JMIR Mhealth Uhealth, 2016, vol. 4, no. 4, p. 125.

8. Movahedi, F., Coyle, J.L., and Sejdic, E. Deep belief networks for electroencephalography: A review of recent contributions and future outlooks, IEEE Journal of Biomedical and Health Informatics, May 2018, vol. 22, no. 3, pp. 642–652.

9. Choi, E., Schuetz, A., Stewart, W.F., and Sun, J. Using recurrent neural network models for early detection of heart failure onset, Journal of the American Medical Informatics Association, 2016.

10. Elkahky, A.M., Song, Y., and He, X. A Multi-View Deep Learning Approach for Cross Domain User Modeling in Recommendation Systems, Microsoft Research. http://sonyis.me/paperpdf/frp1159-songA-www-2015.pdf

11. Yamins, D.L.K. and DiCarlo, J.J. Using goal-driven deep learning models to understand sensory cortex, Nature Neuroscience, 2016, vol. 19, no. 3, pp. 356–365.

12. Zorzi, M. and Testolin, A. An emergentist perspective on the origin of number sense, Phil. Trans. R. Soc. B, 2018, vol. 373, no. 1740.

13. Morel, D., Singh, C., and Levy, W.B. Linearization of excitatory synaptic integration at no extra cost, Journal of Computational Neuroscience, April 2018, vol. 44, no. 2, pp. 173–188.

14. IEEE 829. Standard for Software Test Documentation. IEEE 1008. Standard for Software Unit Testing. https://www.twirpx.com/file/1615980/

15. ISO/IEC 12119. Information technology. Software packages. Quality requirements and testing. http://docs.cntd.ru/document/1200025075

16. GOST R 56920-2016, GOST R 56921-2016, GOST R 56922-2016. https://allgosts.ru

17. ISO/IEC 29119:2013, parts 1–5. Software testing. http://files.stroyinf.ru/Data2/1/4293754/4293754866.pdf

18. GOST R 12207-2010, ISO/IEC 12207:2008. http://docs.cntd.ru/document/1200082859

19. Beizer, B. Black-Box Testing: Techniques for Functional Testing of Software and Systems (Russian translation), St. Petersburg: Piter, 2004.

20. Dustin, E., Rashka, J., and Paul, J. Automated Software Testing: Introduction, Management, and Performance. Addison-Wesley, 1999.

21. Tamres, L. Introducing Software Testing, Addison-Wesley, 2002. Russian translation: Tamre, L. Vvedenie v testirovanie programmnogo obespecheniya (Introduction to Software Testing), Moscow: Vil'yams, 2003.

22. Kulyamin, V.V., Petrenko, A.K., Kosachev, A.S., and Burdonov, I.B. The UniTesK approach to test development, Programmirovanie, 2003, no. 6, pp. 25–43 (in Russian).

23. Burdonov, I.B., Kosachev, A.S., and Kulyamin, V.V. Conformance Theory for Systems with Blocking and Destruction, Moscow: Nauka, Fizmatlit, 2008, 411 pp. (in Russian).

24. Ivannikov, V.P., Petrenko, A.K., Kulyamin, V.V., and Maksimov, A.V. The experience of using UniTESK as a mirror of the evolution of model-based testing technologies, Trudy ISP RAN (Proceedings of ISP RAS), 2013, vol. 24, pp. 207–218 (in Russian).

25. Kulyamin, V.V. and Petrenko, A.K. Evolution of the UniTESK test development technology, Programming and Computer Software, 2014, vol. 40, no. 5, pp. 296–304.

26. Yenigun, H., Kushik, N., Lopez, J., Yevtushenko, N., and Cavalli, A.R. Decreasing the complexity of deriving test suites against nondeterministic finite state machines, Proc. of 2017 IEEE East-West Design & Test Symposium (EWDTS), 2017, pp. 1–4.

27. Beck, K. Test-Driven Development: By Example, Addison-Wesley, 2003.

28. Astels, D. Test-Driven Development: A Practical Guide, Prentice Hall, 2003.

29. Rosenblatt, F. Principles of Neurodynamics: Perceptrons and the Theory of Brain Mechanisms. Spartan Books, Washington DC, 1961.

30. Rumelhart, D.E., Hinton, G.E., and Williams, R.J. Learning Internal Representations by Error Propagation, 1986. https://dl.acm.org/citation.cfm?id=104293

31. Rumelhart, D.E., McClelland, J.L., and the PDP Research Group (eds.). Parallel Distributed Processing: Explorations in the Microstructure of Cognition, vol. 1: Foundations. MIT Press, 1986.

32. Hopfield, J.J. Neural networks and physical systems with emergent collective computational abilities, Proceedings of the National Academy of Sciences of the USA, April 1982, vol. 79, no. 8, pp. 2554–2558.

33. Ackley, D.H., Hinton, G.E., and Sejnowski, T.J. A Learning Algorithm for Boltzmann Machines, Cognitive Science, 1985, vol. 9, no. 1, pp. 147–169.

34. Kohonen, T. Self-Organized Formation of Topologically Correct Feature Maps, Biological Cybernetics, 1982, vol. 43, no. 1, pp. 59–69.

35. Ivakhnenko, A.G. and Lapa, V.G. Cybernetic Predicting Devices (Kiberneticheskie predskazyvayushchie ustroistva), Kiev: Naukova Dumka, 1965 (in Russian).

36. Ivakhnenko, A.G. and Lapa, V.G. Cybernetics and Forecasting Techniques. New York: Elsevier Publishing Company, Inc., 1967.

37. Fukushima, K. Neocognitron: A self-organizing neural network model for a mechanism of pattern recognition unaffected by shift in position, Biol. Cybern., 1980, vol. 36, pp. 193–202.

38. LeCun, Y., Boser, B., Denker, J.S., Henderson, D., Howard, R.E., Hubbard, W., and Jackel, L.D. Backpropagation Applied to Handwritten Zip Code Recognition, Neural Computation, 1989, vol. 1, no. 4, pp. 541–551.

39. Hinton, G.E., Osindero, S., and Teh, Y.W. A Fast Learning Algorithm for Deep Belief Nets, Neural Computation, 2006, vol. 18, pp. 1527–1554. http://dx.doi.org/10.1162/neco.2006.18.7.1527

40. Hinton, G.E. Learning multiple layers of representation, Trends in Cognitive Sciences, 2007, vol. 11, no. 10, pp. 428–434.

41. Rumelhart, D.E., Hinton, G.E., and Williams, R.J. Learning representations by back-propagating errors, Nature, 1986, vol. 323, pp. 533–536.

42. Floreen, P. Worst-Case Convergence Times for Hopfield Memories, IEEE Trans. Neural Networks, 1991, vol. 2, no. 5, pp. 533–535.

43. Floreen, P. The Convergence of Hamming Memory Networks, IEEE Trans. Neural Networks, 1991, vol. 2, no. 4, pp. 449–457.

44. Utgoff, P.E. and Stracuzzi, D.J. Many-layered learning, Neural Computation, 2002, vol. 14, pp. 2497–2529.

45. Elman, J.L., Bates, E.A., Johnson, M.H., Karmiloff-Smith, A., Parisi, D., and Plunkett, K. Rethinking Innateness: A Connectionist Perspective on Development. Cambridge: MIT Press, 1996.

46. Shrager, J. and Johnson, M.H. Dynamic plasticity influences the emergence of function in a simple cortical array, Neural Networks, 1996, vol. 9, no. 7, pp. 1119–1129.

47. Quartz, S.R. and Sejnowski, T.J. The neural basis of cognitive development: A constructivist manifesto, Behavioral and Brain Sciences, 1997, vol. 20, no. 4, pp. 537–556.

48. He, K., Zhang, X., Ren, S., and Sun, J. Identity Mappings in Deep Residual Networks, European Conference on Computer Vision, 2016, pp. 630–645.

49. Ivakhnenko, A. Polynomial theory of complex systems, IEEE Transactions on Systems, Man, and Cybernetics, 1971, vol. SMC-1, no. 4, pp. 364–378.

50. Bengio, Y., Boulanger-Lewandowski, N., and Pascanu, R. Advances in optimizing recurrent networks, 2013 IEEE International Conference on Acoustics, Speech and Signal Processing, 2013, pp. 8624–8628. arXiv:1212.0901v2 [cs.LG]

51. Dahl, G., Sainath, T., and Hinton, G. Improving deep neural networks for LVCSR using rectified linear units and dropout, Proc. of International Conference on Acoustics, Speech and Signal Processing, 2013, pp. 8609–8613.

52. Hinton, G.E., Srivastava, N., Krizhevsky, A., Sutskever, I., and Salakhutdinov, R.R. Improving neural networks by preventing co-adaptation of feature detectors, 2012, arXiv:1207.0580.

53. Hinton, G.E. and Salakhutdinov, R.R. Reducing the Dimensionality of Data with Neural Networks, Science, 2006, vol. 313, no. 5786, pp. 504–507.

54. Kulyamin, V.V. Programming Technologies: The Component-Based Approach (Tekhnologii programmirovaniya. Komponentnyi podkhod), Moscow: Internet University of Information Technologies, BINOM, Laboratoriya Znanii, 2007 (in Russian).

55. Floreen, P. and Orponen, P. Attraction Radii in Binary Hopfield Nets Are Hard to Compute, Neural Comput., 1993, vol. 5, pp. 812–821.
