1. Ajvazyan S.A., Mkhitaryan V.S. Prikladnaya statistika. Klassifikatsiya i snizhenie razmernosti (Applied Statistics: Classification and Dimensionality Reduction). Moscow: Finansy i Statistika, 1989. (In Russian)
2. Merkov A.B. Raspoznavanie obrazov. Vvedenie v metody statisticheskogo obucheniya (Pattern Recognition: An Introduction to Statistical Learning Methods). Moscow: URSS, 2010. (In Russian)
3. Friedman J., Hastie T., Tibshirani R. The Elements of Statistical Learning. Springer Series in Statistics. N.Y.: Springer, 2001. V. 1.
4. Popkov Yu.S., Dubnov Yu.A., Popkov A.Yu. Randomized Machine Learning: Statement, Solution, Applications // Proc. 2016 IEEE 8th Int. Conf. on Intelligent Systems (IS). 2016. P. 27–39.
5. Bruckstein A.M., Donoho D.L., Elad M. From Sparse Solutions of Systems of Equations to Sparse Modeling of Signals and Images // SIAM Rev. 2009. V. 51. No. 1. P. 34–81.
6. Kendall M., Stuart A. Statisticheskie vyvody i svyazi (Statistical Inference and Relationship) / Transl. L.I. Gal'chuk, A.T. Terekhin. Moscow: Nauka, 1973. (In Russian)
7. Jolliffe I.T. Principal component analysis. N.Y.: Springer-Verlag, 1986.
8. Comon P., Jutten C. Handbook of Blind Source Separation. Independent Component Analysis and Applications. Oxford: Academic Press, 2010.
9. Berry M.W., Browne M. Algorithms and Applications for Approximate Nonnegative Matrix Factorization // Comput. Statist. Data Analysis. 2007. V. 52. P. 155–173.
10. Polyak B.T., Khlebnikov M.V. Metod glavnykh komponent: robastnye versii // Avtomat. i Telemekh. 2017. No. 3. P. 130–148; English transl.: Principal Component Analysis: Robust Versions // Autom. Remote Control. 2017. V. 78. No. 3. P. 490–506.
11. Johnson W.B., Lindenstrauss J. Extensions of Lipschitz Mappings into a Hilbert Space // Modern Analysis and Probability. Amer. Math. Soc. 1984. V. 26. P. 189–206.
12. Achlioptas D. Database-Friendly Random Projections // Proc. 20th ACM Symp. on Principles of Database Systems (PODS'01). ACM, 2001. P. 274–281.
13. Peng H.C., Long F., Ding C. Feature Selection Based on Mutual Information: Criteria of Max-Dependency, Max-Relevance, and Min-Redundancy // IEEE Trans. Pattern Anal. Mach. Intell. 2005. V. 27. No. 8. P. 1226–1238.
14. Zhang Y., Li S., Wang T., Zhang Z. Divergence-based Feature Selection for Separate Classes // Neurocomputing. 2013. V. 101. P. 32–42.
15. Vidyasagar M. Randomized Algorithms for Robust Controller Synthesis Using Statistical Learning Theory: A Tutorial Overview // Eur. J. Control. 2001. V. 7. No. 2. P. 287–310.
16. Granichin O.N., Polyak B.T. Randomizirovannye algoritmy otsenivaniya i optimizatsii pri pochti proizvol'nykh pomekhakh (Randomized Algorithms of Estimation and Optimization under Almost Arbitrary Noise). Moscow: Nauka, 2003. (In Russian)
17. Feller W. Vvedenie v teoriyu veroyatnostej i ee prilozheniya (An Introduction to Probability Theory and Its Applications). Moscow: Mir, 1967. (Russian translation)
18. Popkov Yu.S. Macrosystems Theory and Its Applications (Lecture Notes in Control and Information Sciences. V. 203). Springer, 1995.
19. Rubinstein R.Y., Kroese D.P. The Cross-Entropy Method. N.Y.: Springer Science+Business Media, 2004.
20. Shannon C.E. Communication Theory of Secrecy Systems // Bell Syst. Tech. J. 1949. V. 28. No. 4. P. 656–715.
21. Jaynes E.T. Information theory and statistical mechanics // Phys. Rev. 1957. V. 106. No. 4. P. 620–630.
22. Magnus J.R., Neudecker H. Matrix Differential Calculus with Applications in Statistics and Econometrics. Wiley Series in Probability and Mathematical Statistics. Wiley, 1988.
23. Kullback S., Leibler R.A. On Information and Sufficiency // Ann. Math. Statist. 1951. V. 22. No. 1. P. 79–86.