1. Айвазян С.А., Мхитарян В.С. Прикладная статистика. Классификация и снижение размерности. М.: Финансы и статистика, 1989.
2. Мерков А.Б. Распознавание образов. Введение в методы статистического обучения. М.: URSS, 2010.
3. Hastie T., Tibshirani R., Friedman J. The Elements of Statistical Learning // Springer Series in Statistics. N.Y.: Springer, 2001.
4. Popkov Yu.S., Dubnov Yu.A., Popkov A.Yu. Randomized Machine Learning: Statement, Solution, Applications // Proc. 2016 IEEE 8th Int. Conf. on Intelligent Systems (IS). 2016. P. 27–39.
5. Bruckstein A.M., Donoho D.L., Elad M. From Sparse Solutions of Systems of Equations to Sparse Modeling of Signals and Images // SIAM Rev. 2009. V. 51. No. 1. P. 34–81.
6. Кендалл М., Стьюарт А. Статистические выводы и связи / Пер. с англ. Л.И. Гальчука, А.Т. Терехина. М.: Наука. Гл. ред. физ.-мат. лит., 1973.
7. Jolliffe I.T. Principal Component Analysis. N.Y.: Springer-Verlag, 1986.
8. Comon P., Jutten C. Handbook of Blind Source Separation. Independent Component Analysis and Applications. Oxford: Academic Press, 2010.
9. Berry M.W., Browne M., Langville A.N., Pauca V.P., Plemmons R.J. Algorithms and Applications for Approximate Nonnegative Matrix Factorization // Comput. Statist. Data Analysis. 2007. V. 52. No. 1. P. 155–173.
10. Поляк Б.Т., Хлебников М.В. Метод главных компонент: робастные версии // АиТ. 2017. № 3. С. 130–148. Polyak B.T., Khlebnikov M.V. Principal Component Analysis: Robust Versions // Autom. Remote Control. 2017. V. 78. No. 3. P. 490–506.
11. Johnson W.B., Lindenstrauss J. Extensions of Lipschitz Mappings into a Hilbert Space // Conf. in Modern Analysis and Probability. Contemp. Math. V. 26. Amer. Math. Soc., 1984. P. 189–206.
12. Achlioptas D. Database-Friendly Random Projections // Proc. 20th ACM Symp. on Principles of Database Systems (PODS'01). ACM, 2001. P. 274–281.
13. Peng H.C., Long F., Ding C. Feature Selection Based on Mutual Information: Criteria of Max-Dependency, Max-Relevance, and Min-Redundancy // IEEE Trans. Pattern Anal. Machine Intell. 2005. V. 27. No. 8. P. 1226–1238.
14. Zhang Y., Li S., Wang T., Zhang Z. Divergence-based Feature Selection for Separate Classes // Neurocomputing. 2013. V. 101. P. 32–42.
15. Vidyasagar M. Randomized Algorithms for Robust Controller Synthesis Using Statistical Learning Theory: A Tutorial Overview // Eur. J. Control. 2001. V. 7. No. 2. P. 287–310.
16. Граничин О.Н., Поляк Б.Т. Рандомизированные алгоритмы оценивания и оптимизации при почти произвольных помехах. М.: Наука, 2003.
17. Феллер В. Введение в теорию вероятностей и ее приложения. М.: Мир, 1967.
18. Popkov Yu.S. Macrosystems Theory and Its Applications (Lecture Notes in Control and Information Sciences. V. 203). Springer, 1995.
19. Rubinstein R.Y., Kroese D.P. The Cross-Entropy Method. N.Y.: Springer Science+Business Media, 2004.
20. Shannon C.E. Communication Theory of Secrecy Systems // Bell Syst. Techn. J. 1949. V. 28. No. 4. P. 656–715.
21. Jaynes E.T. Information Theory and Statistical Mechanics // Phys. Rev. 1957. V. 106. No. 4. P. 620–630.
22. Magnus J.R., Neudecker H. Matrix Differential Calculus with Applications in Statistics and Econometrics. Wiley, 1988. (Wiley Series in Probability and Math. Statist.)
23. Kullback S., Leibler R.A. On Information and Sufficiency // Ann. Math. Statist. 1951. V. 22. No. 1. P. 79–86.