1. Alshear O. Brain wave sensors for every body. 2018. DOI: 10.13140/RG.2.2.22223.69280. URL: https://www.researchgate.net/publication/311582768_Brain_Wave_Sensors_for_Every_Body.
2. Allison B., Dunne S., Leeb R., Millan J., Nijholt A. Towards practical brain-computer interfaces: bridging the gap from research to real-world applications. Berlin: Springer Berlin Heidelberg, 2012. DOI: 10.1007/978-3-642-29746-5.
3. Banerjee A., Pal M., Datta S., Tibarewala D., Konar A. Voluntary eye movement controlled electrooculogram based multitasking graphical user interface // International journal of biomedical engineering and technology. 2015. V. 18. № 3. DOI: 10.1504/IJBET.2015.070574.
4. Bauer G., Gerstenbrand F., Rumpl E. Varieties of the locked-in syndrome // Journal of neurology. 1979. V. 221. № 9. P. 77–91. DOI: 10.1007/BF00313105.
5. Bauernfeind G., Leeb R., Wriessnegger S., Pfurtscheller G. Development, set-up and first results for a one-channel near-infrared spectroscopy system // Biomedizinische technik. 2008. V. 53. № 1. P. 36–43. DOI: 10.1515/bmt.2008.005.
6. Bates R., Istance H. Why are eye mice unpopular? — a detailed comparison of head and eye controlled assistive technology pointing devices // Universal access in the information society. 2003. V. 2. № 3. P. 280–290. DOI: 10.1007/s10209-003-0053-y.
7. Beesley T., Pearson D., Le Pelley M. Eye tracking as a tool for examining cognitive processes // Biophysical measurement in experimental social science research. Academic Press, 2019. P. 1–30.
8. Bleichner M., Jansma J., Salari E., Freudenburg Z., Raemaekers M., Ramsey N. Classification of mouth movements using 7 T fMRI // Journal of neural engineering. 2015. V. 12. № 6. DOI: 10.1088/1741-2560/12/6/066026.
9. Bleichner M., Jansma J., Sellmeijer J., Raemaekers M., Ramsey N. Give me a sign: decoding complex coordinated hand movements using high-field fMRI // Brain topography. 2013. № 27. P. 248–257.
10. Coyle S., Ward T., Markham C., Mcdarby G. On the suitability of near-infrared (NIR) systems for next-generation brain–computer interfaces // Physiological measurement. 2004. V. 25. № 4. P. 815–822. DOI: 10.1088/0967-3334/25/4/003.
11. Dhakal V., Feit A., Kristensson P., Oulasvirta A. Observations on typing from 136 million keystrokes // Proceedings of the 36th ACM conference on human factors in computing systems. ACM Press, 2018. P. 1–12.
12. Esteves A., Velloso E., Bulling A., Gellersen H. Orbits: gaze interaction for smart watches using smooth pursuit eye movements // Proceedings of the 28th annual ACM symposium on user interface software & technology. 2015. DOI: 10.1145/2807442.2807499.
13. Farwell L., Donchin E. Talking off the top of your head: toward a mental prosthesis utilizing event-related brain potentials // Electroencephalography and clinical neurophysiology. 1988. V. 70. № 6. P. 510–523. DOI: 10.1016/0013-4694(88)90149-6.
14. Fedorova A.A., Shishkin S.L., Nuzhdin Y.O., Velichkovsky B.M. Gaze based robot control: the communicative approach // International IEEE/EMBS conference on neural engineering (NER). 2015. DOI: 10.1109/ner.2015.7146732.
15. Gallegos-Ayala G., Furdea A., Takano K., Ruf C. Brain communication in a completely locked-in patient using bedside near-infrared spectroscopy // Neurology. 2014. V. 82. № 21. P. 1930–1932. DOI: 10.1212/WNL.0000000000000449.
16. Goossens C., Crain S. Overview of nonelectronic eye-gaze communication techniques // Augmentative and alternative communication. 1987. V. 3. № 2. P. 77–89. DOI: 10.1080/07434618712331274309.
17. Guger C., Allison B., Grosswindhager B., Pruckl R., Hintermuller C., Kapeller C., Bruckner M., Krausz G., Edlinger G. How many people could use an SSVEP BCI? // Frontiers in neuroscience. 2012. V. 6. № 169. P. 1–6. DOI: 10.3389/fnins.2012.00169.
18. Hutchinson T.E., White K.P., Martin W.N., Reichert K.C., Frey L.A. Human-computer interaction using eye-gaze input // IEEE Transactions on systems, man, and cybernetics. 1989. V. 19. № 6. P. 1527–1534. DOI: 10.1109/21.44068.
19. Hwang H., Lim J., Jung Y., Choi H., Lee S., Im C. Development of an SSVEP-based BCI spelling system adopting a QWERTY-style LED keyboard // Journal of neuroscience methods. 2012. V. 208. № 1. P. 59–65. DOI: 10.1016/j.jneumeth.2012.04.011.
20. Jacob R. Eye movement-based human-computer interaction techniques: toward non-command interfaces // Advances in human-computer interaction. 1993. № 4. P. 151–190.
21. Jobsis F. Noninvasive, infrared monitoring of cerebral and myocardial oxygen sufficiency and circulatory parameters // Science. 1977. V. 198. № 4323. P. 1264–1270. DOI: 10.1126/science.929199.
22. Khalaf A., Sejdic E., Akcakaya M. A novel motor imagery hybrid brain computer interface using EEG and functional transcranial Doppler ultrasound // Journal of neuroscience methods. 2019. № 313. P. 44–53. DOI: 10.1016/j.jneumeth.2018.11.017.
23. Kubler A., Neumann N., Kaiser J., Kotchoubey B. Brain-computer communication: self-regulation of slow cortical potentials for verbal communication // Archives of physical medicine and rehabilitation. 2001. V. 82. № 11. P. 1533–1539. DOI: 10.1053/apmr.2001.26621.
24. Kurauchi A., Feng W., Joshi A., Morimoto C., Betke M. Eyeswipe: dwell-free text entry using gaze paths // Proceedings of the 2016 CHI conference on human factors in computing systems. 2016. DOI: 10.1145/2858036.2858335.
25. Lee K., Chang W., Kim S., Im C. Real-time “eye-writing” recognition using electrooculogram // IEEE transactions on neural systems and rehabilitation engineering. 2017. V. 25. № 1. P. 37–48. DOI: 10.1109/tnsre.2016.2542524.
26. Lledo L., Ubeda A., Ianez E., Azorin J. Internet browsing application based on electrooculography for disabled people // Expert systems with applications. 2013. V. 40. № 7. P. 2640–2648. DOI: 10.1016/j.eswa.2012.11.012.
27. Majaranta P., Raiha K. Twenty years of eye typing // Proceedings of the symposium on eye tracking research and applications. New York: ACM, 2002. P. 15–22.
28. Matthews P., Jezzard P. Functional magnetic resonance imaging // Journal of neurology, neurosurgery, and psychiatry. 2004. № 75. P. 6–12.
29. Mishchenko Y., Kaya M., Ozbay E., Yanar H. Developing a three- to six-state EEG-based brain-computer interface for a virtual robotic manipulator control // IEEE transactions on biomedical engineering. 2019. V. 66. № 4. P. 977–987. DOI: 10.1109/TBME.2018.2865941.
30. Naci L., Owen A. Making every word count for nonresponsive patients // JAMA neurology. 2013. V. 70. № 10. P. 1235–1241. DOI: 10.1001/jamaneurol.2013.3686.
31. Naseer N., Hong M., Hong K. Online binary decision decoding using functional near-infrared spectroscopy for the development of brain–computer interface // Experimental brain research. 2013. V. 232. № 2. P. 555–564. DOI: 10.1007/s00221-013-3764-1.
32. Nijboer F. Technology transfer of brain-computer interfaces as assistive technology: barriers and opportunities // Annals of physical and rehabilitation medicine. 2015. V. 58. № 1. P. 35–38. DOI: 10.1016/j.rehab.2014.11.001.
33. Protzak J., Ihme K., Zander T. A passive brain-computer interface for supporting gaze-based human-machine interaction // Universal access in human-computer interaction. Design methods, tools, and interaction techniques for eInclusion. UAHCI 2013. Lecture notes in computer science. Berlin: Springer, 2013. P. 662–671.
34. Ding Q., Tong K., Li G. Development of an EOG (electro-oculography) based human-computer interface // 27th annual international conference of the IEEE engineering in medicine and biology society (IEEE-EMBS). 2005. P. 6829–6831.
35. Sengupta K., Menges R., Kumar C., Staab S. GazeTheKey: interactive keys to integrate word predictions for gaze-based text entry // Proceedings of the 22nd international conference on intelligent user interfaces (IUI 2017). 2017. DOI: 10.1145/3030024.3038259.
36. Shishkin S.L., Nuzhdin Y.O., Svirin E.P., Trofimov A.G., Fedorova A.A., Kozyrskiy B.L., Velichkovsky B.M. EEG negativity in fixations used for gaze-based control: toward converting intentions into actions with an eye-brain-computer interface // Frontiers in neuroscience. 2016. V. 10. P. 528.
37. Sorger B., Dahmen B., Reithler J., Gosseries O. Another kind of “BOLD response”: answering multiple-choice questions via online decoded single-trial brain signals // Progress in brain research. 2009. № 177. P. 275–292. DOI: 10.1016/S0079-6123(09)17719-1.
38. Speier W., Chandravadia N., Roberts D., Pendekanti S., Pouratian N. Online BCI typing using language model classifiers by ALS patients in their homes // Brain-computer interfaces. 2016. V. 4. № 2. P. 114–121. DOI: 10.1080/2326263x.2016.1252143.
39. Sun X., Huang S., Wang N. Neural interface: frontiers and applications: cochlear implants // Advances in experimental medicine and biology. 2019. № 1101. P. 167–206. DOI: 10.1007/978-981-13-2050-7_7.
40. Tuisku O., Majaranta P., Isokoski P., Raiha K. Now dasher! Dash away!: longitudinal study of fast text entry by eye gaze // ETRA '08: Proceedings of the 2008 symposium on eye tracking research & applications. 2008. P. 19–26. DOI: 10.1145/1344471.1344476.
41. Ubeda A., Ianez E., Azorin J. Wireless and portable EOG-based interface for assisting disabled people // IEEE/ASME transactions on mechatronics. 2011. V. 16. № 5. P. 870–873. DOI: 10.1109/tmech.2011.2160354.
42. Wolpaw J., Birbaumer N., Mcfarland D., Pfurtscheller G., Vaughan T. Brain–computer interfaces for communication and control // Clinical neurophysiology. 2002. V. 113. № 6. P. 767–791. DOI: 10.1016/s1388-2457(02)00057-3.