The Journal of the Institute of Internet, Broadcasting and Communication (JIIBC), Vol. 20, No. 6, pp. 175-182, 2020
김송국 (Dept. of Computer and Information Engineering, Inha University), 이필규 (Dept. of Computer and Information Engineering, Inha University)
Non-contact eye tracking is a nonintrusive human-computer interface that provides hands-free communication for people with severe disabilities. It is also expected to play an important role in non-contact systems in the wake of the recent coronavirus (COVID-19) pandemic. This paper proposes a novel approach f...
R. G. Bozomitu, A. Pasarica, D. Tarniceriu, and C. Rotariu, "Development of an Eye Tracking-Based Human-Computer Interface for Real-Time Applications," Sensors, Vol. 19, 2019. DOI: https://doi.org/10.3390/s19163630
J. Xu, X. Zhang, and M. Zhou, "A High-Security and Smart Interaction System Based on Hand Gesture Recognition for Internet of Things," Security and Communication Networks, 2018. DOI: https://doi.org/10.1155/2018/4879496
T. Morris and V. Chauhan, "Facial feature tracking for cursor control," Journal of Network and Computer Applications, Vol. 29, pp. 62-80, 2006. DOI: https://doi.org/10.1016/j.jnca.2004.07.003
Y. Fu and T. S. Huang, "hMouse: Head tracking driven virtual computer mouse," Proceedings of the IEEE Workshop on Applications of Computer Vision (WACV 2007), 2007.
C. Z. Li, C. K. Kim, and J. S. Park, "The indirect keyboard control system by using the gaze tracing based on Haar classifier in OpenCV," Proceedings of the 2009 International Forum on Information Technology and Applications (IFITA 2009), pp. 362-366, 2009.
A. Bulling and H. Gellersen, "Toward mobile eye-based human-computer interaction," IEEE Pervasive Computing, Vol. 9, pp. 8-12, 2010. DOI: https://doi.org/10.1109/MPRV.2010.86
W. Sewell and O. Komogortsev, "Real-time eye gaze tracking with an unmodified commodity webcam employing a neural network," Proceedings of the 28th International Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '10), p. 3739, 2010.
A. Bulling, D. Roggen, and G. Tröster, "What's in the eyes for context-awareness?" IEEE Pervasive Computing, Vol. 10, pp. 48-57, 2011. DOI: https://doi.org/10.1109/MPRV.2010.49
D. W. Hansen and Q. Ji, "In the Eye of the Beholder: A Survey of Models for Eyes and Gaze," IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol. 32, pp. 478-500, 2010. DOI: https://doi.org/10.1109/TPAMI.2009.30
A. T. Duchowski, "A breadth-first survey of eye-tracking applications," Behavior Research Methods, Instruments, & Computers, Vol. 34, pp. 455-470, 2002. DOI: https://doi.org/10.3758/BF03195475
J. G. Wang and E. Sung, "Study on eye gaze estimation," IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), Vol. 32, pp. 332-350, 2002. DOI: https://doi.org/10.1109/TSMCB.2002.999809
M. T. N. Truong and S. Kim, "Parallel implementation of color-based particle filter for object tracking in embedded systems," Human-centric Computing and Information Sciences, Vol. 7, 2017.
Z. Zhu and Q. Ji, "Novel Eye Gaze Tracking Techniques Under Natural Head Movement," IEEE Transactions on Biomedical Engineering, Vol. 54, pp. 2246-2260, 2007. DOI: https://doi.org/10.1109/TBME.2007.895750
E. D. Guestrin and M. Eizenman, "General theory of remote gaze estimation using the pupil center and corneal reflections," IEEE Transactions on Biomedical Engineering, Vol. 53, pp. 1124-1133, 2006. DOI: https://doi.org/10.1109/TBME.2005.863952
D. Model and M. Eizenman, "An automatic personal calibration procedure for advanced gaze estimation systems," IEEE Transactions on Biomedical Engineering, Vol. 57, pp. 1031-1039, 2009. DOI: https://doi.org/10.1109/TBME.2009.2039351
C. H. Morimoto and M. R. M. Mimica, "Eye gaze tracking techniques for interactive applications," Computer Vision and Image Understanding, Vol. 98, pp. 4-24, 2005. DOI: https://doi.org/10.1016/j.cviu.2004.07.010
S. Hong, S. Khim, and P. K. Rhee, "Efficient facial landmark localization using spatial-contextual AdaBoost algorithm," Journal of Visual Communication and Image Representation, Vol. 25, pp. 1366-1377, 2014. DOI: https://doi.org/10.1016/j.jvcir.2014.05.001
G. L. Lohse, "Consumer eye movement patterns on yellow pages advertising," Journal of Advertising, Vol. 26, pp. 61-73, 1997. DOI: https://doi.org/10.1080/00913367.1997.10673518
K. Rayner, "Eye movements in reading and information processing: 20 years of research," Psychological Bulletin, Vol. 124, pp. 372-422, 1998. DOI: https://doi.org/10.1037/0033-2909.124.3.372
J. H. Goldberg and X. P. Kotval, "Computer interface evaluation using eye movements: Methods and constructs," International Journal of Industrial Ergonomics, Vol. 24, pp. 631-645, 1999. DOI: https://doi.org/10.1016/S0169-8141(98)00068-7
M. Eizenman, L. H. Yu, L. Grupp, et al., "A naturalistic visual scanning approach to assess selective attention in major depressive disorder," Psychiatry Research, pp. 117-128, 2003. DOI: https://doi.org/10.1016/S0165-1781(03)00068-4
I. S. MacKenzie, "Fitts' Law as a Research and Design Tool in Human-Computer Interaction," Human-Computer Interaction, Vol. 7, pp. 91-139, 1992. DOI: https://doi.org/10.1207/s15327051hci0701_3
P. M. Fitts, "The information capacity of the human motor system in controlling the amplitude of movement," Journal of Experimental Psychology, Vol. 47, pp. 381-391, 1954. DOI: https://doi.org/10.1037/h0055392
Gi-Woo Kim and Dea-Seong Kang, "An Implementation of Object Detection and Tracking Algorithm Using a Fusion Method of SURF and Kalman Filter," The Journal of KIIT, Vol. 13, No. 2, pp. 59-64, 2015. DOI: 10.14801/jkiit.2015.13.2.59