Year: 2022 Volume: 28 Issue: 5 Page Range: 632 - 642 Text Language: English DOI: 10.5505/pajes.2022.10280 Index Date: 25-10-2022

A human-computer interaction system based on eye, eyebrow and head movements

Abstract:
Devices such as computers that require manual control are difficult to use for people with disabilities such as multiple sclerosis (MS), amyotrophic lateral sclerosis (ALS), or partial stroke. These people have very limited movement capability, such that they perform most of their interaction with limited head and eye movements. Assistive technologies are very important for enabling people with disabilities not to depend on anyone in daily life. In this study, we present a human-computer interaction application based on the analysis of head and eye/eyebrow movements in real-time images captured by a visual camera. We propose the Difference Between Eye and Eyebrow (DEEB) feature to detect the user's action intention and to properly realize computer keyboard and mouse actions based on eyebrow, eye, and head movements. In addition, the designed interface includes shortcut keys that facilitate access to social platforms such as Instagram, WhatsApp, and YouTube, which are intended to make it easier for individuals with disabilities to communicate with their social environment. We obtained satisfactory results with the designed interface in the experimental study.
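The DEEB feature is built from facial landmark geometry. The Python sketch below illustrates the general idea, assuming dlib's standard 68-point landmark model: the normalized vertical distance between eyebrow and eye landmarks rises when the eyebrow is raised, which can be mapped to a keyboard or mouse action. The landmark indices, normalization, and raise threshold here are illustrative assumptions, not the authors' exact formulation.

```python
# Minimal DEEB-style (Difference Between Eye and Eyebrow) feature sketch.
# Assumes dlib's 68-point model (shape_predictor_68_face_landmarks.dat from
# https://github.com/davisking/dlib-models); thresholds are hypothetical.
import cv2
import dlib
import numpy as np

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

# 68-point layout (0-indexed): right eyebrow 17-21, right eye 36-41.
RIGHT_BROW = list(range(17, 22))
RIGHT_EYE = list(range(36, 42))

def deeb_feature(shape):
    """Vertical eyebrow-to-eye distance, normalized by inter-ocular distance
    so the feature does not depend on how far the user sits from the camera."""
    pts = np.array([[shape.part(i).x, shape.part(i).y] for i in range(68)])
    brow_y = pts[RIGHT_BROW, 1].mean()
    eye_y = pts[RIGHT_EYE, 1].mean()
    inter_ocular = np.linalg.norm(pts[36] - pts[45])  # outer eye corners
    return (eye_y - brow_y) / inter_ocular  # y grows downward, so this is > 0

cap = cv2.VideoCapture(0)
baseline = None
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for face in detector(gray):
        d = deeb_feature(predictor(gray, face))
        # Slow exponential moving average as a per-user resting baseline;
        # a real system would freeze the baseline while a raise is active.
        baseline = d if baseline is None else 0.95 * baseline + 0.05 * d
        if d > 1.15 * baseline:  # ~15% above resting distance = eyebrow raise
            print("eyebrow raise -> trigger selected action")  # hypothetical mapping
    if cv2.waitKey(1) == 27:  # Esc to quit
        break
cap.release()
```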

Document Type: Article Article Type: Research Article Access Type: Open Access
APA Tas, M. O., & Yavuz, H. S. (2022). A human-computer interaction system based on eye, eyebrow and head movements. Pamukkale Üniversitesi Mühendislik Bilimleri Dergisi, 28(5), 632-642. https://doi.org/10.5505/pajes.2022.10280
Chicago Tas, Muhammed Oguz, and Hasan Serhan Yavuz. "A human-computer interaction system based on eye, eyebrow and head movements." Pamukkale Üniversitesi Mühendislik Bilimleri Dergisi 28, no. 5 (2022): 632-642. https://doi.org/10.5505/pajes.2022.10280
MLA Tas, Muhammed Oguz, and Hasan Serhan Yavuz. "A human-computer interaction system based on eye, eyebrow and head movements." Pamukkale Üniversitesi Mühendislik Bilimleri Dergisi, vol. 28, no. 5, 2022, pp. 632-642. https://doi.org/10.5505/pajes.2022.10280
AMA Tas MO, Yavuz HS. A human-computer interaction system based on eye, eyebrow and head movements. Pamukkale Üniversitesi Mühendislik Bilimleri Dergisi. 2022;28(5):632-642. doi:10.5505/pajes.2022.10280
Vancouver Tas MO, Yavuz HS. A human-computer interaction system based on eye, eyebrow and head movements. Pamukkale Üniversitesi Mühendislik Bilimleri Dergisi. 2022;28(5):632-642. doi:10.5505/pajes.2022.10280
IEEE M. O. Tas and H. S. Yavuz, "A human-computer interaction system based on eye, eyebrow and head movements," Pamukkale Üniversitesi Mühendislik Bilimleri Dergisi, vol. 28, no. 5, pp. 632-642, 2022, doi: 10.5505/pajes.2022.10280.
ISNAD Tas, Muhammed Oguz - Yavuz, Hasan Serhan. "A human-computer interaction system based on eye, eyebrow and head movements". Pamukkale Üniversitesi Mühendislik Bilimleri Dergisi 28/5 (2022), 632-642. https://doi.org/10.5505/pajes.2022.10280