Spontaneous gaze interaction based on smooth pursuit eye movement using difference gaze pattern method


Suatmi Murnani
Noor Akhmad Setiawan
Sunu Wibirama

Abstract

Human gaze is a promising input modality for natural user interfaces in touchless technology, a need highlighted during the COVID-19 pandemic. Spontaneous gaze interaction allows participants to interact with an application directly, without any prior eye-tracking calibration, and smooth pursuit eye movement is commonly used for this kind of interaction. Many studies have explored object selection techniques for smooth pursuit-based gaze interaction; however, challenges in spatial accuracy and implementation complexity remain unresolved. To address these problems, we propose the Difference Gaze Pattern (DGP) method, which selects objects based on the difference patterns between the gaze trajectory and the trajectories of dynamic objects. In our experiments, the proposed method yielded the best object selection accuracy of  and success time of  ms. The results also showed that object selection using difference patterns is robust to spatial inaccuracy and relatively simple to implement, suggesting that the proposed method can contribute to spontaneous gaze interaction.
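The abstract describes selecting the pursued object from the difference between the gaze trajectory and each moving object's trajectory. A minimal sketch of one plausible reading, assuming that for the pursued target the gaze-minus-object signal is approximately a constant offset, so its dispersion is lowest; the function name, data layout, and variance-based score here are illustrative assumptions, not the paper's actual algorithm:

```python
import numpy as np

def select_object(gaze, objects):
    """Pick the object whose difference pattern with the gaze is most stable.

    gaze    : (N, 2) array of gaze samples (x, y) over time.
    objects : dict mapping object name -> (N, 2) trajectory array.
    Returns (best_name, score), where a lower score means a steadier
    difference pattern, i.e. the gaze most likely pursued that object.
    """
    scores = {}
    for name, traj in objects.items():
        diff = gaze - traj                    # difference pattern over time
        scores[name] = float(diff.var(axis=0).sum())  # total dispersion
    best = min(scores, key=scores.get)
    return best, scores[best]

# Toy usage: gaze follows a circular target with a constant offset plus noise.
t = np.linspace(0, 2 * np.pi, 100)
obj1 = np.stack([np.cos(t), np.sin(t)], axis=1)
obj2 = np.stack([np.cos(t + np.pi), np.sin(t + np.pi)], axis=1)
rng = np.random.default_rng(0)
gaze = obj1 + 0.1 + rng.normal(0.0, 0.02, obj1.shape)
best, score = select_object(gaze, {"obj1": obj1, "obj2": obj2})
```

Because a constant spatial offset between gaze and target does not change the variance of the difference signal, a score of this kind is insensitive to calibration error, which is consistent with the robustness to spatial inaccuracy that the abstract claims.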



How to Cite
Murnani, S., Setiawan, N. A., & Wibirama, S. (2022). Spontaneous gaze interaction based on smooth pursuit eye movement using difference gaze pattern method. Communications in Science and Technology, 7(1), 8-14. https://doi.org/10.21924/cst.7.1.2022.739
