Multimodality with Eye tracking and Haptics: A New Horizon for Serious Games?


  • Shujie Deng Bournemouth University
  • Julie A. Kirkby Bournemouth University
  • Jian Chang Bournemouth University
  • Jian Jun Zhang Bournemouth University



Keywords: Eye tracking, Haptics, Multimodal integration, Virtual reality, Serious games


The goal of this review is to illustrate the emerging use of multimodal virtual reality and how it can benefit learning-based games. The review begins with an introduction to multimodal virtual reality in serious games, followed by a brief discussion of why the cognitive processes involved in learning and training are enhanced in immersive virtual environments. We first outline studies that have used eye tracking and haptic feedback independently in serious games, and then review innovative applications that have combined eye tracking and haptic devices to provide applicable multimodal frameworks for learning-based games. Finally, we draw general conclusions that advance current understanding of multimodal serious game production and identify possible areas for new applications.


How to Cite

Deng, S., Kirkby, J. A., Chang, J., & Zhang, J. J. (2014). Multimodality with eye tracking and haptics: A new horizon for serious games? International Journal of Serious Games, 1(4).