Multimodality with Eye tracking and Haptics: A New Horizon for Serious Games?
DOI: https://doi.org/10.17083/ijsg.v1i4.24

Keywords: Eye tracking, Haptic, Multimodal integration, Virtual reality, Serious games

Abstract
The goal of this review is to illustrate the emerging use of multimodal virtual reality and the benefits it can bring to learning-based games. The review begins with an introduction to multimodal virtual reality in serious games, followed by a brief discussion of why the cognitive processes involved in learning and training are enhanced in immersive virtual environments. We first outline studies that have used eye tracking and haptic feedback independently in serious games, and then review innovative applications that have already combined eye tracking and haptic devices to provide applicable multimodal frameworks for learning-based games. Finally, we draw some general conclusions to advance current understanding of multimodal serious game production and to explore possible areas for new applications.
References
Zyda M., “From visual simulation to virtual reality to games,” Computer, vol. 38, no. 9, pp. 25-32, 2005. http://dx.doi.org/10.1109/MC.2005.297
De Freitas S., “Learning in immersive worlds,” London: Joint Information Systems Committee, 2006.
Moreno R., Mayer R., “Interactive Multimodal Learning Environments,” Educational Psychology Review, vol. 19, no. 3, pp. 309-326, 2007. http://dx.doi.org/10.1007/s10648-007-9047-2
Guo J., Guo A., “Crossmodal interactions between olfactory and visual learning in Drosophila,” Science, vol. 309, no. 5732, pp. 307-310, 2005. http://dx.doi.org/10.1126/science.1111280
Sundar S. S., “The MAIN model: A heuristic approach to understanding technology effects on credibility,” In Miriam J. Metzger and Andrew J. Flanagin (Eds.) Digital media, youth, and credibility, pp. 73-100, Cambridge, MA: MIT Press, 2008. http://dx.doi.org/10.1162/dmal.9780262562324.073
Sun P.-C., Cheng H. K., “The design of instructional multimedia in e-Learning: A Media Richness Theory-based approach,” Computers & Education, vol. 49, no. 3, pp. 662-676, 2007. http://dx.doi.org/10.1016/j.compedu.2005.11.016
Jaimes A, Sebe N., “Multimodal human–computer interaction: A survey,” Computer Vision and Image Understanding, vol. 108, no. 1–2, pp. 116-134, 2007. http://dx.doi.org/10.1016/j.cviu.2006.10.019
Burdea G., Richard P., Coiffet P., “Multimodal virtual reality: Input–output devices, system integration, and human factors,” International Journal of Human-Computer Interaction, vol. 8, no. 1, pp. 5-24, 1996. http://dx.doi.org/10.1080/10447319609526138
Duchowski A., “A breadth-first survey of eye-tracking applications,” Behavior Research Methods, Instruments, & Computers, vol. 34, no. 4, pp. 455-470, 2002. http://dx.doi.org/10.3758/BF03195475
Coles T. R., Meglan D., John N. W., “The role of haptics in medical training simulators: a survey of the state of the art,” IEEE Transactions on Haptics, vol. 4, no. 1, pp. 51-66, 2011. http://dx.doi.org/10.1109/TOH.2010.19
Xu Z., Yu H., Yan S., "Motor rehabilitation training after stroke using haptic handwriting and games." In Proceedings of the 4th International Convention on Rehabilitation Engineering & Assistive Technology, Singapore Therapeutic, Assistive & Rehabilitative Technologies (START) Centre, Singapore, pp. 31:1–31:4, 2010.
Broeren J., Rydmark M., Sunnerhagen K. S., “Virtual reality and haptics as a training device for movement rehabilitation after stroke: a single-case study,” Archives of physical medicine and rehabilitation, vol. 85, no. 8, pp. 1247-1250, 2004. http://dx.doi.org/10.1016/j.apmr.2003.09.020
Veneman J. F., Jung J. H., Perry J. C. et al., "Consistent Arm Rehabilitation from Clinical to Home Environment-Integrating the Universal Haptic Drive into the TeleReha Software Platform," Converging Clinical and Engineering Research on Neurorehabilitation, pp. 1013-1017: Springer, 2013. http://dx.doi.org/10.1007/978-3-642-34546-3_166
Krenek A., Cernohorsky M., Kabelác Z., “Haptic visualization of molecular model,” In Skala V. (Ed.) WSCG’99 Conference Proceedings, 1999.
Persson P. B, Cooper M. D., Tibell L. A. E. et al., "Designing and Evaluating a Haptic System for Biomolecular Education." In: IEEE Virtual Reality Conference, pp. 171–178, 2007. http://dx.doi.org/10.1109/VR.2007.352478
Rayner K., “Eye movements in reading and information processing: 20 years of research,” Psychological bulletin, vol. 124, no. 3, pp. 372-422, 1998. http://dx.doi.org/10.1037/0033-2909.124.3.372
Rayner K., “Eye movements and attention in reading, scene perception, and visual search,” The quarterly journal of experimental psychology, vol. 62, no. 8, pp. 1457-1506, 2009. http://dx.doi.org/10.1080/17470210902816461
Liversedge S. P., Findlay J. M., “Saccadic eye movements and cognition,” Trends in Cognitive Sciences, vol. 4, no. 1, pp. 6-14, 2000. http://dx.doi.org/10.1016/S1364-6613(99)01418-7
Granholm E., Steinhauer S. R, “Pupillometric measures of cognitive and emotional processes,” International Journal of Psychophysiology, vol. 52, no. 1, pp. 1-6, 2004. http://dx.doi.org/10.1016/j.ijpsycho.2003.12.001
Palinko O., Kun A. L., Shyrokov A. et al., "Estimating cognitive load using remote eye tracking in a driving simulator." in Proceedings of the Symposium on Eye-Tracking Research and Applications (ETRA '10), pp. 141–144, 2010. http://dx.doi.org/10.1145/1743666.1743701
Schultheis H., Jameson A., "Assessing cognitive load in adaptive hypermedia systems: Physiological and behavioral methods." In: De Bra, P.M.E., Nejdl, W. (Eds.) Adaptive Hypermedia and Adaptive Web-Based Systems. LNCS, vol. 3137, pp. 225–234, Berlin, Springer, 2004. http://dx.doi.org/10.1007/978-3-540-27780-4_26
Smith J. D., Graham T., "Use of eye movements for video game control." In: Proceedings of the 2006 ACM SIGCHI International Conference on Advances in Computer Entertainment Technology, p. 20. New York: ACM Press, 2006. http://dx.doi.org/10.1145/1178823.1178847
Knoepfle D. T., Wang J. T. y., Camerer C. F., “Studying Learning in Games Using Eye-tracking,” Journal of the European Economic Association, vol. 7, no. 2–3, pp. 388-398, 2009. http://dx.doi.org/10.1162/JEEA.2009.7.2-3.388
Kiili K., Ketamo H., Kickmeier-Rust M. D., “Evaluating the usefulness of Eye Tracking in Game-based Learning,” International Journal of Serious Games, vol. 1, no. 2, 2014.
Conati C., Merten C., “Eye-tracking for user modeling in exploratory learning environments: An empirical evaluation,” Knowledge-Based Systems, vol. 20, no. 6, pp. 557-574, 2007. http://dx.doi.org/10.1016/j.knosys.2007.04.010
Williams C., Henderson J., Zacks R. “Incidental visual memory for targets and distractors in visual search,” Perception & Psychophysics, vol. 67, no. 5, pp. 816-827, 2005. http://dx.doi.org/10.3758/BF03193535
Vlaskamp B. N., Hooge I. T. C., “Crowding degrades saccadic search performance,” Vision research, vol. 46, no. 3, pp. 417-425, 2006. http://dx.doi.org/10.1016/j.visres.2005.04.006
Zelinsky, G. J. “Specifying the components of attention in a visual search task,” Neurobiology of attention, pp. 395-400, 2005.
Pfeiffer T., Latoschik M. E, Wachsmuth I., “Evaluation of binocular eye trackers and algorithms for 3D gaze interaction in virtual reality environments,” JVRB-Journal of Virtual Reality and Broadcasting, vol. 5, no. 16, 2008.
Paletta L., Santner K., Fritz G. et al., "3d attention: measurement of visual saliency using eye tracking glasses." In CHI '13 Extended Abstracts on Human Factors in Computing Systems, ACM, pp.199-204, 2013. http://dx.doi.org/10.1145/2468356.2468393
Alt F., Schneegass S., Auda J. et al., "Using eye-tracking to support interaction with layered 3D interfaces on stereoscopic displays." In Proceedings of the 19th International Conference on Intelligent User Interfaces (IUI '14), New York, New York, USA, pp. 267-272, 2014. http://dl.acm.org/citation.cfm?doid=2557500.2557518
Reingold E. M., Loschky L. C., McConkie G. W. et al., “Gaze-contingent multiresolutional displays: An integrative review,” Human Factors: The Journal of the Human Factors and Ergonomics Society, vol. 45, no. 2, pp. 307-328, 2003. http://dx.doi.org/10.1518/hfes.45.2.307.27235
Duchowski A. T., Cournia N., and Murphy H., “Gaze-Contingent displays: A review,” CyberPsychology & Behavior, vol. 7, no. 6, pp. 621-634, 2004. http://dx.doi.org/10.1089/cpb.2004.7.621
Posner M. I., “Orienting of attention,” Quarterly journal of experimental psychology, vol. 32, no. 1, pp. 3-25, 1980. http://dx.doi.org/10.1080/00335558008248231
Sundstedt V., Stavrakis E., Wimmer M. et al., “A psychophysical study of fixation behavior in a computer game,” in Proceedings of the 5th symposium on Applied perception in graphics and visualization, Los Angeles, California, 2008, pp. 43-50. http://dx.doi.org/10.1145/1394281.1394288
Kawashima T., Terashima T., Nagasaki T. et al., "Enhancing visual perception using dynamic updating of display," Intuitive Human Interfaces for Organizing and Accessing Intellectual Assets, pp. 127-141: Springer, 2005. http://dx.doi.org/10.1007/978-3-540-32279-5_9
Schumacher J., Allison R., Herpers R., "Using saccadic suppression to hide graphic updates." In 10th Eurographics Symposium on Virtual Environments, pp. 17-24, 2004. http://dx.doi.org/10.2312/EGVE/EGVE04/017-024
Franke I. S., Günther T., Groh R., "Saccade Detection and Processing for Enhancing 3D Visualizations in Real-Time," HCI International 2014-Posters’ Extended Abstracts, pp. 317-322: Springer, 2014. http://dx.doi.org/10.1007/978-3-319-07857-1_56
Watanabe J., Ando H., Maeda T. et al., “Gaze-contingent visual presentation based on remote saccade detection,” Presence: Teleoperators and Virtual Environments, vol. 16, no. 2, pp. 224-234, 2007. http://dx.doi.org/10.1162/pres.16.2.224
Watanabe J, Maeda T, Ando H, “Gaze-contingent visual presentation technique with electro-ocular-graph-based saccade detection,” ACM Transactions on Applied Perception (TAP), vol. 9, no. 2, pp. 6, 2012. http://dx.doi.org/10.1145/2207216.2207217
Triesch J., Sullivan B. T., Hayhoe M. M. et al., "Saccade contingent updating in virtual reality." In Proceedings of the Eye Tracking Research and Applications Symposium 2002, pp. 95-102. New York, NY: ACM, 2002. http://dx.doi.org/10.1145/507072.507092
Hillaire S., Lécuyer A., Cozot R., "Using an eye-tracking system to improve camera motions and depth-of-field blur effects in virtual environments." In IEEE Virtual Reality Conference (VR '08), pp. 47-50, 8-12 March 2008. http://dx.doi.org/10.1109/VR.2008.4480749
Burelli P., Yannakakis G. N., "Towards adaptive virtual camera control in computer games." In Lecture Notes in Computer Science, vol. 6815 LNCS, pp. 25-36, Springer, 2011. http://dx.doi.org/10.1007/978-3-642-22571-0_3
Zhu D., Gedeon T., Taylor K., “‘Moving to the centre’: A gaze-driven remote camera control for teleoperation,” Interacting with Computers, vol. 23, no. 1, pp. 85-95, 2011. http://dx.doi.org/10.1016/j.intcom.2010.10.003
Murphy H. A., Duchowski A. T., Tyrrell R. A., “Hybrid image/model-based gaze-contingent rendering,” ACM Trans. Appl. Percept., vol. 5, no. 4, pp. 1-21, 2009. http://dx.doi.org/10.1145/1462048.1462053
Kincaid, J. P., & Westerlund, K. K. (2009). "Simulation in education and training". Proceedings of the 2009 Winter Simulation Conference (WSC), pp. 273–280. http://dx.doi.org/10.1109/WSC.2009.5429337
Derryberry, A. “Serious games: online games for learning,” Adobe Whitepaper, November, 2007.
Okamura A. M., Richard C., Cutkosky M. R., “Feeling is believing: Using a force-feedback joystick to teach dynamic systems,” Journal of Engineering Education, vol. 91, no. 3, pp. 345-350, 2002. http://dx.doi.org/10.1002/j.2168-9830.2002.tb00713.x
Chan M. S., Black J. B., "Learning Newtonian mechanics with an animation game: The role of presentation format on mental model acquisition." Annual Meeting of the American Educational Research Association (AERA), San Francisco, 2006.
Chui C.-K., Ong J. S., Lian Z.-Y. et al., “Haptics in computer-mediated simulation: Training in vertebroplasty surgery,” Simulation & Gaming, vol. 37, no. 4, pp. 438-451, 2006. http://dx.doi.org/10.1177/1046878106291667
De Paolis L. T., "Serious Game for Laparoscopic Suturing Training." In: Sixth International Conference on Complex, Intelligent, and Software Intensive Systems (CISIS), pp. 481-485, 2012. http://dx.doi.org/10.1109/CISIS.2012.175
Jing Q., Yim-Pan C., Wai-Man P. et al., “Learning Blood Management in Orthopedic Surgery through Gameplay,” Computer Graphics and Applications, IEEE, vol. 30, no. 2, pp. 45-57, 2010. http://dx.doi.org/10.1109/MCG.2009.83
Goude D., Björk S., Rydmark M., “Game design in virtual reality systems for stroke rehabilitation,” Studies in health technology and informatics, vol. 125, no. 2007, pp. 146-148, 2007.
Delbressine F., Timmermans A., Beursgens L. et al., "Motivating arm-hand use for stroke patients by serious games." in Proceedings of the International Conference of the IEEE Engineering in Medicine and Biology Society, pp. 3564–3567, 2012 http://dx.doi.org/10.1109/EMBC.2012.6346736.
De Weyer T., Robert K., Renny Octavia Hariandja J. et al., “The Social Maze: A Collaborative Game to Motivate MS Patients for Upper Limb Training,” In Herrlich M., Malaka R., Masuch M. (Eds.) Entertainment Computing – ICEC 2012, LNCS vol. 7522, pp. 476-479, Berlin, Springer, 2012. http://dx.doi.org/10.1007/978-3-642-33542-6_57
Pernalete N., Edwards S., Gottipati R. et al., "Eye-hand coordination assessment/therapy using a robotic haptic device." 9th International Conference on Rehabilitation Robotics, ICORR; 2005; Chicago, Illinois. pp. 25–28, 2005 http://dx.doi.org/10.1109/ICORR.2005.1501043
Pernalete N., Tang F., Chang S. M. et al., "Development of an evaluation function for eye-hand coordination robotic therapy." In: 2011 IEEE International Conference on Rehabilitation Robotics (ICORR), pp. 1–6, 2011. http://dx.doi.org/10.1109/ICORR.2011.5975423
Yuan B., Folmer E., “Blind hero: enabling guitar hero for the visually impaired,” in Proceedings of the 10th international ACM SIGACCESS conference on Computers and accessibility, Halifax, Nova Scotia, Canada, 2008, pp. 169-176. http://dx.doi.org/10.1145/1414471.1414503
Nemec V., Sporka A., Slavik P., "Haptic and Spatial Audio Based Navigation of Visually Impaired Users in Virtual Environment Using Low Cost Devices," User-Centered Interaction Paradigms for Universal Access in the Information Society, Lecture Notes in Computer Science C. Stary and C. Stephanidis, eds., pp. 452-459: Springer Berlin Heidelberg, 2004. http://dx.doi.org/10.1007/978-3-540-30111-0_39
Yuan B., Folmer E., Harris, Jr. F., “Game accessibility: a survey,” Universal Access in the Information Society, vol. 10, no. 1, pp. 81-100, 2011. http://dx.doi.org/10.1007/s10209-010-0189-5
Hayward V., Astley O. R., Cruz-Hernandez M. et al., “Haptic interfaces and devices,” Sensor Review, vol. 24, no. 1, pp. 16-29, 2004. http://dx.doi.org/10.1108/02602280410515770
"Mice that work—and work wonders," http://www.microsoft.com/hardware/en-gb/touch-technology.
Schneider C., Mustufa T., Okamura A., “A magnetically-actuated friction feedback mouse,” Proceedings of EuroHaptics 2004, Munich, Germany, pp. 330-337, 2004.
Mackenzie I. S., “Movement characteristics using a mouse with tactile and force feedback,” Int. J. Human–Computer Studies, vol. 45, pp. 483-493, 1996. http://dx.doi.org/10.1006/ijhc.1996.0063
Wanjoo P., Sehyung P., Laehyun K. et al., "Haptic Mouse Interface Actuated by an Electromagnet." In: 2011 International Conference on Complex, Intelligent and Software Intensive Systems (CISIS), pp. 643–646, 2011. http://dx.doi.org/10.1109/CISIS.2011.107
Orozco M., Silva J., El Saddik A. et al., “The Role of Haptics in Games,” In Abdulmotaleb El Saddik (Ed.) Haptics Rendering and Applications, InTech, 2012. http://cdn.intechopen.com/pdfs-wm/26941.pdf
TN Games, http://tngames.com/.
Palan S., "Tactile Gaming Vest (TGV)," 2010. Online at http://iroboticist.com/2010/03/26/tgv/ accessed Oct. 31, 2014
Mohellebi H., Kheddar A., Espie S., “Adaptive Haptic Feedback Steering Wheel for Driving Simulators,” Vehicular Technology, IEEE Transactions on, vol. 58, no. 4, pp. 1654-1666, 2009. http://dx.doi.org/10.1109/TVT.2008.2004493
Hwang S., Ryu J.-H., "The Haptic Steering Wheel: Vibro-tactile based navigation for the driving environment." In: IEEE International Conference on Pervasive Computing and Communications Workshops (PERCOM Workshops), pp. 660-665, 2010. http://dx.doi.org/10.1109/PERCOMW.2010.5470517
"OpenHaptics Toolkit Datasheet," http://www.sensable.com/documents/documents/OpenHaptics_datasheet_hi.pdf.
Conti F., Barbagli F., Balaniuk R. et al., "The CHAI libraries." In: Proceedings of Eurohaptics 2003, pp. 496–500, 2003
"H3DAPI Datasheet," http://www.sensegraphics.com/datasheet/H3DAPI_datasheet.pdf.
"Reachin API," http://www.reachin.se/products/ReachinAPI/.
ReachinTechnologies. "HaptX," http://www.haptx.com.
De Pascale M., Prattichizzo D., “The Haptik Library: A Component Based Architecture for Uniform Access to Haptic Devices,” Robotics & Automation Magazine, IEEE, vol. 14, no. 4, pp. 64-75, 2007. http://dx.doi.org/10.1109/M-RA.2007.905747
Frisoli A., Loconsole C., Leonardis D. et al., “A New Gaze-BCI-Driven Control of an Upper Limb Exoskeleton for Rehabilitation in Real-World Tasks,” IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 42, no. 6, pp. 1169-1179, 2012. http://dx.doi.org/10.1109/TSMCC.2012.2226444
Troncossi M., Mozaffari Foumashi M., Mazzotti C. et al., “Design and Manufacturing of a Hand-and-Wrist Exoskeleton Prototype for the Rehabilitation of Post-Stroke Patients,” in Quaderni del DIEM–GMA. Atti della Sesta Giornata di Studio Ettore Funaioli, 2012, pp. 111-120.
Rosenberg L. B., "Virtual fixtures: Perceptual tools for telerobotic manipulation." In IEEE Virtual Reality Annual International Symposium, pp. 76-82, 1993. http://dx.doi.org/10.1109/VRAIS.1993.380795
Mylonas G. P, Kwok K.-W., James D. R. C. et al., “Gaze-Contingent Motor Channelling, haptic constraints and associated cognitive demand for robotic MIS,” Medical Image Analysis, vol. 16, no. 3, pp. 612-631, 2012. http://dx.doi.org/10.1016/j.media.2010.07.007
James D. R., Leff D. R., Orihuela-Espina F. et al., “Enhanced frontoparietal network architectures following “gaze-contingent” versus “free-hand” motor learning,” NeuroImage, vol. 64, pp. 267-276, 2013. http://dx.doi.org/10.1016/j.neuroimage.2012.08.056
Stark M., Benhidjeb T., Gidaro S. et al., “The future of telesurgery: a universal system with haptic sensation,” Journal of the Turkish German Gynecological Association, vol. 13, no. 1, pp. 74, 2012. http://dx.doi.org/10.5152/jtgga.2012.05
Despinoy F., Leon Torres J., Vitrani M.-A. et al., "Toward Remote Teleoperation with Eye and Hand: A First Experimental Study.", 2013. Online http://www.cascade-fp7.eu/cras2013/proceedings/cras2013_Despinoy.pdf accessed Nov. 1, 2014
Horng W.-B., Chen C.-Y., Chang Y. et al., "Driver fatigue detection based on eye tracking and dynamic template matching." In Proceedings of the IEEE International Conference on Networking, Sensing and Control, pp. 7–12, Taipei, Taiwan, 2004.
Rouzier B., Murakami T., "Gaze detection based driver modelization in an electric vehicle using virtual force field and Steer by Wire system." In 2014 IEEE 13th International Workshop on Advanced Motion Control (AMC), pp. 350-355, 2014. http://dx.doi.org/10.1109/AMC.2014.6823307
Kangas J., Akkil D., Rantala J. et al., "Gaze gestures and haptic feedback in mobile devices." In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '14), ACM, New York, NY, USA, pp. 435-438, 2014. http://doi.acm.org/10.1145/2556288.2557040
Kessler G. D., Hodges L. F., Walker N., “Evaluation of the CyberGlove as a whole-hand input device,” ACM Trans. Comput.-Hum. Interact., vol. 2, no. 4, pp. 263-283, 1995. http://dx.doi.org/10.1145/212430.212431
Ertan S., Lee C., Willets A. et al., "A wearable haptic navigation guidance system." In Digest of the Second International Symposium on Wearable Computers, IEEE Computer Society, Washington, DC, pp. 164-165, 1998. http://dx.doi.org/10.1109/ISWC.1998.729547
Minamizawa K., Fukamachi S., Kajimoto H. et al., "Gravity grabber: wearable haptic display to present virtual mass sensation." In ACM SIGGRAPH 2007 Emerging Technologies (SIGGRAPH '07), ACM, New York, NY, USA, Article 8, 2007. http://doi.acm.org/10.1145/1278280.1278289
Rantala J., Kangas J., Akkil D. et al., “Glasses with haptic feedback of gaze gestures,” In CHI '14 Extended Abstracts on Human Factors in Computing Systems, Toronto, Ontario, Canada, pp. 1597-1602, 2014. http://dl.acm.org/citation.cfm?id=2557040
Agustin J. S., Mateo J. C., Hansen J. P. et al., “Evaluation of the Potential of Gaze Input for Game Interaction,” PsychNology Journal, vol. 7, no. 2, 2009.
Jacob R. J., "What you look at is what you get: eye movement-based interaction techniques." In Proc. ACM CHI'90 Human Factors in Computing Systems Conference, Addison-Wesley/ACM Press, pp. 11-18, 1990. http://dx.doi.org/10.1145/97243.97246
Kumar M., Paepcke A., Winograd T., "EyePoint: practical pointing and selection using gaze and keyboard." In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '07), ACM, New York, NY, USA, pp. 421-430, 2007. http://doi.acm.org/10.1145/1240624.1240692
Zhai S., Morimoto C., Ihde S., "Manual and gaze input cascaded (MAGIC) pointing." In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '99), ACM, New York, NY, USA, pp. 246-253, 1999. http://doi.acm.org/10.1145/302979.303053
Fares R., Fang S., Komogortsev O., “Can we beat the mouse with MAGIC?,” in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Paris, France, 2013, pp. 1387-1390. http://dx.doi.org/10.1145/2470654.2466183
Stellmach S., Dachselt R., "Look & touch: gaze-supported target acquisition." In Proceedings of the Conference on Human Factors in Computing Systems (CHI 2012) Austin, TX, USA, pp. 2981-2990, 2012. http://dx.doi.org/10.1145/2207676.2208709
Stellmach S., Dachselt R., "Still looking: Investigating seamless gaze-supported selection, positioning, and manipulation of distant targets." In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI '13), ACM, New York, NY, USA, pp. 285-294, 2013. http://doi.acm.org/10.1145/2470654.2470695
Turner J., Bulling A., Alexander J. et al., "Cross-device gaze-supported point-to-point content transfer." In Proceedings of the Symposium on Eye Tracking Research and Applications (ETRA '14), ACM, New York, NY, USA, pp. 19-26, 2014. http://doi.acm.org/10.1145/2578153.2578155
De Boeck J., Raymaekers C., Coninx K., "Are existing metaphors in virtual environments suitable for haptic interaction." In Proceedings of the 7th International Conference on Virtual Reality, pp. 261-268, 2005.
Otaduy M. A., Lin M. C., “User-centric viewpoint computation for haptic exploration and manipulation,” in Proceedings of the conference on Visualization '01, San Diego, California, pp. 311-318. 2001. http://dx.doi.org/10.1109/VISUAL.2001.964526
Otaduy M. A., Lin M. C., “Introduction to haptic rendering,” in ACM SIGGRAPH 2005 Courses, Los Angeles, California, pp. 3. 2005. http://dx.doi.org/10.1145/1198555.1198603
O'Sullivan C., Dingliana J., “Collisions and perception,” ACM Trans. Graph., vol. 20, no. 3, pp. 151-168, 2001. http://dx.doi.org/10.1145/501786.501788
Henderson J. M., Shinkareva S. V., Wang J. et al., “Predicting Cognitive State from Eye Movements,” PloS one, vol. 8, no. 5, pp. e64937, 2013. http://dx.doi.org/10.1371/journal.pone.0064937
Doshi A., Trivedi M. M., “On the roles of eye gaze and head dynamics in predicting driver's intent to change lanes,” Intelligent Transportation Systems, IEEE Transactions on, vol. 10, no. 3, pp. 453-462, 2009. http://dx.doi.org/10.1109/TITS.2009.2026675
Kersten D., Yuille A., “Bayesian models of object perception,” Current opinion in neurobiology, vol. 13, no. 2, pp. 150-158, 2003. http://dx.doi.org/10.1016/S0959-4388(03)00042-4
Ferreira J. F., Lobo J., a Dias J., “Bayesian real-time perception algorithms on GPU,” Journal of Real-Time Image Processing, vol. 6, no. 3, pp. 171-186, 2011. http://dx.doi.org/10.1007/s11554-010-0156-7
Otaduy M. A., Garre C., Lin M. C., “Representations and Algorithms for Force-Feedback Display,” Proceedings of the IEEE, vol. 101, no. 9, pp. 2068-2080, 2013. http://dx.doi.org/10.1109/JPROC.2013.2246131
Saunders D. R., Woods R. L., “Direct measurement of the system latency of gaze-contingent displays,” Behavior research methods, pp. 1-9, 2013.
Han P., Saunders D. R., Woods R. L. et al., “Trajectory prediction of saccadic eye movements using a compressed exponential model,” Journal of Vision, vol. 13, no. 8, pp. 27, 2013. http://dx.doi.org/10.1167/13.8.27
Liu G., Lu K., “Networked multiplayer cooperative interaction using decoupled motion control method in a shared virtual environment with haptic, visual and movement feedback,” Computer Animation and Virtual Worlds, pp. 97-109, 2012.
Ritterfeld U., Shen C., Wang H. et al., “Multimodality and interactivity: Connecting properties of serious games with educational outcomes,” Cyberpsychology & Behavior, vol. 12, no. 6, pp. 691-697, 2009. http://dx.doi.org/10.1089/cpb.2009.0099
Ernst M. O., Banks M. S, “Humans integrate visual and haptic information in a statistically optimal fashion,” Nature, vol. 415, no. 6870, pp. 429-433, 2002. http://dx.doi.org/10.1038/415429a