Use of Ecological Gestures in Soccer Games Running on Mobile Devices
DOI: https://doi.org/10.17083/ijsg.v1i4.48

Keywords: ecological interactions, gesture recognition, mobile serious games

Abstract
The strong integration of "intelligent mobile devices" into modern societies offers great potential for the widespread distribution of mobile serious games. As with Virtual Reality-based systems, these serious games must be validated ecologically in order to be useful and efficient. In this context, this paper addresses the use of ecological interactions in a mobile serious game. We exploit a wearable insole to let users interact with a virtual soccer game via real-world soccer movements. We analyze the concept of ecological interactions and detail the system used for recognizing ecological gestures. A preliminary study showed that the proposed system can be exploited for real-time gesture recognition on a mobile device.
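The abstract does not detail the recognition pipeline itself, but a common approach for this kind of real-time, accelerometer-based gesture recognition is template matching with dynamic time warping (DTW): an incoming sensor trace is compared against one recorded template per gesture, and the closest template wins. The sketch below is purely illustrative (the gesture labels, traces, and thresholds are invented for the example, not taken from the paper):

```python
def dtw_distance(a, b):
    """Dynamic-time-warping distance between two 1-D sensor traces."""
    n, m = len(a), len(b)
    inf = float("inf")
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])          # local mismatch
            cost[i][j] = d + min(cost[i - 1][j],  # insertion
                                 cost[i][j - 1],  # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

def classify(sample, templates):
    """Label of the template closest to `sample` under DTW."""
    return min(templates, key=lambda label: dtw_distance(sample, templates[label]))

# Hypothetical insole-accelerometer traces for two foot gestures.
templates = {
    "kick":     [0.0, 0.2, 1.0, 2.5, 1.0, 0.2, 0.0],
    "sidestep": [0.0, 0.5, 0.5, 0.6, 0.5, 0.5, 0.0],
}

print(classify([0.0, 0.1, 0.9, 2.4, 1.1, 0.1], templates))  # kick
```

Plain DTW is quadratic in trace length; on a mobile device one would typically downsample the signal or use a linear-time approximation such as FastDTW to keep recognition within real-time budgets.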