
Gesture Based Interaction in Immersive Virtual Reality

2020, Engineering and Scientific International Journal - Divya Udayan J

https://doi.org/10.30726/esij/v7.i2.2020.72011

Recent developments in virtual reality (VR) interaction using 3D cameras and sensors such as the Kinect, range cameras, and the Leap Motion controller have opened opportunities for the development of human-computer interaction (HCI) applications. Hand gestures are one of the most popular ways for people to interact with computers, and automatic hand gesture recognition is a natural means of interacting with virtual reality systems. This paper focuses on the study and analysis of applications based on gesture interaction technology in virtual reality. Custom gestures for pointing, grasping, zoom in/out, and swap were defined and implemented in Unity 3D with the Leap Motion SDK. The effectiveness of the hand gestures was analyzed by recording user experience and through a questionnaire.

Divya Udayan J
Assistant Professor, Vellore Institute of Technology, Vellore, India.

Keywords — Hand Gesture; Leap Motion Controller; VR Environment

1. Introduction

With recent developments in virtual reality headsets such as the Oculus Rift/Go, HTC Vive, and Microsoft HoloLens, immersive virtual reality has been applied in various fields such as gaming, the entertainment industry, virtualization of historical monuments, data visualization, and medicine, notably rehabilitation in general and brain damage treatment in particular [1-3], providing an immersive experience to users. However, multimodality that enhances immersion is yet to be explored in depth; interaction methods and user-centered interfaces that satisfy higher levels of immersion and realism are still needed to increase the user's presence. An interaction-based feedback system and an experience environment that satisfy the senses of hearing and touch are important. In this regard, virtual reality technology is being combined with hardware systems such as treadmills and virtual reality gloves, alongside advances in head-mounted displays including the Oculus Rift CV1/Go, HTC Vive, and Samsung Odyssey.

Application studies on user interfaces in immersive virtual reality that allow direct interaction with the virtual environment, realistic control of objects, and haptic feedback based on it have been conducted from various viewpoints [4-6]. In addition, the recent development of the Leap Motion hand-tracking sensor [7] makes it possible to track hands precisely and enhance the VR interaction experience. Leap Motion enabled VR allows the user to explore, interact with, and manipulate scene objects as in the real world. In this paper we propose a methodology that combines Leap Motion with the Oculus Rift head-mounted set. We design a hand-motion sensor-based interaction method that accurately detects and tracks hand movements without additional worn equipment and maps various motions and gestures onto virtual hands. The dedicated controller provided with the VR head-mounted device (HMD) is used to design a controller-based interaction method that maps the controller's keys to the actual hand, providing enhanced immersion that feels like a real hand. The paper is organized as follows: Section 2 reviews related work; Section 3 presents the proposed methodology; Section 4 presents results and discussion; Section 5 concludes.

2. Related Work

The purpose of contact-free 3D human-computer interaction is to enhance presence by providing the user with realistic interaction with the environment or objects of immersive virtual reality, utilizing various senses of the human body such as vision, hearing, and touch. The systems in [8-10] support interaction using gaze, gestures, etc., enabling users to easily control movements, express behaviors, and receive realistic feedback on physical reactions in a wide range of virtual environments. Recently, the Oculus Touch and HTC Vive controller devices have been used to support accurate interactions: [11] proposed a grasping system supporting real-time interaction in a virtual environment, and in [12, 13] virtual objects are controlled on a mobile platform using gaze-based hand interaction.
Gestures and movements have also been analyzed by capturing hands through markers so that the behavior is reflected more directly in the virtual environment [14]. Interaction using haptic devices has been studied as well [15]: a haptic interaction system calculates the distance of hand movement and measures the force generated during object control in order to feed back experiences such as heaviness and the sense of touch [16-18]. Various interactive studies concern the data glove, a representative tactile interface for the hand; such systems provide physical feedback along with force measurement using mechanisms such as a combination of a wire drive and a manual spring [19, 20]. Recently, there has also been active research on analyzing the factors that can improve presence in terms of interaction [23, 24].

Therefore, we propose an interactive method that optimizes the interface through input methods that are accessible and familiar from the user's point of view, using existing popular technologies. It is also important to conduct an experiment to analyze how, and to what extent, presence changes. From this point of view, this paper aims to analyze the detailed factors that affect presence through an interaction method and interface that can increase the user's immersion in a minimal experience environment.

3. Proposed Methodology

In the proposed methodology, a sensor-based interaction approach handles inputs freely using the hands, without additional equipment, in the VR environment. In order to interact with the hands directly, it is necessary to accurately detect and track hand movements, and to classify and recognize motions and gestures based on them. Our system uses the Leap Motion device, which is widely used in VR interaction research. Earlier studies tracked optical markers worn on the hand and its surface to map the behavior onto a virtual hand model [23, 24].

For real-world applicability, we take advantage of the Leap Motion device, which provides a library that can be developed against in a game engine at low cost. The Leap Motion sensor is an input processing device consisting of two infrared cameras (the infrared recognition module) and infrared light sources (LEDs). With a small footprint of about 12.7 mm × 80 mm, it can easily be attached to the front surface of a virtual reality HMD such as the Oculus Rift or HTC Vive. When the user moves a hand in front of the infrared sensor, the movement is recognized at the level of individual finger joints and mapped accurately onto the hand in the virtual environment. Fig. 1 shows how the Leap Motion development tool integrates with the Unity 3D engine to create the development environment.

Fig. 1: Leap Motion interface integration with Unity 3D

Through the Leap recognition space, a three-dimensional hand joint model corresponding to the recognized hand can be controlled and synchronized. In addition, the functions provided by the Leap Motion development tool can be used to check the current state of the hand and fingers, and thereby to define gestures, as sketched below.
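As an illustration of this integration, the following sketch shows how a Unity script might poll the tracked hand once per frame through the Leap Motion Unity assets. It assumes the Leap Motion Core Assets API also used in the algorithms below (HandModel.GetLeapHand(), Hand, Finger.IsExtended); the class name and the logging are illustrative only, not part of the paper's implementation.

    using Leap;
    using Leap.Unity;
    using UnityEngine;

    // Polls the Leap-tracked hand once per rendered frame.
    public class HandStateReader : MonoBehaviour
    {
        public HandModel handModel;   // assigned in the Unity Inspector

        void Update()
        {
            Hand hand = handModel.GetLeapHand();
            if (hand == null) return;            // hand not currently tracked

            int extended = 0;
            foreach (Finger finger in hand.Fingers)
                if (finger.IsExtended) extended++;

            Debug.Log("Extended fingers: " + extended);
        }
    }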
3.1 User Interaction with VR Environment

We defined four gestures for interacting with any VR-based application. Table 1 details each gesture and its functionality.

Table 1. Customized hand gestures to interact with the VR application

    Gesture      | Function
    Pointing     | Points at a 3D object to perform a particular task, for example changing the object's color or interacting with a menu.
    Grasping     | Grasps an object to interact with it, e.g., rotating it or changing its position.
    Zoom in/out  | The zoom-in gesture increases the size of the object; the zoom-out gesture reduces it.
    Swap         | Moves the object.

The recognized hand model information (hand) is stored using the functions provided by the interaction engine of the Leap Motion development tool. The actions, such as grasping, opening, pointing at an object, and zooming the object in and out, are then defined. Algorithm 1 shows how the hand model is stored and interaction is initiated. First, the hand state is detected from the hand model function, as shown in Algorithm 1. The hand model returns three values: state_grasp, which represents whether the user has opened or closed the hand; state_point, which indicates that the user is pointing; and state_zoom, which indicates that the user has extended two fingers.

Table 2. Algorithm 1 (Hand_Model): detect the state of the hand

    Hand H = HandModel.GetLeapHand()
    Number_of_fingers = 0
    if H.IsRight == true then
        for i = 0 to 4 do
            if H.Fingers[i].IsExtended == true then
                Number_of_fingers++
            end if
        end for
        if Number_of_fingers == 0 then
            state_grasp = true
        else if H.Fingers[1].IsExtended == true and Number_of_fingers == 2 then
            state_zoom = true
        else if H.Fingers[1].IsExtended == true and Number_of_fingers == 1 then
            state_point = true
        end if
    end if
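A minimal C# rendering of Algorithm 1 is given below, assuming the same Leap Motion Unity API as the pseudocode (HandModel.GetLeapHand(), Hand.Fingers, Finger.IsExtended); the wrapping MonoBehaviour and the public state fields are illustrative.

    using Leap;
    using Leap.Unity;
    using UnityEngine;

    // Algorithm 1: classify the right hand's state from the
    // number of extended fingers reported by the Leap sensor.
    public class HandModelState : MonoBehaviour
    {
        public HandModel handModel;                     // assigned in the Inspector
        public bool stateGrasp, stateZoom, statePoint;  // read by Algorithm 2

        void Update()
        {
            stateGrasp = stateZoom = statePoint = false;

            Hand h = handModel.GetLeapHand();
            if (h == null || !h.IsRight) return;

            int numberOfFingers = 0;
            for (int i = 0; i < 5; i++)                 // thumb = 0 ... pinky = 4
                if (h.Fingers[i].IsExtended) numberOfFingers++;

            bool indexExtended = h.Fingers[1].IsExtended;

            if (numberOfFingers == 0)
                stateGrasp = true;                      // closed fist: grasp
            else if (indexExtended && numberOfFingers == 2)
                stateZoom = true;                       // index plus one more: zoom
            else if (indexExtended && numberOfFingers == 1)
                statePoint = true;                      // index only: point
        }
    }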
4. Result and Discussion

An experimental virtual reality application was created to analyze whether the user's hand-based actions in immersive virtual reality are both convenient and immersive, using the two interactions proposed for the comparative experiment on the presence of the hand-based interface. Fig. 2 shows the scene of the experimental application produced in this study. It comprises interaction processes using actions such as picking up an object and pointing toward an object. In order to analyze presence in hand-based interaction more accurately while still presenting a realistic experimental environment, the scene is composed of basic 3D objects rather than an application such as a game. Once the hand state has been detected using the hand interaction function, the corresponding actions are performed; Algorithm 2 details the hand interaction procedure.

Fig. 2: Experimental testbed interface in Unity 3D for sensor-based interaction

Table 3. Algorithm 2 (Hand_Interaction): interact with the hand model

    // state_grasp: grasp with the left/right hand
    // state_point: point with the left/right hand
    // state_zoom: zoom
    procedure Hand_Interaction(hand_state, hand_point, hand_zoom)
        grasp_count = check grasping state
        if state_grasp == true then
            if grasp_count == false then
                initiate grasp process
                grasp_count = true
            end if
        else if state_point == true then
            perform pointing action
        else if state_zoom == true then
            perform zoom action
        else if grasp_count == true then
            perform dropping (opening) action
        end if
    end procedure
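The dispatch in Algorithm 2 could be written in C# roughly as below. The Initiate*/Perform* methods stand in for the application-specific grasp, point, zoom, and drop behaviors and are not part of the Leap Motion API; resetting the grasp flag on dropping is our reading of the pseudocode.

    // Algorithm 2: route the detected hand state to the matching action.
    public class HandInteraction
    {
        private bool grasping;   // corresponds to grasp_count in Algorithm 2

        public void Dispatch(bool stateGrasp, bool statePoint, bool stateZoom)
        {
            if (stateGrasp)
            {
                if (!grasping)
                {
                    InitiateGrasp();              // attach object to the virtual hand
                    grasping = true;
                }
            }
            else if (statePoint) PerformPoint();  // e.g., menu or color change
            else if (stateZoom)  PerformZoom();   // resize the targeted object
            else if (grasping)
            {
                PerformDrop();                    // hand opened while holding
                grasping = false;
            }
        }

        private void InitiateGrasp() { /* application-specific */ }
        private void PerformPoint()  { /* application-specific */ }
        private void PerformZoom()   { /* application-specific */ }
        private void PerformDrop()   { /* application-specific */ }
    }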
The VR application was built and tested in Unity 3D. The PC used for the interface implementation and experiment was equipped with an Intel Core i7-8700 CPU, 32 GB of RAM, and a Quadro P5000 GPU. In the experiment we used the Oculus Rift HMD and its dedicated controller, Oculus Touch, to support the virtual reality experience. Fig. 3 shows the environment for experiencing virtual reality through the two interactions suggested in this study.

Fig. 3: Sensor-based interaction with the VR environment using Leap Motion and Oculus Rift

Sitting or standing in a standard-sized space (3 × 3 m in our experiment) gave a comfortable experience. The hand-motion sensor-based interaction recognizes the hand through the Leap Motion sensor attached to the front of the HMD; the controller-based interaction is set up by holding a dedicated controller in the hand.

4.1 Discussion

We analyzed the users' interaction experience by conducting a survey. While creating the questionnaire we considered all the factors that affect the interaction experience and expressed them as questions, focusing on three main categories: hand movement experience, finger movement experience, and interaction experience. A total of 22 participants from different age groups took part; ten had prior VR experience and the others were new to VR. Since the proficiency required for manipulating virtual objects in the proposed application can also affect presence, we asked ten people to first experience the hand-based sensor interaction and then the controller-based interaction. The questionnaire was prepared following [26] and [11].

Table 4. Questionnaire for the study of user interaction experience

    Q. No. | Question
    Hand movement experience
    1  | I felt like the virtual hands were my own hands.
    2  | I was able to feel the movements of the virtual hand as I moved my own hand.
    3  | I felt as if the movements of the virtual hands were influencing my own movements.
    4  | I felt as if the virtual hands had no correlation with my hand movements.
    Finger movement experience
    5  | I was able to move the virtual fingers as I intended to.
    6  | The virtual fingers interacted with the objects as I intended.
    7  | I felt the finger movement was not real.
    Interaction experience
    8  | I felt like I was grabbing the object as I intended to.
    9  | I found it hard to reach out to the objects.
    10 | I felt the finger movement while interacting with objects as if it were my own hand movement.
    11 | I found it difficult to understand the movement of the virtual hand.
    12 | I felt the interaction with objects as if it were real.
    13 | The fingers adapted properly to the different geometries.

These questions followed a Likert scale: strongly agree, agree, somewhat agree, neutral, somewhat disagree, disagree, and strongly disagree. The average response is calculated per category as the mean score over the $N$ participants:

$$R_{\text{hand\_movement}} = \frac{1}{N}\sum_{i=1}^{N} r_i^{\text{hand}}, \qquad R_{\text{finger\_movement}} = \frac{1}{N}\sum_{i=1}^{N} r_i^{\text{finger}}, \qquad R_{\text{interaction}} = \frac{1}{N}\sum_{i=1}^{N} r_i^{\text{int}},$$

where $r_i^{\text{hand}}$, $r_i^{\text{finger}}$, and $r_i^{\text{int}}$ are participant $i$'s scores on each category's questions, and $R_{\text{hand\_movement}}$, $R_{\text{finger\_movement}}$, and $R_{\text{interaction}}$ are the average user responses for the hand movement, finger movement, and interaction experience, respectively, in the VR environment. Finally, the overall response $R_{\text{overall\_response}}$ from the users is calculated as

$$R_{\text{overall\_response}} = \frac{R_{\text{hand\_movement}} + R_{\text{finger\_movement}} + R_{\text{interaction}}}{3}.$$
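As a sketch of this scoring step, assuming each participant's Likert answers have already been mapped to numeric scores, the category and overall averages could be computed as follows; the data layout is illustrative, not the paper's actual analysis code.

    using System.Linq;

    public static class ResponseScore
    {
        // Mean response for one category: each row holds one
        // participant's numeric answers to that category's questions.
        public static double CategoryAverage(double[][] scores) =>
            scores.Average(answers => answers.Average());

        // Overall response: mean of the three category averages.
        public static double Overall(double hand, double finger, double interaction) =>
            (hand + finger + interaction) / 3.0;
    }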
Fig. 4 shows the user responses: for questions Q1, Q2, Q6, Q8, Q10, and Q13 the average response lies between 1.8 and 2.3 on a 3-point scale. As these questions measure the hand and finger movement experience, most users had an immersive interaction experience. Fig. 5 shows the user experience in VR measured for hand movement, finger movement, and interaction with virtual objects: about 78.4% of the participants were able to move their hands and had an immersive hand movement experience, 31.9% were able to move their fingers, and 96% found it easy to interact with the objects, e.g., grasping, moving, and resizing them.

Fig. 4: Graphical representation of user responses

Fig. 5: Graphical representation of user VR interaction experience

5. Conclusion

In this paper we defined the gestures pointing, grasping, zoom in/out, and swap to interact with objects in a virtual environment using the Leap Motion controller and the Oculus Rift. User interaction experience with the VR environment using these gestures was then evaluated by recording the user experience through a questionnaire. Through the evaluation of the user responses we found that, overall, about 73% of the users had an immersive interaction experience in the VR environment. While collecting user responses we also observed that a few users found it difficult to reach out to objects; hence, to improve the immersive interaction, further gestures can be added, e.g., for moving VR cameras or changing the scene. This work can also be extended to extended reality (XR), where devices like Google Tilt Brush can be used for visualization of the VR environment.

Acknowledgment

The authors thank Vellore Institute of Technology for providing the 'VIT SEED GRANT' for carrying out this research work.

References

[1] J. S. McGee, C. Van der Zaag, J. G. Buckwalter, M. Thiébaux, A. Van Rooyen, U. Neumann, D. Sisemore, and A. A. Rizzo: Issues for the assessment of visuo-spatial skills in older adults using virtual environment technology. CyberPsychology & Behavior, vol. 3, no. 3, pp. 469–482, (2000).
[2] F. D. Rose, B. M. Brooks, and A. A. Rizzo: Virtual reality in brain damage rehabilitation: review. CyberPsychology & Behavior, vol. 8, no. 3, pp. 241–262, (2005).
[3] P. L. Weiss, R. Kizony, U. Feintuch, and N. Katz: Virtual reality in neurorehabilitation. Textbook of Neural Repair and Neurorehabilitation, vol. 2, pp. 182–197, (2006).
[4] H. Joo, T. Simon, and Y. Sheikh: Total capture: a 3D deformation model for tracking faces, hands, and bodies, in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), ser. CVPR '18, vol. abs/1801.01615. Washington, DC, USA: IEEE Computer Society, pp. 8320–8329, (2018).
[5] M. Kim, J. Lee, C. Jeon, and J. Kim: A study on interaction of gaze pointer-based user interface in mobile virtual reality environment, Symmetry, vol. 9, no. 9, p. 189, (2017).
[6] S. Marwecki, M. Brehm, L. Wagner, L.-P. Cheng, F. F. Mueller, and P. Baudisch: VirtualSpace – overloading physical space with multiple virtual reality users, in Proceedings of the CHI Conference on Human Factors in Computing Systems, ser. CHI '18. New York, NY, (2018).
[7] Leap Motion Homepage, https://www.leapmotion.com/.
[8] P. Lindemann and G. Rigoll: A diminished reality simulation for driver-car interaction with transparent cockpits, in IEEE Virtual Reality (VR). IEEE, pp. 305–306, (2017).
[9] T. Pfeiffer: Understanding Multimodal Deixis with Gaze and Gesture in Conversational Interfaces. Aachen, Germany: Shaker Verlag GmbH, (2011).
[10] T. Pfeiffer: Using virtual reality technology in linguistic research, in IEEE Virtual Reality Workshops (VRW). IEEE, pp. 83–84, (2012).
[11] S. Oprea, P. Martinez-Gonzalez, A. Garcia-Garcia, J. A. Castro-Vargas, S. Orts-Escolano, and J. Garcia-Rodriguez: A visually plausible grasping system for object manipulation and interaction in virtual reality environments. arXiv preprint arXiv:1903.05238, (2019).
[12] A. Henrysson, M. Billinghurst, and M. Ollila: Virtual object manipulation using a mobile phone, in Proceedings of the International Conference on Augmented Tele-existence, ser. ICAT '05. New York, NY, USA, pp. 164–171, (2005).
[13] S. Han and J. Kim: A study on immersion of hand interaction for mobile platform virtual reality contents, Symmetry, vol. 9, no. 2, p. 22, (2017).
[14] W. Zhao, J. Chai, and Y.-Q. Xu: Combining marker-based mocap and RGB-D camera for acquiring high-fidelity hand motion data, in Proceedings of the ACM SIGGRAPH/Eurographics Symposium on Computer Animation, ser. SCA '12. Aire-la-Ville, Switzerland, pp. 33–42, (2012).
[15] M. Kim, C. Jeon, and J. Kim: A study on immersion and presence of a portable hand haptic system for immersive virtual reality, Sensors, vol. 17, no. 5, p. 1141, (2017).