
Human-Machine Real-Virtual Haptic Interaction Systems


António M. Lopes (1,3), Júlio Santos (3), Hugo Peixoto (3), A. Augusto Sousa (2,3), M. Teresa Restivo (1,3)
(1) UISPA-IDMEC, (2) INESC Porto, (3) Faculdade de Engenharia da Universidade do Porto
Rua Dr. Roberto Frias, 4200-465 Porto, Portugal
[email protected], [email protected], [email protected], [email protected], [email protected]

Abstract

This paper describes preliminary versions of two human-machine real-virtual haptic interaction systems: a unilateral real-virtual teleoperation system and a simple bilateral haptic game. Both applications were designed and implemented by computer engineering undergraduate students in collaboration with their teachers. The main objective of this work was to create a basis system that makes it possible to evaluate the effects of force feedback in applications involving interaction between human operators and virtual environments.

1. Introduction

A haptic device is a human-machine interface able to provide the user with tactile stimulation and force feedback [1]. It may be used to allow the operator to interact with either real or virtual environments. In the first case, the haptic device, driven by the operator, plays the role of a master in a master/slave force feedback application (e.g., telemanipulation or teleoperation). In the second case, the haptic device allows interaction with a software application (e.g., video games or simulators) [2]. Usually, visual and auditory information is combined with the haptic system to significantly improve the realism of the performed tasks [3, 4].

The majority of haptic devices transmit sensations to the hands and fingers, because they are mainly used in manipulation tasks. Therefore, the mechanical structure of a haptic device is as important as its technical specifications, such as the number of input and output degrees of freedom (DOF) and its force capability. These features define the appropriateness of a device for a specific application [2].

In this paper, preliminary versions of two human-machine real-virtual haptic interaction systems are presented. The first one is a unilateral real-virtual teleoperation system, in which the human operator drives a virtual manipulator using a real haptic device as the interface. The second one is a simple bilateral haptic game, in which a real haptic device allows a human operator to interact with a virtual environment. The two applications were designed and implemented by computer engineering undergraduate students at the School of Engineering of the University of Porto, Portugal, in collaboration with their teachers. The main objective of this work was to create a basis system that makes it possible to evaluate the effects of using haptic devices (and, more generally, force feedback) in applications involving interaction between human operators and virtual environments, and to gain insight into the benefits and problems associated with teleoperated robotic tasks.

This paper is organized as follows. Section 2 presents the unilateral real-virtual teleoperation system. The bilateral haptic game is presented in Sec. 3. Conclusions are drawn in Sec. 4.

2. The Unilateral Real-Virtual Teleoperation System

Robots have many advantages over humans, including tireless vigilance, increased precision, fast processing, and fast response. Humans, on the other hand, can quickly develop and use models that predict system behavior, and have access to a rich set of sensory inputs.
Thus, a challenge in man-machine collaborative robotic systems is to combine the advantages of robots and humans in carrying out a given task [5, 6]. Generally speaking, haptic man-machine collaborative robotic systems may include:

• haptic cooperative manipulation systems, in which the robot and the operator simultaneously grasp and manipulate a tool or object;
• haptic manipulation systems for training, in which a kinesthetic interface is placed between a human operator and a virtual environment;
• haptic teleoperated manipulation systems, in which a remote slave robot follows the motions of a master human-operated device.

Teleoperation has potential benefits in many applications, ranging from material handling in hazardous or inaccessible environments (e.g., nuclear, space, chemical, and biological environments) to minimally invasive surgery and telemedicine [6]. In particular, force-reflecting, or haptic, teleoperation systems are far more useful, because force feedback provides operators with more complete information, increasing their sense of being present at the slave site (telepresence). A key issue in a teleoperation system is transparency, i.e., the ability of the system to present the undistorted dynamics of the environment to the user.

To gain insight into the benefits and problems associated with teleoperated robotic tasks, a real-virtual teleoperation application was developed. The real-virtual haptic teleoperation system comprises two robotic manipulators: a real master and a virtual slave. The master robot is a commercial 3-DOF Phantom Haptic device, model 1.5HF, from Sensable Inc. (USA), which is driven by the operator. The slave robot is a virtual replica of the master (Fig. 1).

Fig. 1. Real and virtual manipulators

In this first version, the teleoperation robotic system uses unilateral communication between the master and slave robots: communication takes place in only one direction, with position references transmitted from the master to the slave. A displacement scale factor may be used. As force feedback is essential for the operator, not only for safety reasons but also because some tasks are almost impossible to execute without it, a further version of the teleoperation system, allowing bilateral communication between master and slave, is under development. The operator will then be able to feel the appropriate force feedback whenever the virtual slave manipulator touches its virtual environment.

2.1. Real-Virtual Teleoperation System: implementation details

The purpose of the first application was to provide a basic playground for familiarization with the haptic device technology. The device used in this study ships with a low-level API, named HDAPI, which provides the means to read the coordinates of the device, including information about its current state, and, in the opposite direction, to change the voltage of each motor through a DAC.
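Although the paper does not reproduce the actual initialization code, the usual HDAPI pattern is a device handle plus a high-priority scheduler callback. The C sketch below reads the base joint and gimbal angles on each servo tick; the calls used (hdInitDevice, hdStartScheduler, hdScheduleAsynchronous, and hdGetDoublev with HD_CURRENT_JOINT_ANGLES / HD_CURRENT_GIMBAL_ANGLES) are standard OpenHaptics HDAPI entry points, but the surrounding structure is a minimal assumption, not the authors' code.

#include <HD/hd.h>
#include <stdio.h>

/* Runs in the high-frequency haptic thread; samples the three base
   joint angles and the three gimbal angles (in radians). */
HDCallbackCode HDCALLBACK readStateCallback(void *userData)
{
    HDdouble joints[3], gimbals[3];

    hdBeginFrame(hdGetCurrentDevice());
    hdGetDoublev(HD_CURRENT_JOINT_ANGLES, joints);
    hdGetDoublev(HD_CURRENT_GIMBAL_ANGLES, gimbals);
    hdEndFrame(hdGetCurrentDevice());

    /* ...push the six angles to the virtual model here... */

    return HD_CALLBACK_CONTINUE;  /* keep the callback scheduled */
}

int main(void)
{
    HHD hHD = hdInitDevice(HD_DEFAULT_DEVICE);
    if (hdGetError().errorCode != HD_SUCCESS) {
        fprintf(stderr, "Failed to initialize haptic device\n");
        return 1;
    }

    hdStartScheduler();
    hdScheduleAsynchronous(readStateCallback, NULL,
                           HD_DEFAULT_SCHEDULER_PRIORITY);

    /* ...graphics/main loop runs here; on shutdown call
       hdStopScheduler() and hdDisableDevice(hHD)... */
    return 0;
}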
There were three main steps in the development of this application.

Firstly, the machine was to be modeled, using any 3D modeling software, with special regard to the length ratios and to the topology of its arms and joints. At this phase, one should be able to visualize and control the articulations of the model using the keyboard. Using common measuring techniques and the Blender software [7] as a modeling tool, this phase was completed successfully and, as can be seen in Fig. 2, the results are very detailed.

Fig. 2. Virtual environment showing the slave manipulator

The computer graphics library of choice was OpenGL [8, 9], following the recommendation of Sensable Inc., whose higher-level library API is designed to be similar to OpenGL. That library, HLAPI, was to be used in the second application, so OpenGL was adopted in the implementation of both programs.

Articulation management was made possible by a simple library that loads, from a human-readable file, a set of building blocks, each composed of an identifier and a path to a 3DS model created with Blender. It also loads the articulation definitions, which specify the child and parent blocks, the docking position on each of them, and an identifier, so that each angle can be controlled programmatically.

The second step involved mapping the physical device state onto the previously created virtual model, which had to react in real time to changes in the arm's position and rotation. After proper initialization and calibration, HDAPI allows the programmer to read the device state directly in several different coordinate spaces. One of those spaces reflects the current state of each joint in the device, including the gimbal angles. The resulting mapping proved quite fluid, displaying no signs of delay or frame loss.

Finally, as the last stage of development, the opposite functionality was added: the ability to control the physical device through the virtual model, using the keyboard. The library used does not provide methods to directly set the articulation angles. Instead, it offers a method that changes the voltage of each of the three main joints. This allowed the development of a simple servo mechanism, which receives the set of desired articulation angles and changes the voltage of each joint until it converges to the desired position. The application maintains a list of the three desired articulation angles, which can be altered by the user. Using a callback mechanism, the program periodically checks whether the current physical device angles differ from those specified by the user. If they do, a small voltage is applied to the corresponding joint in the appropriate direction, causing the device to converge to the indicated position with a negligible error margin. The applied value was obtained from the following formula, applied to each of the three articulations:

DAC_i = DAC_i^base × (Δα / 20 + 0.5)

where Δα is the difference between the current angle and the desired one, in degrees, and DAC_i^base is a base value, dimensioned so that articulations with vertical movement have a generally higher voltage applied to them, in order to compensate for gravity. The base values were determined empirically; throughout the process it was observed that some values could generate instability in the device. Having tweaked those base values, the third step of the first application was completed successfully.
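A minimal sketch of how such a servo callback might look, assuming the formula above and HDAPI's raw motor interface (HD_CURRENT_MOTOR_DAC_VALUES, written with hdSetLongv). The base values, the sign handling of Δα, and the setup (force output enabled with hdEnable(HD_FORCE_OUTPUT), callback registered as in the earlier sketch) are illustrative assumptions, not the authors' code:

#include <HD/hd.h>
#include <math.h>

/* Desired joint angles in degrees, updated by the keyboard handler. */
static HDdouble desiredDeg[3] = { 0.0, 0.0, 0.0 };

/* Empirical per-joint base DAC values (placeholder numbers); joints
   that move vertically get larger bases to offset gravity. */
static const double DAC_BASE[3] = { 400.0, 900.0, 700.0 };

HDCallbackCode HDCALLBACK servoCallback(void *userData)
{
    HDdouble joints[3];
    HDlong dac[3];
    int i;

    hdBeginFrame(hdGetCurrentDevice());
    hdGetDoublev(HD_CURRENT_JOINT_ANGLES, joints);

    for (i = 0; i < 3; ++i) {
        /* Error between desired and current angle, in degrees. */
        double deltaDeg = desiredDeg[i] - joints[i] * (180.0 / 3.14159265358979);
        /* DAC_i = DAC_i^base * (|delta| / 20 + 0.5), signed toward the
           target; the 0.5 term keeps a small holding torque near it. */
        double v = DAC_BASE[i] * (fabs(deltaDeg) / 20.0 + 0.5);
        dac[i] = (HDlong)(deltaDeg >= 0.0 ? v : -v);
    }

    hdSetLongv(HD_CURRENT_MOTOR_DAC_VALUES, dac);
    hdEndFrame(hdGetCurrentDevice());
    return HD_CALLBACK_CONTINUE;
}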
The methods described here for interacting with the Phantom are not suitable for general programming of haptic environments, as the purpose of the library used is to provide the developer with low-level access to the device. The proper methods for that kind of development are described in the next section.

3. The Real-Virtual Bilateral Haptic Game

Simulators comprising haptic feedback have been used in the last few years in a variety of training applications [10]. Typically, the simulator comprises a haptic device as an interface between the human operator and a software virtual reality (VR) application. The human operator uses the haptic device both to interact with the VR application and to receive the force feedback resulting from the interaction. This kind of simulator has been adopted in many practical applications for safety, ethical, and economic reasons.

The implemented real-virtual bilateral haptic game may be regarded as a haptic simulator for training. It was developed as a basis system to investigate the benefits of using haptic devices in generic training tasks. The game uses the already described Phantom Haptic device as the interface between the operator and the VR environment. Balloons are launched and the operator has to catch them. When the operator touches a balloon, he feels the impact at his hand through the haptic interface. Catching a balloon increases the operator's personal score by a predefined number of points. This application illustrates a simple bilateral interaction task: motion is imposed by the operator, and force feedback is received.

It should be noted that two major problems must be solved to implement this kind of application. First, it is necessary to develop realistic graphical environments for a given task. Second, a mathematical model of the environment must be obtained in order to compute realistic force feedback.

3.1. Real-Virtual Bilateral Haptic Game: implementation details

The first application provided the familiarization with the development environment needed to proceed to the current one. This application, in the shape of a computer game, poses a technical challenge: a virtual environment must be mapped into the device's force feedback system. The game scene consists of a set of sphere-shaped balloons that show up at the bottom of the screen and float upwards. The player controls a cursor that, when pressed against a balloon with enough pressure, pops it. There are four types of balloons, each with a different color, size, and stiffness. This variety allows the operator to explore in greater detail the possibilities offered by the Phantom Haptic device.

For this task, Sensable Inc. provides a higher-level API, HLAPI, that simplifies the creation of virtual objects in a manner similar to OpenGL primitives. Initialization details are not covered in this paper, as they are thoroughly described in the OpenHaptics Toolkit Programmer's Guide [11].

In a usual OpenGL application there is a display method, called on each frame render, where the primitives to be drawn are defined. A simple routine that creates a square on the OpenGL scene is presented:

void display()
{
    glBegin(GL_POLYGON);
    glVertex3f(0, 0, 0);
    glVertex3f(1, 0, 0);
    glVertex3f(1, 1, 0);
    glVertex3f(0, 1, 0);
    glEnd();
}

In order to create the same square in the haptic device's virtual world, so that the user can interact with it, the programmer just has to add initialization and finalization statements around the square definition:

void display()
{
    hlBeginFrame();
    hlBeginShape(HL_SHAPE_DEPTH_BUFFER, sId);
    glBegin(GL_POLYGON);
    glVertex3f(0, 0, 0);
    glVertex3f(1, 0, 0);
    glVertex3f(1, 1, 0);
    glVertex3f(0, 1, 0);
    glEnd();
    hlEndShape();
    hlEndFrame();
}

A unique shape identifier (sId above) must be referenced in the initialization call, so that the haptic engine can recognize changes in each shape from frame to frame.
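The identifier itself is typically reserved once, at setup time, with HLAPI's hlGenShapes; a one-line sketch (the variable name sId simply matches the listing above):

/* At setup time, reserve one shape id for the square: */
HLuint sId = hlGenShapes(1);
/* ...release it at shutdown with hlDeleteShapes(sId, 1). */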
The hlBeginFrame() and hlEndFrame() statements tell the engine that a new frame is being defined. This simple change is sufficient to map the square into the haptic device, allowing the user to interact with it.

Using this method, the game base platform was developed: the user could feel the balloons touching the cursor as they floated up. To implement the balloon-popping feature, the set of properties made available by HLAPI was investigated. With each shape, the developer can associate a set of properties that define its material: stiffness, the spring constant in Hooke's law; damping, the viscous damping coefficient; friction, both the static and dynamic friction coefficients; and popthrough, the only property without a real-world counterpart. Popthrough defines how hard the device must push against a surface before it pops through to the other side. Correctly tweaking this property made the user feel like he was actually popping a balloon.
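In HLAPI these material properties are set with hlMaterialf before the shape is rendered inside the haptic frame. A minimal sketch for a soft, poppable balloon; the numeric values and the balloonId identifier are illustrative guesses, not the values tuned for the game:

/* Material for a soft balloon (all parameters in the 0..1 range). */
hlMaterialf(HL_FRONT_AND_BACK, HL_STIFFNESS,        0.3f); /* soft spring    */
hlMaterialf(HL_FRONT_AND_BACK, HL_DAMPING,          0.1f); /* light damping  */
hlMaterialf(HL_FRONT_AND_BACK, HL_STATIC_FRICTION,  0.2f);
hlMaterialf(HL_FRONT_AND_BACK, HL_DYNAMIC_FRICTION, 0.2f);
hlMaterialf(HL_FRONT_AND_BACK, HL_POPTHROUGH,       0.5f); /* force to pop   */

hlBeginShape(HL_SHAPE_DEPTH_BUFFER, balloonId);
/* ...draw the balloon sphere with OpenGL calls here... */
hlEndShape();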
Fig. 3. Virtual environment of the balloons game

During tests of the final application, shown in Fig. 3, a small usability issue was detected. The computer screen is mapped into a 2D coordinate system, while the Phantom device operates in a 3D one. This may impair the user's depth perception when maneuvering the cursor back and forth, since one dimension is missing. In this application, the user needs a few moments of experimenting with the game in order to become familiar with the navigation. This issue may have harsher consequences in a more complex environment, and it must be dealt with. A possible solution may lie in the use of cast shadows to induce depth perception [12].

4. Conclusion

This paper presented preliminary versions of two human-machine real-virtual haptic interaction systems. The two applications have been designed and implemented to investigate the benefits of force feedback in applications involving interaction between human operators and virtual environments. The work will be followed by new developments, which shall include more realistic and helpful VR applications. In the same context, but from a slightly different perspective, some of the authors are using haptic devices in remote laboratories [13], in order to increase the realism of experiments in the area of mechanical engineering.

5. References

[1] R. S. Kalawsky, The Science of Virtual Reality and Virtual Environments, Addison-Wesley, 2004.
[2] L. Machado, J. Mendes, A. Lopes, B. Sales, T. Pereira, D. Souza, M. Restivo, R. Moraes, "A Remote Access Haptic Experiment for Mechanical Material Characterization", Proc. of the 8th Portuguese Conference on Automatic Control, Portugal, 2008, pp. 870-874.
[3] P. Buttolo, R. Oboe, B. Hannaford, "Architectures for Shared Haptic Virtual Environments", Computers and Graphics, 21, 1997, pp. 421-429.
[4] G. Robles-De-La-Torre, "The Importance of the Sense of Touch in Virtual and Real Environments", IEEE MultiMedia, 13, 2006, pp. 24-30.
[5] M. Steele, B. Gillespie, "Shared Control Between Human and Machine: Using a Haptic Interface to Aid in Land Vehicle Guidance", Proc. of the Human Factors and Ergonomics Society 45th Annual Meeting, USA, 2001, pp. 1671-1675.
[6] J. Abbott, A. Okamura, "Stable Forbidden-Region Virtual Fixtures for Bilateral Telemanipulation", Journal of Dynamic Systems, Measurement and Control, 128, 2006, pp. 53-64.
[7] www.blender.org.
[8] D. Shreiner, M. Woo, J. Neider, T. Davis, OpenGL Programming Guide: The Official Guide to Learning OpenGL, Version 2, Addison-Wesley, 2005.
[9] www.opengl.org.
[10] R. Moraes, L. Machado, "Fuzzy Bayes Rule for On-Line Training Assessment in Virtual Reality Simulators", Journal of Multiple-Valued Logic and Soft Computing, 14, 2008, pp. 325-338.
[11] Sensable Technologies, OpenHaptics Toolkit Programmer's Guide, 2005.
[12] D. J. Kersten, P. Mamassian, D. Knill, "Moving cast shadows generate illusory object trajectories", Investigative Ophthalmology and Visual Science, 32, 1991, p. 1179.
[13] M. Restivo, J. Mendes, A. Lopes, C. Silva, R. Magalhães, M. Chouzal, "E-Teaching Mechanical Material Characteristics", Proc. of M2D'2006, 5th Int. Conf. on Mechanics and Materials in Design, Porto, Portugal, 2006.