
The touchy subject of haptics

Communications of the ACM, January 2011, Vol. 54, No. 1 | DOI:10.1145/1866739.1866746

After more than 20 years of research and development, are haptic interfaces finally getting ready to enter the computing mainstream?

Ever since the first silent-mode cell phones started buzzing in our pockets a few years ago, many of us have unwittingly developed a fumbling familiarity with haptics: technology that invokes our sense of touch. Video games now routinely employ force-feedback joysticks to jolt their players with a sense of impending onscreen doom, while more sophisticated haptic devices have helped doctors conduct surgeries from afar, allowed deskbound soldiers to operate robots in hazardous environments, and equipped musicians with virtual violins.

Despite recent technological advances, haptic interfaces have made only modest inroads into the mass consumer market. Buzzing cell phones and shaking joysticks aside, developers have yet to create a breakthrough product: a device that would do for haptics what the iPhone has done for touch screens. The slow pace of market acceptance stems partly from typical new-technology growing pains: high production costs, the lack of standard application programming interfaces (APIs), and the absence of established user interface conventions. Those issues aside, however, a bigger question looms over this fledgling industry: What are haptics good for, exactly?

Computer scientists have been exploring haptics for more than two decades. Early research focused largely on the problem of sensory substitution: converting imagery or speech information into electric or vibratory stimulation patterns on the skin. As the technology matured, haptics found new applications in teleoperator systems and virtual environments, useful for robotics and flight simulators. Today, some researchers think the big promise of haptics may involve

moving beyond special-purpose applications to tackle one of the defining challenges of our age: information overload.

For many of us, a growing reliance on screen-based computers has long since overtaxed our visual senses. But the human mind comes equipped to process information simultaneously from multiple inputs, including the sense of touch. “People are not biologically equipped to handle the assault of information that all comes through one channel,” says Karon MacLean, a professor of computer science at the University of British Columbia. Haptic interfaces offer the promise of creating an auxiliary information channel that could offload some of the cognitive load by transmitting data to the human brain through a range of vibrations or other touch-based feedback.

“In the real world things happen on the periphery,” says Lynette Jones, a senior research scientist at Massachusetts Institute of Technology. “It seems like haptics might be a good candidate for exploiting that capability because it’s already a background sense.” As people consume more information on mobile devices, the case for haptics seems to grow stronger. “As screen size has become smaller, there is interest in offloading some information that would have been presented visually to other modalities,” says Jones, who also sees opportunities for haptic interfaces embedded in vehicles as early warning systems and proximity indicators, as well as more advanced applications in surgery, space, undersea exploration, and military scenarios.

While those opportunities may be real, developers will first have to overcome a series of daunting technical obstacles.
[Photographs by Steve Yohanan: About the size of a cat, the Haptic Creature produces different sensations in response to human touch. Inset: the Haptic Creature with furry skin.]

For starters, there is currently no standard API for the various force feedback devices on the market, although some recent efforts have resulted in commercial as well as open source solutions for developing software for multiple haptic hardware platforms.
And as haptic devices grow more complex, engineers will have to optimize for a much more diverse set of sensory receptors in the human body that respond to pressure, movement, and temperature changes.

As the range of possible touch-based interfaces expands, developers face a further hurdle in helping users make sense of all the possible permutations of haptic feedback. This lack of a standard “haptic language” may prove one of the most vexing barriers to widespread market acceptance. Whereas most people have by now formed reliable mental models of how certain software interfaces should work (keyboards and mice, touchpads, and touch screens, for example), the ordinary consumer still requires some kind of training to associate a haptic stimulation pattern with a particular meaning, such as the urgency of a phone call or the status of a download on a mobile device.

The prospect of convincing consumers to learn a new haptic language might seem daunting at first, but the good news is that most of us have already learned to rely on haptic feedback in our everyday lives, without ever giving it much thought. “We make judgments based on the firmness of a handshake,” says Ed Colgate, a professor of mechanical engineering at Northwestern University. “We enjoy petting a dog and holding a spouse’s hand. We don’t enjoy getting sticky stuff on our fingers.” Colgate believes that advanced haptics could eventually give rise to a set of widely recognized device behaviors that go well beyond the familiar buzz of cell phones. For now, however, the prospect of a universal haptic language seems a distant goal at best.
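The pattern-to-meaning association described above is easy to picture in code. The sketch below is purely illustrative and does not correspond to any shipping haptics API: it assumes a hypothetical driver that accepts a vibration pattern as alternating pulse/pause durations in milliseconds, and it encodes urgency in pulse count and tempo.

```python
# Illustrative only: a hypothetical haptic "vocabulary" mapping meanings to
# vibration patterns. Patterns are lists of alternating pulse/pause durations
# in milliseconds, a common convention for simple vibrotactile hardware.

HAPTIC_VOCABULARY = {
    # meaning            pattern: short, sparse buzzes = low urgency
    "download_complete": [80, 0],                       # one brief tick
    "incoming_call":     [300, 200, 300, 200, 300, 0],  # steady, insistent
    "urgent_alert":      [100, 50] * 5 + [400, 0],      # rapid burst, long tail
}

def render(meaning: str) -> list[int]:
    """Look up the vibration pattern for a given meaning.

    Falls back to a single generic buzz for unknown meanings, so an
    unrecognized event never silently produces no feedback.
    """
    return HAPTIC_VOCABULARY.get(meaning, [150, 0])

def duration_ms(pattern: list[int]) -> int:
    """Total playback time of a pattern; more urgent signals run longer."""
    return sum(pattern)
```

A real deployment would hand these patterns to a platform vibration service; the point of the sketch is only that each meaning gets a distinguishable, consistent tactile signature that users can learn.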
“Until we have a reasonably mature approach to providing haptic feedback, it’s hard to imagine something as sophisticated as a haptic language arising,” says Colgate, who believes that success in the marketplace will ultimately hinge on better systems integration, along the lines of what Apple has accomplished with the iPhone. “Today, haptics is thought of as an add-on to the user interface,” says Colgate. “It may enhance usability a little bit, but its value pales in comparison to things you can do with graphics and sound. In many cases, the haptics is so poorly implemented that people turn it off pretty quickly. And that’s not to criticize the developers of haptics—it’s just a tough problem.”

Many efforts to date have used haptics as a complementary layer to existing screen-based interfaces. MacLean argues that haptics should do more than just embellish an interaction already taking place on the screen. “A lot of times you’re using haptics to slap it on top of a graphical interaction,” she says. “But there can also be an emotional improvement, a comfort and delight in using the interface.”

Led by Ph.D. candidate Steve Yohanan, MacLean’s team has built the Haptic Creature, a device about the size of a cat that simulates emotional responses. Covered with touch sensors, the Haptic Creature creates different sensations, such as running hot or cold or stiffening its “ears,” in response to human touch. The team is exploring possible applications such as fostering companionship in older and younger people, or treating children with anxiety disorders. MacLean’s team has also developed an experimental device capable of buzzing in 84 different ways.
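The article does not say how the 84 distinct buzzes were constructed. One plausible way to build a large but systematic tactile vocabulary is a grid over a few independent signal parameters; the particular decomposition below (7 rhythms × 4 frequencies × 3 amplitudes = 84) is an assumption chosen only to illustrate the idea.

```python
# Illustrative only: one hypothetical way to enumerate 84 distinguishable
# vibration signals as a grid over rhythm, frequency, and amplitude.
# The specific parameter values are assumptions, not MacLean's design.
from itertools import product

RHYTHMS = ["solid", "pulse2", "pulse3", "ramp-up", "ramp-down",
           "heartbeat", "stutter"]
FREQUENCIES_HZ = [80, 150, 250, 320]   # within the skin's vibrotactile range
AMPLITUDES = ["low", "medium", "high"]

# Every combination is one candidate buzz in the vocabulary.
VOCABULARY = list(product(RHYTHMS, FREQUENCIES_HZ, AMPLITUDES))

def signal_id(rhythm: str, freq_hz: int, amplitude: str) -> int:
    """Stable integer ID for a signal, usable as a lookup key."""
    return VOCABULARY.index((rhythm, freq_hz, amplitude))
```

The design point the sketch makes is that a few independently perceivable parameters multiply into a large signal set, which is exactly what makes such a set feel language-like to learn.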
After giving users a couple of months to get familiar with the feedback by way of an immersive game, they found that the process of learning to recognize haptic feedback bore a great deal of similarity to the process of learning a language.

Students Build Green500 Supercomputer

A team of students at the University of Illinois at Urbana-Champaign (UIUC) has built an energy-efficient supercomputer that appeared on both the Green500 and Top500 lists. Named in honor of one of the UIUC campus’s main thoroughfares, the Green Street supercomputer placed third in the Green500 list of the world’s most energy-efficient supercomputers, with a performance of 938 megaflops per watt. It also placed 403rd in the Top500 list, a ranking of the world’s fastest supercomputers, with a performance of 33.6 teraflops.

The Green Street supercomputer grew out of an independent study course led by Bill Gropp, the Bill and Cynthia Saylor Professor of Computer Science, and Wen-mei Hwu, who holds the AMD Jerry Sanders Chair of Electrical and Computer Engineering. Approximately 15 UIUC undergraduate and graduate students helped build the supercomputer, which boasts a cluster of 128 graphics processing units donated by NVIDIA, and uses unorthodox supercomputer building materials, such as wood and Plexiglas.

The UIUC team hopes to increase the supercomputer’s energy efficiency by 10%–20% with better management of its message passing interface and several other key elements. “You really need to make sure that the various parts of your communications path, in terms of different software layers and hardware drivers and components, are all in tune,” says Hwu. “It’s almost like when you drive a car, you need to make sure that all these things are in tune to get the maximum efficiency.” The Green Street supercomputer is being used as a teaching and research tool.

—Graeme Stemp-Morlock
“The surprising thing is that people are able to quickly learn an awful lot and learn it without conscious attention,” says MacLean. “There’s a lot of potential for people to learn encoded signals that mean something not in a representational way but in an abstract way without conscious attention.”

To date, most low-cost haptic interfaces have relied exclusively on varying modes of vibration, taking advantage of the human skin’s sensitivity to movement. But vibration constitutes the simplest, most brute-force execution of haptic technology. “Unfortunately,” says Colgate, “vibration isn’t all that pleasing a sensation.” Some of the most interesting research taking place today involves expanding the haptic repertoire beyond the familiar buzz of the vibrating cell phone.

At MIT, Jones’ team has conducted extensive research into human body awareness and tactile sensory systems, examining the contribution of receptors in the skin and muscles to human perceptual performance. In one study, Jones demonstrated that users were unable to distinguish between two thermal inputs presented on a single finger pad; instead, they perceived them as a single stimulus, demonstrating the tendency of the thermal senses toward “spatial summation” rather than fine-tuned feedback.

Colgate’s research has focused on a fingertip-based interface that provides local contact information using new actuation technologies, including shear skin stretch, ultrasonic, and thermal actuators. By varying the friction in correspondence with fingertip motion across a surface, the interface can simulate the feeling of texture or a bump on the surface. Compared with force-feedback technology, vibrotactile stimulators, known as tactors, are much smaller and more portable, although high-performance tactors with wide bandwidths, small form factors, and independently controllable vibration frequency and amplitude are still hard to come by at a reasonable cost.
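The friction-modulation idea Colgate describes, making a flat surface feel bumpy, can be sketched with a simple physical model often used in surface haptics: to render a virtual bump of height h(x), the display exerts the lateral force a real bump would produce, f(x) = -N · dh/dx, where N is the finger's normal force. The bump profile and constants below are assumptions for illustration, not parameters of Colgate's device.

```python
# Illustrative sketch of friction-based shape rendering on a flat surface.
# To make the screen feel like it has a bump of height h(x), modulate the
# lateral force on the finger as f(x) = -N * dh/dx (normal force times
# surface slope, opposing the "climb"). Constants are assumed, not measured.
import math

FINGER_NORMAL_FORCE_N = 0.5   # assumed light touch, ~0.5 newtons
BUMP_HEIGHT_M = 0.002         # 2 mm virtual bump
BUMP_WIDTH_M = 0.01           # 1 cm characteristic width

def bump_height(x: float) -> float:
    """Virtual Gaussian bump centered at x = 0 (all lengths in meters)."""
    return BUMP_HEIGHT_M * math.exp(-(x / BUMP_WIDTH_M) ** 2)

def lateral_force(x: float) -> float:
    """Lateral force (newtons) the display should exert at position x.

    Negative on the near side of the bump (resisting the climb toward
    the peak) and positive on the far side (the push of descending).
    """
    # dh/dx of the Gaussian, computed analytically
    slope = -2 * x / BUMP_WIDTH_M**2 * bump_height(x)
    return -FINGER_NORMAL_FORCE_N * slope
```

Sweeping `lateral_force` across fingertip positions yields the force profile of a bump centered on a featureless screen, which is the essence of simulating texture through friction control.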
The Northwestern researchers have figured out how to make transparent force sensors that can capture tactile feedback on a screen, so that they can be combined with a graphical display. “My ideal touch interface is one that can apply arbitrary forces to the finger,” says Colgate, whose team has been approaching the problem by combining friction control with small lateral motions of the screen itself. By controlling the force on the finger, the system can make parts of the screen feel “magnetic,” so that a user’s finger is pulled toward them (up, down, left, or right), or let a user feel the outline of a button on the screen where none exists. Colgate’s team is also exploring how to develop devices that use multiple fingers, each on a different variable friction interface.

Looking ahead, Colgate believes the evolution of haptic interfaces may follow the trajectory of touch screens: a technology long in development that finally found widespread and relatively sudden acceptance in the marketplace. “The technology has to be sufficiently mature and robust, there has to be an active marketplace that creates competition and drives down costs, and it has to meet a real need.” As production costs fall and new standards emerge, as they almost certainly will, the marketplace for touch-based devices may yet come into its own. Until that happens, most of the interesting work will likely remain confined to the labs. And the future of the haptics industry seems likely to remain, well, a touchy subject.

Further Reading

Chubb, E.C., Colgate, J.E., and Peshkin, M.A.
ShiverPaD: A glass haptic surface that produces shear force on a bare finger, IEEE Transactions on Haptics 3, 3, July–Sept. 2010.

Ferris, T.K. and Sarter, N.
When content matters: The role of processing code in tactile display design, IEEE Transactions on Haptics 3, 3, July–Sept. 2010.

Jones, L.A. and Ho, H.-N.
Warm or cool, large or small? The challenge of thermal displays, IEEE Transactions on Haptics 1, 1, Jan.–June 2008.
MacLean, K.E.
Putting haptics into the ambience, IEEE Transactions on Haptics 2, 3, July–Sept. 2009.

Ryu, J., Chun, J., Park, G., Choi, S., and Han, S.H.
Vibrotactile feedback for information delivery in the vehicle, IEEE Transactions on Haptics 3, 2, April–June 2010.

Alex Wright is a writer and information architect who lives and works in Brooklyn, NY. Hong Z. Tan, Purdue University, contributed to the development of this article.

© 2011 ACM 0001-0782/11/0100 $10.00

Obituary
Watts Humphrey, Software Engineer: 1927–2010

Watts Humphrey, who distinguished himself as the “father of software quality engineering,” died on October 28 at age 83 at his home in Sarasota, FL. Humphrey combined business practices with software development, and brought discipline and innovation to the process of designing, developing, testing, and releasing software. “Watts had a profound impact on the field,” says Anita Carleton, director of the Software Engineering Process Management Program at the Carnegie Mellon Software Engineering Institute (SEI). “He was a visionary, a wonderful leader, and a wonderful man.”

After receiving B.S. and M.S. degrees in physics from the University of Chicago and the Illinois Institute of Technology, respectively, and an MBA from the University of Chicago, Humphrey went to work at IBM. There, he headed a team that introduced software licenses in the 1960s. Humphrey focused on how disciplined and experienced professionals, working as teams, could produce high-quality, reliable software within committed cost and schedule constraints. In 1986, after a 27-year career as a manager and executive at IBM, Humphrey joined SEI and founded the school’s Software Process Program.
He led the development of the Software Capability Maturity Model and eventually the Capability Maturity Model Integration (CMMI), a framework of software engineering best practices now used by thousands of organizations globally. Humphrey was also the author of 11 books, including Managing the Software Process. An ACM and SEI Fellow, he was awarded the National Medal of Technology in 2005.

—Samuel Greengard