Katherine J. Kuchenbecker is an Associate Professor in Mechanical Engineering and Applied Mechanics at the University of Pennsylvania, with a secondary appointment in Computer and Information Science. She directs the Penn Haptics Group, which is part of the General Robotics, Automation, Sensing, and Perception (GRASP) Laboratory. In this interview, she tells us about her research, which centers on the design and control of haptic interfaces for applications such as robot-assisted surgery, medical simulation, stroke rehabilitation, and personal computing.
CIRCUIT CELLAR: When did you first become interested in haptics and why did you decide to pursue it?
KATHERINE: I chose to become an engineer because I wanted to create technology that helps people. Several topics piqued my interest when I was pursuing my undergraduate degree in mechanical engineering at Stanford, including mechatronics, robotics, automotive engineering, product design, human-computer interaction, and medical devices. I was particularly excited about areas that involve human interaction with technology. Haptics is the perfect combination of these interests because it centers on human interaction with real, remote, or virtual objects, as well as robotic interaction with physical objects.
My first exposure to this field was a “haptic paddle” lab in a Stanford course on system dynamics, but that alone wouldn’t have been enough to make me fall in love with it. Instead, it was conversations with Günter Niemeyer, the professor who advised my PhD at Stanford. I knew I wanted a doctorate so that I could become a faculty member myself, and I was inspired by the work he had done as an engineer at Intuitive Surgical, Inc., the maker of the da Vinci system for robotic surgery. Through my early research with Günter, I realized that it is incredibly satisfying to create computer-controlled electromechanical systems that enable the user to touch virtual objects or control a robot at a distance. I love demonstrating haptic systems because people make such great faces when they feel how the system responds to their movements. Another great benefit of studying haptics is that I get to work on a wide variety of applications that could potentially impact people in the near future: robotic surgery, medical training, stroke rehabilitation, personal robotics, and personal computing, to name a few.
CIRCUIT CELLAR: What is haptography? What are its benefits?
KATHERINE: I coined the term “haptography” (haptic photography) to proclaim an ambitious goal for haptics research: we should be able to capture and reproduce how surfaces feel with the same acuity that we can capture and reproduce how surfaces look.
When I entered the field of haptics in 2002, a lot of great research had been done on methods for letting a user feel a virtual three-dimensional shape through a stylus or thimble. Essentially, the user holds on to a handle attached to the end of a lightweight, back-drivable robot arm; the 3D Systems Touch device is the most recent haptic interface of this type. A computer measures the motion that the person makes and constantly outputs a three-dimensional force vector to give the user the illusion that they are touching the object shown on the screen. I was impressed with the haptics demonstrations I tried back in 2002, but I was also deeply disappointed with how the virtual surfaces felt. Everything was soft, squishy, and indistinct compared to how real objects feel. That’s one of the benefits of being new to a field; you’re not afraid to question the state of the art.
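To make that rendering loop concrete, here is a minimal sketch that displays the simplest possible virtual object, a flat wall, as a stiff spring acting on the measured stylus position. The hd_read_position() and hd_write_force() calls are hypothetical stand-ins for a real device driver, and the stiffness value is illustrative, not any particular product’s API or parameters.

```c
/*
 * Minimal sketch of an impedance-style haptic rendering loop for a
 * virtual flat wall. The device calls are hypothetical stubs; a real
 * interface would talk to hardware at roughly 1 kHz.
 */
#include <stdio.h>

typedef struct { double x, y, z; } vec3;

/* Hypothetical device-driver stubs. */
static vec3 hd_read_position(void) { vec3 p = { 0.0, -0.002, 0.0 }; return p; }
static void hd_write_force(vec3 f) { printf("F = (%.2f, %.2f, %.2f) N\n", f.x, f.y, f.z); }

int main(void)
{
    const double k = 800.0;     /* virtual wall stiffness, N/m (illustrative) */
    const double wall_y = 0.0;  /* wall surface lies at y = 0                 */

    /* Servo loop: one simulated second at 1 kHz. */
    for (int tick = 0; tick < 1000; ++tick) {
        vec3 p = hd_read_position();
        vec3 f = { 0.0, 0.0, 0.0 };

        double penetration = wall_y - p.y;  /* > 0 when the stylus is inside the wall */
        if (penetration > 0.0)
            f.y = k * penetration;          /* spring force pushes the stylus out     */

        hd_write_force(f);
    }
    return 0;
}
```

A pure spring like this is exactly what tends to feel “soft and squishy”: the force ramps up only as the stylus sinks in, with none of the crisp transient of tapping a real object, which is the problem the next answer addresses.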
I started working to improve this situation as a doctoral student, helping invent a way to make hard virtual surfaces like wood and metal feel really hard and realistic. The key was understanding that the human haptic perceptual system keys in on transients instead of steady-state forces when judging hardness. I had to write a research statement to apply for faculty positions at the end of 2005, so I wrote all about haptography. Rather than trying to hand-program how various surfaces should feel, I wanted to make it all data driven. The idea is to use motion and force sensors to record everything a person feels when using a tool to touch a real surface. We then analyze the recorded data to make a model of how the surface responds when the tool moves in various ways. As with hardness, high-frequency vibration transients are also really important to human perception of texture, which is a big part of what makes different surfaces feel distinct. Standard haptic interfaces weren’t designed to output high-frequency vibrations, so we typically attach a voice-coil actuator (much like an audio speaker) to the handle, near the user’s fingertips. When the user is touching a virtual surface, we output data-driven tapping transients, friction forces, and texture vibrations to try to fool them into thinking they are touching the real surface from which the model was constructed.
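As an illustration of the transient idea, the sketch below plays a brief, decaying high-frequency burst through a voice-coil actuator at the moment of contact, with amplitude scaled by the incoming speed. The constants and the vc_write() output call are assumptions for illustration, not the group’s measured model parameters.

```c
/*
 * Sketch: superimpose a short, decaying sinusoidal transient when the
 * virtual tool first strikes a hard surface, scaled by impact speed.
 * All constants and the vc_write() call are illustrative assumptions.
 */
#include <math.h>
#include <stdio.h>

#define RATE_HZ 1000.0                   /* haptic update rate, samples/s */
static const double PI = 3.14159265358979;

/* Hypothetical voice-coil command, in newtons at the handle. */
static void vc_write(double f) { printf("%+.4f N\n", f); }

int main(void)
{
    double v_impact = 0.25;              /* stylus speed at contact, m/s (example) */
    double amp = 4.0 * v_impact;         /* transient amplitude: N per (m/s)       */
    double f0 = 300.0;                   /* transient center frequency, Hz         */
    double tau = 0.010;                  /* exponential decay constant, s          */

    /* Play a 50 ms decaying sinusoid when contact is first detected. */
    for (int n = 0; n < 50; ++n) {
        double t = n / RATE_HZ;
        vc_write(amp * exp(-t / tau) * sin(2.0 * PI * f0 * t));
    }
    return 0;
}
```

In the data-driven approach described above, texture vibrations would be synthesized the same way, except that the waveform comes from models built from recorded tool-surface interactions rather than a fixed sinusoid.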
After many years of research by my PhD students Heather Culbertson and Joe Romano, we’ve been able to create the most realistic haptic surfaces in the world. My work in haptography is motivated by a belief that there are myriad applications for highly realistic haptic virtual surfaces.
One exciting use is in recording what doctors and other clinical practitioners feel as they use various tools to care for their patients, such as inserting an epidural needle or examining teeth for decay (more on this below). Haptography would enable us to accurately simulate those interactions so that trainees can practice critical perceptual-motor skills on a computer model instead of on a human patient.
Another application that excites us is adding tactile feedback to online shopping. We’d love to use our technology to let consumers feel the fabrics and surfaces of products they’re considering without having to visit a physical store. Touch-mediated interaction plays an important role in many facets of human life; I hope that my team’s work on haptography will help bring highly realistic touch feedback into the digital domain.
CIRCUIT CELLAR: Which of the Penn Haptics Group’s projects most interest you at this time?
KATHERINE: That’s a hard question! I’m excited about all of the projects we are pursuing. There are a few I can’t talk about, because we’re planning to patent the underlying technology once we confirm that it works as well as we think it does. Two projects that are in the public domain have been fascinating me recently.

Tactile Teleoperation: My lab shares a Willow Garage PR2 (Personal Robot 2) humanoid robot with several of the other faculty in Penn’s GRASP Lab. Our PR2’s name is Graspy.
Caption: This wearable device allows the user to control the motion of the PR2 robot’s hand and also feel what the PR2 is feeling. The haptic feedback is delivered via a geared DC motor and two voice-coil actuators.
While we’ve done lots of fun research to enable this robot to autonomously pick up and set down unknown objects, I’d always dreamed of having a great system for controlling Graspy from a distance. Instead of making the operator use a joystick or a keyboard, we wanted to let him or her control Graspy using natural hand motions and also feel what Graspy was feeling during interactions with objects.
My PhD student Rebecca Pierce recently led the development of a wearable device that accomplishes exactly this goal. It uses a geared DC motor with an optical encoder to actuate and sense a revolute joint that is aligned with the base joint of the operator’s index finger. Opening and closing your hand opens and closes the robot’s parallel-jaw gripper, and the motor resists the motion of your hand if the robot grabs onto something. We supplement this kinesthetic haptic feedback with tactile feedback delivered to the pads of the user’s index finger and thumb. A voice-coil actuator mounted in each location moves a platform into and out of contact with the finger to match what the robot’s tactile sensors detect. Each voice coil presses with a force proportional to what the corresponding robot finger is feeling, and the voice coils also transmit the high-frequency vibrations (typically caused by collisions) that are sensed by the MEMS-based accelerometer embedded in the robot’s hand. We track the movement of this wearable device using a Vicon optical motion-tracking system, and Graspy follows the movements of the operator in real time. The operator sees a video feed of the interaction taking place. We’re now having human participants test this teleoperation setup, and I’m really excited to learn how the haptic feedback affects the operator’s ability to control the robot.
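Conceptually, each cycle of the device’s control loop couples the operator’s hand to Graspy’s gripper in both directions. The sketch below illustrates that coupling; every sensor and actuator call is a hypothetical stub standing in for the real hardware, and the gains are illustrative rather than the lab’s actual values.

```c
/*
 * Conceptual sketch of the grip-coupling loop described above.
 * All sensor/actuator calls are hypothetical stubs; gains are
 * illustrative, not the lab's tuned values.
 */
#include <stdio.h>

/* --- hypothetical hardware stubs ------------------------------------ */
static double enc_read_finger_angle(void)    { return 0.60; } /* rad, operator's finger */
static double robot_grip_force(void)         { return 2.00; } /* N, gripper load        */
static double robot_fingertip_pressure(void) { return 1.20; } /* N, tactile pad         */
static double robot_accel_highfreq(void)     { return 0.05; } /* g, band-passed accel   */
static void robot_cmd_gripper(double a) { printf("gripper aperture: %.2f rad\n", a); }
static void motor_cmd_torque(double t)  { printf("finger motor:     %.3f N*m\n", t); }
static void voicecoil_cmd(double f)     { printf("voice coil:       %.3f N\n", f); }

int main(void)
{
    const double K_GRIP = 0.05; /* N*m of finger resistance per N of grip force */
    const double K_TACT = 0.50; /* voice-coil N per N sensed at the fingertip   */
    const double K_VIB  = 2.00; /* voice-coil N per g of high-freq vibration    */

    /* One cycle of the coupling loop (normally repeated at ~1 kHz).    */
    robot_cmd_gripper(enc_read_finger_angle());         /* hand motion -> gripper   */
    motor_cmd_torque(K_GRIP * robot_grip_force());      /* grasp resists the finger */
    voicecoil_cmd(K_TACT * robot_fingertip_pressure()   /* steady contact pressure  */
                + K_VIB  * robot_accel_highfreq());     /* plus collision vibration */
    return 0;
}
```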
Caption: The high-bandwidth MEMS-based accelerometer records the sensations a dentist feels as she probes an extracted human tooth. Feeling these recordings lets dental trainees practice diagnosing dental decay before they treat live patients.
CIRCUIT CELLAR: In your TEDYouth talk, you describe a project in which a dental tool is fitted with an accelerometer to record what a dentist feels and then replay it back for a dental student. Can you tell us a bit about the project?
KATHERINE: This project spun out of my haptography research, which I described above. While we were learning to record and model haptic data from interactions between tools and objects, we realized that the original recordings had value on their own, even before we distilled them into a virtual model of what the person was touching. One day I gave a lab tour to two faculty members from the Penn School of Dental Medicine who were interested in new technologies. I hit it off with Dr. Margrit Maggio, who had great experience in teaching general dentistry skills to dental students. She explained that some dental students really struggled to master some of the tactile judgments needed to practice dentistry, particularly in discerning whether or not a tooth surface is decayed (in popular parlance, whether it has a cavity). A few students and I went over to her lab to test whether our accelerometer-based technology could capture the subtle details of how decayed vs. healthy tooth tissue feels. While the recordings are a little creepy to feel, they are super accurate. We refined our approach and conducted several studies on the potential of this technology to be used in training dental students. The results were really encouraging, once again showing the potential that haptic technology holds for improving clinical training.
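At its core, the system is a high-bandwidth record-and-replay pipeline: capture the probe’s acceleration while an expert works, then drive a voice coil with a scaled copy so a trainee feels the same vibrations. The sketch below shows that idea at its simplest; the sample rate, gain, and I/O calls are assumed for illustration.

```c
/*
 * Sketch of the record-and-replay idea behind the dental trainer.
 * The ADC/actuator calls, sample rate, and gain are illustrative
 * assumptions, not the actual system's parameters.
 */
#include <stdio.h>

#define FS_HZ  10000                /* sample rate, Hz (assumed)  */
#define N_SAMP (FS_HZ * 2)          /* a two-second recording     */

static double adc_read_accel(void) { return 0.0; }  /* hypothetical MEMS accelerometer read */
static void vc_write(double f)     { (void)f;     } /* hypothetical voice-coil output       */

static double recording[N_SAMP];

int main(void)
{
    /* Record: capture acceleration while the dentist probes the tooth. */
    for (int n = 0; n < N_SAMP; ++n)
        recording[n] = adc_read_accel();

    /* Replay: scale each acceleration sample into a voice-coil force.  */
    const double gain = 0.02;       /* N per m/s^2, tuned by feel (assumed) */
    for (int n = 0; n < N_SAMP; ++n)
        vc_write(gain * recording[n]);

    return 0;
}
```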
CIRCUIT CELLAR: What is the “next big thing” in the field of haptics? Is there a specific area or technology that you think will be a game changer?
KATHERINE: Of course this depends on where you’re looking. While cell phones and game controllers have had vibration alerts for a long time, I think we’re just starting to see high-quality haptic feedback emerge in consumer products. Haptics can definitely improve the user experience, which will give haptic products a market advantage, but their cost and implementation complexity need to be low enough to keep the product competitive. On the research side, I’m seeing a big move toward tactile feedback and wearable devices. Luckily there are enough interesting open research questions to keep my students and me busy for 30 more years, if not longer!
The complete interview appears in Circuit Cellar 296 (March 2015).
Circuit Cellar's editorial team comprises professional engineers, technical editors, and digital media specialists. You can reach the Editorial Department at editorial@circuitcellar.com, @circuitcellar, and facebook.com/circuitcellar