Editor's Letter Insights

How Does “It” Feel?

Written by Jeff Child

Don’t worry. I’m not trying to hurl lyrics from classic Bob Dylan songs at you. Recently, while researching my feature this month on technologies for robotics applications, I asked some of the companies I spoke with about the state of the art in a robot’s ability to sense what it’s touching, in other words, to “feel” things the way human skin does. I was pointed to some recent research announced by Intel.

 In mid-July Intel announced that two researchers from the National University of Singapore (NUS), who are members of the Intel Neuromorphic Research Community (INRC), presented new findings demonstrating the promise of event-based vision and touch sensing in combination with Intel’s neuromorphic processing for robotics. According to Intel, the work highlights how bringing a sense of touch to robotics can significantly improve capabilities and functionality compared to today’s visual-only systems. It also demonstrates that neuromorphic processors can outperform traditional architectures when processing such sensory data.

 Intel launched the INRC in 2018 as an ecosystem of academic groups, government labs, research institutions and companies around the world working with Intel to further neuromorphic computing and develop AI applications. Neuromorphic systems replicate the way neurons are organized, communicate and learn at the hardware level. The group’s goal is to apply the latest insights from neuroscience to create chips that function less like traditional computers and more like the human brain.

 Most of today’s robots operate solely on visual processing. Intel says that researchers at NUS hope to change this using their recently developed artificial skin, which according to their research can detect touch more than 1,000 times faster than the human sensory nervous system and identify the shape, texture and hardness of objects 10 times faster than the blink of an eye.

 Implementing a human-like sense of touch in robotics could significantly improve current functionality and even lead to new use cases. For example, robotic arms fitted with artificial skin could easily adapt to changes in goods manufactured in a factory, using tactile sensing to identify and grip unfamiliar objects with the right amount of pressure to prevent slipping. The ability to feel and better perceive surroundings could also allow for closer and safer human-robotic interaction, such as in caregiving professions, or bring us closer to automating surgical tasks by giving surgical robots the sense of touch that they lack today.

Making all of this happen requires a chip that can draw accurate conclusions based on the skin’s sensory data in real time, while operating at a power level efficient enough to be deployed directly inside the robot. The team at NUS began exploring the potential of neuromorphic technology to process sensory data from the artificial skin using Intel’s Loihi neuromorphic research chip. In their initial experiment, the researchers used a robotic hand fitted with the artificial skin to read Braille, passing the tactile data to Loihi through the cloud to convert the micro bumps felt by the hand into semantic meaning.
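
To give a rough flavor of what that kind of event-driven tactile processing involves, here is a minimal Python sketch of a single layer of leaky integrate-and-fire neurons classifying spike events from a skin sensor. This is not the NUS/Intel implementation: the taxel count, time constant, threshold and weights below are illustrative assumptions, and a real Loihi network would be trained rather than initialized at random.

# Minimal sketch (not the NUS/Intel implementation): classifying a stream of
# tactile "events" with one layer of leaky integrate-and-fire neurons.
# Taxel grid size, time constants and weights are illustrative assumptions only.
import numpy as np

N_TAXELS = 64        # hypothetical 8x8 artificial-skin patch
N_CLASSES = 26       # e.g., one output neuron per Braille letter
TAU = 20.0           # membrane time constant in ms (assumed value)
THRESHOLD = 1.0      # firing threshold (assumed value)

rng = np.random.default_rng(0)
weights = rng.normal(0.0, 0.3, size=(N_CLASSES, N_TAXELS))  # stand-in for trained weights

def classify_tactile_events(events, duration_ms=100):
    """events: list of (time_ms, taxel_index) spikes from the skin sensor.
    Returns the index of the output neuron that fired most often."""
    membrane = np.zeros(N_CLASSES)               # membrane potentials
    spike_counts = np.zeros(N_CLASSES, dtype=int) # output spike tally
    event_iter = iter(sorted(events))
    next_event = next(event_iter, None)
    for t in range(duration_ms):
        membrane *= np.exp(-1.0 / TAU)           # passive leak each 1-ms step
        # integrate all input spikes that arrive in this time step
        while next_event is not None and next_event[0] <= t:
            _, taxel = next_event
            membrane += weights[:, taxel]
            next_event = next(event_iter, None)
        fired = membrane >= THRESHOLD            # neurons crossing threshold spike...
        spike_counts += fired
        membrane[fired] = 0.0                    # ...and reset
    return int(np.argmax(spike_counts))

# Toy usage: a burst of events on a few taxels stands in for one Braille cell.
demo_events = [(t, taxel) for t in range(0, 100, 5) for taxel in (3, 10, 17)]
print("predicted class index:", classify_tactile_events(demo_events))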

Loihi achieved over 92% accuracy in classifying the Braille letters, while using 20 times less power than a standard von Neumann processor. Building on this, the NUS team further improved robotic perception capabilities by combining both vision and touch data in a spiking neural network. To do so, they tasked a robot to classify various opaque containers holding differing amounts of liquid using sensory inputs from the artificial skin and an event-based camera.
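
To give a concrete (if greatly simplified) picture of what combining vision and touch can look like, here is a small Python sketch of early fusion: spike counts from a hypothetical event camera and a hypothetical skin patch are concatenated into one feature vector and scored by a linear readout. The channel counts, time window and weights are placeholder assumptions; the actual NUS work uses a trained spiking neural network rather than this toy readout.

# Simplified sketch of the multimodal idea (not the actual NUS spiking network):
# spike counts from an event camera and from the artificial skin are concatenated
# into one feature vector and scored by a linear readout. All sizes and weights
# below are illustrative placeholders.
import numpy as np

N_VISION_CH = 128   # hypothetical event-camera channels (e.g., pooled pixel regions)
N_TOUCH_CH = 64     # hypothetical taxel channels on the artificial skin
N_CLASSES = 8       # e.g., container/fill-level categories

rng = np.random.default_rng(1)
readout = rng.normal(0.0, 0.1, size=(N_CLASSES, N_VISION_CH + N_TOUCH_CH))  # stand-in for trained weights

def fuse_and_classify(vision_events, touch_events, window_ms=200):
    """Each argument is a list of (time_ms, channel_index) events.
    Returns the predicted class from the fused spike-count features."""
    vis_counts = np.zeros(N_VISION_CH)
    tac_counts = np.zeros(N_TOUCH_CH)
    for t, ch in vision_events:
        if t < window_ms:
            vis_counts[ch] += 1
    for t, ch in touch_events:
        if t < window_ms:
            tac_counts[ch] += 1
    features = np.concatenate([vis_counts, tac_counts])  # simple early fusion
    return int(np.argmax(readout @ features))

# Toy usage with synthetic event streams standing in for real sensor data.
vision = [(int(t), int(c)) for t, c in zip(rng.integers(0, 200, 300), rng.integers(0, N_VISION_CH, 300))]
touch = [(int(t), int(c)) for t, c in zip(rng.integers(0, 200, 120), rng.integers(0, N_TOUCH_CH, 120))]
print("predicted class:", fuse_and_classify(vision, touch))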

Once this sensory data was captured, the team sent it to both a GPU and Intel’s Loihi neuromorphic research chip to compare processing capabilities. The results, which were presented at Robotics: Science and Systems in July, show that combining event-based vision and touch using a spiking neural network enabled 10% greater accuracy in object classification compared to a vision-only system. They also demonstrated the promise of neuromorphic technology to power such robotic devices, with Loihi processing the sensory data 21% faster than a top-performing GPU, while using 45 times less power.

As robots gain the ability to “feel” more like humans can, I guess we’ll need to start asking: “How does he/she feel?”

PUBLISHED IN CIRCUIT CELLAR MAGAZINE • SEPTEMBER 2020 • #362



Editor-in-Chief at Circuit Cellar

Jeff Child has more than 28 years of experience in the technology magazine business, including editing and writing technical content and engaging in all aspects of magazine leadership and production. He joined Circuit Cellar after serving as Editor-in-Chief of COTS Journal for over 10 years. Over his career, Jeff has held senior editorial positions at several leading electronic engineering publications, including EE Times, Electronic Design, and RTC Magazine. Before entering the world of technology journalism, Jeff worked as a design engineer in the data acquisition market.