Researchers present new findings demonstrating the promise of event-based vision and touch sensing in combination with Intel’s neuromorphic processing for robotics.
Today, most robots operate solely on visual processing and lack the tactile sensitivity to distinguish between surfaces. A team of researchers from the National University of Singapore (NUS), members of the Intel Neuromorphic Research Community (INRC), has now renewed hope that robots will be able to detect touch faster than the human sensory nervous system. Their recently developed artificial skin can also identify the shape, texture, and hardness of objects 10 times faster than the human senses.
The NUS team began exploring the potential of neuromorphic technology to process sensory data from the artificial skin using Intel’s Loihi neuromorphic research chip. In their initial experiment, the researchers fitted a robotic hand with the artificial skin and used it to read Braille, passing the tactile data to Loihi through the cloud, where the micro bumps felt by the hand were converted into semantic meaning. The experiment achieved 92 percent accuracy in classifying Braille letters while using 20 times less power than a standard von Neumann processor.
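To make the event-driven idea concrete, the sketch below shows a toy tactile classifier built from leaky integrate-and-fire neurons: the skin emits spikes only when pressure changes, and output neurons accumulate those events until one class dominates. This is a minimal illustration in plain NumPy, not the NUS system or the Loihi toolchain; the taxel count, time constants, and the random weights standing in for a trained network are all assumptions.

```python
# Illustrative sketch only: a toy event-driven tactile classifier using
# leaky integrate-and-fire (LIF) neurons. NOT the NUS pipeline or the
# Loihi API; all sizes, constants, and weights below are assumptions.
import numpy as np

rng = np.random.default_rng(0)

NUM_TAXELS = 64          # assumed number of skin sensing elements
NUM_CLASSES = 26         # one output neuron per Braille letter
TAU = 20.0               # membrane time constant (ms), assumed
THRESHOLD = 1.0          # spike threshold, assumed
DT = 1.0                 # simulation step (ms)

# Stand-in for trained synaptic weights (taxel -> class neuron).
weights = rng.normal(0.0, 0.3, size=(NUM_CLASSES, NUM_TAXELS))

def classify(events, duration_ms=200):
    """Classify a stream of tactile events.

    events: list of (time_ms, taxel_index) spikes emitted by the skin
    whenever local pressure changes, rather than dense sensor frames.
    Returns the index of the class neuron that fired most often.
    """
    v = np.zeros(NUM_CLASSES)            # membrane potentials
    spike_counts = np.zeros(NUM_CLASSES)
    by_step = {}
    for t, taxel in events:
        by_step.setdefault(int(t // DT), []).append(taxel)

    for step in range(int(duration_ms / DT)):
        v *= np.exp(-DT / TAU)           # passive membrane leak
        for taxel in by_step.get(step, []):
            v += weights[:, taxel]       # integrate the incoming event
        fired = v >= THRESHOLD
        spike_counts += fired
        v[fired] = 0.0                   # reset neurons that spiked
    return int(np.argmax(spike_counts))

# Example: a burst of events on random taxels, as a touch might produce.
demo_events = [(rng.uniform(0, 200), rng.integers(0, NUM_TAXELS))
               for _ in range(300)]
print("predicted class:", classify(demo_events))
```

Because computation happens only when events arrive, a sparse input stream translates directly into less work per classification, which is the intuition behind the power savings reported above.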
“This research from the National University of Singapore provides a compelling glimpse to the future of robotics where information is both sensed and processed in an event-driven manner combining multiple modalities. The work adds to a growing body of results showing that neuromorphic computing can deliver significant gains in latency and power consumption once the entire system is re-engineered in an event-based paradigm spanning sensors, data formats, algorithms, and hardware architecture,” said Mike Davies, director of Intel’s Neuromorphic Computing Lab.
Building on this work, the team is seeking to further improve robotic perception by combining vision and touch data in a spiking neural network. Results presented at Robotics: Science and Systems demonstrated that combining event-based vision and touch in a spiking neural network enabled 10 percent greater accuracy in object classification compared with a vision-only system.
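As a rough illustration of what such multimodal fusion can look like, the sketch below reduces each modality’s event stream to spike counts over a time window and lets both populations drive the same set of class neurons, so a touch-driven class can win even when vision alone is ambiguous. The population sizes and the random weights standing in for trained synapses are assumptions; the published system uses a trained spiking network running on Loihi, not this toy readout.

```python
# Illustrative sketch only: late fusion of event-based vision and touch.
# Each event stream is reduced to per-neuron spike counts, and both
# feature vectors project onto shared class neurons. Sizes and random
# stand-in weights are assumptions, not the published network.
import numpy as np

rng = np.random.default_rng(1)
NUM_VISION, NUM_TOUCH, NUM_CLASSES = 128, 64, 10

w_vision = rng.normal(0, 0.1, (NUM_CLASSES, NUM_VISION))
w_touch = rng.normal(0, 0.1, (NUM_CLASSES, NUM_TOUCH))

def spike_counts(events, size):
    """events: iterable of (time_ms, neuron_index) -> count vector."""
    counts = np.zeros(size)
    for _, idx in events:
        counts[idx] += 1
    return counts

def classify(vision_events, touch_events=None):
    """Predict a class from vision events, optionally fused with touch."""
    drive = w_vision @ spike_counts(vision_events, NUM_VISION)
    if touch_events is not None:        # fused prediction
        drive += w_touch @ spike_counts(touch_events, NUM_TOUCH)
    return int(np.argmax(drive))

# Example: random event streams for both modalities.
vis = [(t, rng.integers(0, NUM_VISION)) for t in rng.uniform(0, 200, 500)]
tac = [(t, rng.integers(0, NUM_TOUCH)) for t in rng.uniform(0, 200, 200)]
print("vision-only:", classify(vis), " fused:", classify(vis, tac))
```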
They also demonstrated the promise of neuromorphic technology to power such robotic devices: Loihi processed the sensory data 21 percent faster than a top-performing GPU while using 45 times less power.
“We’re excited by these results. They show that a neuromorphic system is a promising piece of the puzzle for combining multiple sensors to improve robot perception. It’s a step toward building power-efficient and trustworthy robots that can respond quickly and appropriately in unexpected situations,” said Assistant Professor Harold Soh from the Department of Computer Science at the NUS School of Computing.
The work highlights how bringing a sense of touch to robotics can significantly improve capabilities and functionality compared with today’s vision-only systems, and how neuromorphic processors can outperform traditional architectures in processing such sensory data. [APBN]