Robots Now Have The Ability To Feel Things

July 19, 2020 | By Naveen Victor

Researchers at the National University of Singapore (NUS) are inching ever closer to developing robots with advanced touch perception. The team paired its newly developed artificial skin with Intel’s neuromorphic processor, Loihi, and says the skin can detect touch more than 1,000 times faster than the human sensory nervous system.

Giving robots a human-like sense of touch can open up new use cases. Robotic arms equipped with artificial skin could better judge the grip needed to hold an unfamiliar object on the manufacturing floor, applying just enough pressure to keep the object from being dropped or damaged.
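
As a rough illustration of the idea (not the team's published controller), the logic amounts to a feedback loop: increase grip force in small steps until the skin stops reporting slip. The sketch below is a minimal Python simulation in which `slip_detected` is a hypothetical stand-in for a real tactile-sensor reading.

```python
# Minimal sketch of tactile-feedback grip control (illustrative only; the
# NUS/Intel work does not publish this controller). The "skin" here is a
# stand-in function that reports slip until enough force is applied.

def slip_detected(force: float, required_force: float = 4.0) -> bool:
    """Hypothetical skin reading: the object slips while the grip is too weak."""
    return force < required_force

def grasp(initial_force: float = 1.0, step: float = 0.5, max_force: float = 20.0) -> float:
    """Ramp grip force up until slip stops, rather than squeezing at maximum."""
    force = initial_force
    while slip_detected(force) and force < max_force:
        force = min(force + step, max_force)  # tighten gradually, never crush
    return force

print(grasp())  # settles at 4.0 N for this simulated object
```

In practice the loop would also need timeouts and the gripper's own force limits, but the principle of tightening only as much as the skin demands is exactly what tactile feedback enables.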

Apart from this, Intel believes the newfound technology can make human-robot interaction safer, which would benefit caregiving professions and the automation of surgical tasks. It could also accelerate robotic advancement in fields that currently have to rely on visual sensors alone.

The Singapore team broke new ground in robotic perception by fitting its artificial skin to a robotic hand and using it to read Braille. The tactile data was sent to Loihi via the cloud, which converted the micro bumps into semantic meaning and classified the Braille letters with over 92 percent accuracy.
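
The article doesn't detail the classifier running on Loihi, but the underlying task is mapping a frame of skin pressure readings ("taxels") to a letter. The sketch below, which assumes made-up taxel data and uses a simple nearest-centroid classifier instead of the team's spiking neural network, shows the shape of that pipeline.

```python
# Illustrative Braille-letter classifier over tactile "taxel" frames.
# The real NUS/Intel system runs a spiking neural network on Loihi; this
# sketch only shows the classification problem on synthetic data.
import numpy as np

N_TAXELS = 64                  # assumed size of the skin's pressure-sensor grid
LETTERS = list("abcdefghij")   # subset of Braille letters, for illustration

rng = np.random.default_rng(0)

# Pretend training data: one mean pressure pattern per letter, plus sensor noise.
prototypes = {c: rng.random(N_TAXELS) for c in LETTERS}

def sample(c):
    return prototypes[c] + 0.05 * rng.standard_normal(N_TAXELS)

train_X = np.stack([sample(c) for c in LETTERS for _ in range(20)])
train_y = np.array([c for c in LETTERS for _ in range(20)])

# Nearest-centroid classification: average each letter's frames, then
# assign a new frame to the closest average.
centroids = {c: train_X[train_y == c].mean(axis=0) for c in LETTERS}

def classify(frame):
    return min(centroids, key=lambda c: np.linalg.norm(frame - centroids[c]))

print(classify(sample("d")))   # prints 'd' for this synthetic data
```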

To improve robotic perception further, the NUS team combined visual and tactile data in a spiking neural network. A robot fitted with the artificial skin and event-based cameras was tasked with classifying various opaque containers holding different amounts of liquid, a test of its ability to perceive and identify rotational slippage.
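
The fusion step is conceptually simple: encode each modality into features and feed the combined input to one classifier. The sketch below is a loose approximation with hypothetical feature extractors and dense vectors; the actual NUS system encodes both streams as spikes for a spiking neural network running on Loihi.

```python
# Sketch of visual-tactile fusion for slip/weight classification (illustrative).
# `vision_features` and `touch_features` stand in for real event-camera and
# skin encodings; the real system feeds spike trains into a spiking neural
# network rather than concatenating dense vectors.
import numpy as np

def vision_features(event_frame: np.ndarray) -> np.ndarray:
    """Crude encoding: per-region event counts from the event camera."""
    return event_frame.reshape(8, -1).sum(axis=1)

def touch_features(taxel_frame: np.ndarray) -> np.ndarray:
    """Crude encoding: summary statistics of the skin's pressure readings."""
    return np.array([taxel_frame.mean(), taxel_frame.std(), taxel_frame.max()])

def fused_input(event_frame: np.ndarray, taxel_frame: np.ndarray) -> np.ndarray:
    """Concatenate both modalities into one feature vector for a classifier."""
    return np.concatenate([vision_features(event_frame), touch_features(taxel_frame)])

# Example: a 64x64 event frame and a 64-taxel skin reading.
x = fused_input(np.random.poisson(0.1, (64, 64)), np.random.random(64))
print(x.shape)  # (11,) -- 8 visual features plus 3 tactile features
```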

The captured data was processed on both a GPU and Intel’s Loihi to compare their performance. Loihi completed the task 21 percent faster than the top-performing GPU while using 45 times less power. The test also showed that combining visual and touch sensing improved object classification accuracy by 10 percent over a vision-only system.

Once perfected, the technology could open up several interesting avenues, eventually giving robots more prominent roles in healthcare, bomb detection and construction. As the world continues to lean heavily on AI, it’s only a matter of time before robots begin to take over menial tasks across various sectors.