Robotics: Science and Systems XVI

Event-Driven Visual-Tactile Sensing and Learning for Robots

Tasbolat Taunyazov, Weicong Sng, Brian Lim, Hian Hian See, Jethro Kuan, Abdul Fatir Ansari, Benjamin Tee, Harold Soh

Abstract:

This work contributes an event-driven visual-tactile perception system, comprising a novel biologically inspired tactile sensor and multi-modal spike-based learning. Our neuromorphic fingertip tactile sensor, NeuTouch, scales well with the number of taxels thanks to its event-based nature. Likewise, our Visual-Tactile Spiking Neural Network (VT-SNN) enables fast perception when coupled with event sensors. We evaluate our visual-tactile system (using NeuTouch and a Prophesee event camera) on two robot tasks: container classification and rotational slip detection. On both tasks, we observe accuracies competitive with standard deep learning methods. We have made our visual-tactile datasets freely available to encourage research on multi-modal event-driven robot perception, which we believe is a promising approach towards intelligent, power-efficient robot systems.
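
The abstract describes a Visual-Tactile Spiking Neural Network (VT-SNN) that consumes spike streams from NeuTouch and an event camera. The sketch below is not the authors' implementation; it is a minimal, self-contained PyTorch illustration of this kind of multi-modal spiking model, in which each modality is encoded by its own spiking branch and the branch outputs are concatenated before a spiking readout. The hand-rolled leaky integrate-and-fire dynamics, the fusion-by-concatenation arrangement, and all layer sizes, time constants, and class counts are illustrative assumptions, not values taken from the paper.

import torch
import torch.nn as nn


class LIF(nn.Module):
    """Leaky integrate-and-fire layer: linear synapse + leaky membrane + hard threshold."""

    def __init__(self, in_features, out_features, tau=0.9, threshold=1.0):
        super().__init__()
        self.fc = nn.Linear(in_features, out_features)
        self.tau = tau              # membrane leak factor (assumed value)
        self.threshold = threshold  # firing threshold (assumed value)

    def forward(self, x):
        # x: (batch, time, in_features) spike tensor with entries in {0, 1}
        batch, steps, _ = x.shape
        v = x.new_zeros(batch, self.fc.out_features)   # membrane potential
        out = []
        for t in range(steps):
            v = self.tau * v + self.fc(x[:, t])        # leak, then integrate input current
            s = (v >= self.threshold).float()          # emit spikes where threshold is crossed
            v = v * (1.0 - s)                          # reset neurons that fired
            out.append(s)
        return torch.stack(out, dim=1)                 # (batch, time, out_features)


class VTSNNSketch(nn.Module):
    """Two spiking branches (tactile, visual) fused by concatenation, then a spiking readout."""

    def __init__(self, tactile_dim=78, visual_dim=1024, hidden=64, n_classes=20):
        super().__init__()
        self.tactile = LIF(tactile_dim, hidden)
        self.visual = LIF(visual_dim, hidden)
        self.readout = LIF(2 * hidden, n_classes)

    def forward(self, tactile_spikes, visual_spikes):
        fused = torch.cat([self.tactile(tactile_spikes),
                           self.visual(visual_spikes)], dim=-1)
        out = self.readout(fused)           # (batch, time, n_classes)
        return out.sum(dim=1)               # spike counts per class over the time window


if __name__ == "__main__":
    # Random binary spike trains standing in for real NeuTouch / event-camera data.
    tactile = (torch.rand(4, 100, 78) < 0.05).float()
    visual = (torch.rand(4, 100, 1024) < 0.05).float()
    counts = VTSNNSketch()(tactile, visual)
    print(counts.shape)                     # torch.Size([4, 20])
    print(counts.argmax(dim=-1))            # predicted class per example

Note that the hard spike threshold above is non-differentiable, so actually training such a network requires a surrogate gradient (or a spiking-network framework that provides one); the spike-count readout here only illustrates rate-based classification at inference time.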

Bibtex:

@INPROCEEDINGS{Taunyazov-RSS-20, 
    AUTHOR    = {Tasbolat Taunyazov AND Weicong Sng AND Brian Lim AND Hian Hian See AND Jethro Kuan AND Abdul Fatir Ansari AND Benjamin Tee AND Harold Soh}, 
    TITLE     = {{Event-Driven Visual-Tactile Sensing and Learning for Robots}}, 
    BOOKTITLE = {Proceedings of Robotics: Science and Systems}, 
    YEAR      = {2020}, 
    ADDRESS   = {Corvallis, Oregon, USA}, 
    MONTH     = {July}, 
    DOI       = {10.15607/RSS.2020.XVI.020} 
}