Published January 1, 2018 | Version v1
Conference paper (Open Access)

Failure Detection Using Proprioceptive, Auditory and Visual Modalities

  • Istanbul Technical University, Faculty of Computer and Informatics Engineering, Maslak, Turkey

Description

Handling safety is crucial to achieving lifelong autonomy for robots. Unsafe situations may arise during manipulation in unstructured environments due to noise in sensory feedback, improper action parameters, hardware limitations, or external factors. To ensure safety, continuous execution monitoring and failure detection procedures are mandatory. To this end, we present a multimodal failure monitoring and detection system for detecting manipulation failures. Rather than relying on a single sensor modality, we integrate different modalities to achieve better detection performance across different failure cases. In our system, high-level proprioceptive, auditory, and visual predicates are extracted by processing each modality separately; the extracted predicates are then fused. Experiments with a humanoid robot in tabletop manipulation scenarios indicate that the contribution of each modality differs depending on the action in execution, and that multimodal fusion yields an overall increase in failure detection performance compared to unimodal processing.
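The abstract's pipeline (per-modality predicate extraction followed by fusion) can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the predicate names, thresholds, and the simple any-predicate fusion rule are all assumptions made here for clarity.

```python
# Hypothetical sketch of the late-fusion idea from the abstract:
# each modality is processed separately into high-level binary
# predicates, and the predicates are then fused into a single
# failure decision. All names and thresholds are illustrative.

def extract_predicates(proprio, audio, vision):
    """Map raw unimodal observations to high-level binary predicates."""
    return {
        "gripper_slip": proprio["grip_force"] < 0.5,      # assumed threshold
        "impact_sound": audio["peak_db"] > 70.0,          # assumed threshold
        "object_displaced": vision["displacement"] > 0.05,  # assumed, meters
    }

def fuse(predicates):
    """Naive fusion rule: report failure if any predicate fires.
    A learned classifier over the predicate vector would be a
    natural alternative."""
    return any(predicates.values())

if __name__ == "__main__":
    preds = extract_predicates(
        proprio={"grip_force": 0.2},     # weak grip -> slip predicate fires
        audio={"peak_db": 65.0},
        vision={"displacement": 0.01},
    )
    print(fuse(preds))
```

In this toy example only the proprioceptive predicate fires, yet the fused decision still flags a failure, illustrating how modalities that are uninformative for a given action can be compensated for by others.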

Files

bib-8a50345f-7471-495e-bd7a-a8ae36ccac16.txt
