ICRA 2019 Montreal

ICRA 2019 ViTac Workshop: Integrating Vision and Touch for Multimodal and Cross-modal Perception

Abstract: Animals interact with the world through multimodal sensory inputs, most notably vision and touch in the case of humans. In contrast, artificial systems usually rely on a single sensing modality, with distinct hardware and algorithmic approaches developed for each modality, e.g. computer vision and tactile robotics. Future robots, as embodied agents, should make the best use of all available sensing modalities to interact with the environment. Over the last few years, there have been advances in fusing information from distinct modalities and in selecting the most appropriate modality for achieving a goal, e.g. grasping or manipulating an object. Furthermore, there has been a recent acceleration in the development of camera-based optical tactile sensors, such as the GelSight and the TacTip, which bridge the gap between vision and tactile sensing and enable cross-modal perception. This workshop will cover recent progress in combining vision and touch, from the perspective of how touch complements vision to achieve better robot perception, exploration, learning and interaction with humans. This full-day workshop aims to foster active collaboration and discussion of methods for fusing vision and touch, the challenges of multimodal and cross-modal sensing, the development of optical tactile sensors, and their applications.

Organisers:
Shan Luo, University of Liverpool
Nathan Lepora, University of Bristol
Uriel Martinez Hernandez, University of Bath
João Bimbo, Istituto Italiano di Tecnologia (IIT)
Huaping Liu, Tsinghua University

Invited speakers:
● Edward Adelson (MIT)
● Peter Allen (Columbia University)
● Alberto Rodriguez (MIT)
● Oliver Kroemer (CMU)
● Lorenzo Natale (IIT)
● Vincent Hayward (UPMC Univ Paris)
● Hongbin Liu (King’s College London)
● Van Ho (Japan Advanced Institute of Science and Technology)

Accepted papers

  1. Giovanni Sutanto, Balakumar Sundaralingam, Yevgen Chebotar, Zhe Su, Ankur Handa, Nathan Ratliff and Dieter Fox. "Learning Latent Space Dynamics for Tactile Servoing"
  2. Alexander C. Abad and Anuradha Ranasinghe. "Pilot Study: Low Cost GelSight Sensor"
  3. Oliver Struckmeier, Kshitij Tiwari, Martin J. Pearson and Ville Kyrki. "ViTa-SLAM: Biologically-Inspired Visuo-Tactile SLAM"
  4. Maria Bauza, Oleguer Canal and Alberto Rodriguez. "Tactile Mapping and Localization from High-Resolution Tactile Imprints"
  5. Hongzhuo Liang, Shuang Li, Xiaojian Ma, Norman Hendrich, Timo Gerkmann, Jianwei Zhang. "Making Sense of Audio Vibration for Liquid Height Estimation in Robotic Pouring"
  6. Teng Xue, Wenhai Liu, Mingshuo Han, Zhenyu Pan, Jin Ma, Quanquan Shao, Weiming Wang. "Bayesian Grasp: Vision-based robotic stable grasp via prior tactile knowledge learning"
  7. Gyan Tatiya and Jivko Sinapov. "Sensorimotor Cross-Perception Knowledge Transfer for Grounded Category Recognition"
  8. Daniel Fernandes Gomes, Achu Wilson and Shan Luo. "GelSight Simulation for Sim2Real Learning"
  9. J. Monteiro, H. Araujo, M. Tavakoli, Amilcar Ramalho. "A Novel Sensor to Measure Surface Deformation and Contact Shape Using Stereo Vision"
  10. L.N. Vishnunandan Venkatesh, Jyothsna Padmakumar Bindu and Richard M Voyles. "Functional Inspection Using Tactile Perception during Manipulation of Deformable Objects"
  11. Gustavo Goretkin, Leslie Pack Kaelbling, Tomas Lozano-Perez. "Motion planning with visual and tactile sensing for safety in uncertain environments"
  12. Radhen Patel, Jacob Segil and Nikolaus Correll. "Reactive control of a robot hand equipped with visual-haptic sensor for pre-grasp shaping and gentle touch"
  13. Timo Korthals, Andrew Melnik, Jürgen Leitner and Marc Hesse. "Multisensory Assisted In-hand Manipulation of Objects with a Dexterous Hand"

We thank the following IEEE RAS Technical Committees for their support: