ICRA 2023 ViTac Workshop: Blending Virtual and Real Visuo-Tactile Perception

intro image here
ICRA 2023 London

This is the official website for “ViTac 2023: Blending Virtual and Real Visuo-Tactile Perception”.

Abstract: This is the fourth ViTac workshop we have organised at ICRA, following the ICRA 2019, 2020 and 2021 ViTac Workshops, and the first to be held in a hybrid format. The past years have witnessed the rapid development of simulation models for optical tactile sensors and of Sim2Real and Sim2Real2Sim learning for visuo-tactile perception. It is therefore timely to bring together experts and young researchers in the field to discuss how to blend virtual and real visuo-tactile perception. This full-day workshop will cover recent advances in visuo-tactile sensing and perception, with the aim of bridging the gap between simulation and the real world for optical tactile sensing and for robot perception with vision and touch. It will further foster active collaboration and address the challenges of this important topic and its applications.

Time: 9:00-18:00 (London time) Friday 2nd June 2023 (access to the room starts from 8:00)
Location: ICC Capital Suite 6, ExCeL London, 1 Western Gateway, London E16 1XL

Hybrid format: All presentations will be streamed to our online participants through the InfoVaya Conference App and hosted on Zoom; click this link to access the Zoom meeting. Virtual attendees will be able to observe the sessions and submit questions through the Zoom chat function. One of the organisers will monitor the questions from remote participants and field them in the workshop room. Prospective questions and discussion topics for the group discussion will be collected throughout the day from both in-person and online participants. Posters will be displayed all day and will also be accessible in digital format for those attending online.

News: We are seeking top-quality original articles for a Special Issue on Tactile Robotics that we are organising in the IEEE Transactions on Robotics.

The workshop will be hybrid; please follow our ICRA ViTac Workshop YouTube channel, a legacy of the ICRA 2020 and 2021 ViTac workshops, for recordings of the ViTac workshops.

News: The recording of the workshop is available at this link.

group image here
ICRA 2023 London

Organisers:
Shan Luo, King’s College London
Nathan Lepora, University of Bristol
Wenzhen Yuan, Carnegie Mellon University
Kaspar Althoefer, Queen Mary University of London
Gordon Cheng, Technische Universität München

Invited speakers:
Ted Adelson, MIT
Dieter Fox, Nvidia / University of Washington
Ravinder Dahiya, Northeastern University
Roberto Calandra, TU Dresden
Michael Yu Wang, Monash University
Katherine Kuchenbecker, Max Planck Institute for Intelligent Systems
Lucia Beccai, Istituto Italiano di Tecnologia
Rich Walker, Shadow Robot Company

Key dates
Posters and live demonstrations will be selected from a call for extended abstracts, reviewed by the organisers. Authors of the best posters will be invited to give talks at the workshop. All submissions will be reviewed using a single-blind review process. Accepted contributions will be presented during the workshop as posters. Submissions must be sent as a PDF, following the two-column IEEE conference style, to shan.luo@kcl.ac.uk, indicating [ICRA 2023 Workshop] in the email subject.
● Submission deadline: 30th March 2023 (extended from 15th March 2023)
● Notification of acceptance: 10th April, 2023
● Camera-ready deadline: 20th April, 2023
● Workshop day: 2nd June 2023

Topics of Interest:
● Development of tactile sensors for robot tasks
● Simulation of tactile sensors
● Sim2Real and Sim2Real2Sim learning for visuo-tactile sensing and perception
● The control and design of robotic visuo-tactile systems for human-robot interaction
● Interactive visuo-tactile perception in robot grasping and manipulation
● Cognitive control of movement and sensory-motor representations with vision and touch
● Bio-inspired approaches and cognitive architectures for visuo-tactile perception
● Computational methods for processing vision and touch data in robot learning

Program (subject to minor changes)
09:00 Organisers' welcome

Session 1: Development of touch sensors
09:20 Gordon Cheng: Session Intro
09:30 Ravinder Dahiya: Invited talk “Electronic skin with printed temperature and light detection sensors”
09:50 Lucia Beccai: Invited talk “The elephant trunk as a new model of tactile perception”
10:10 Poster session 1: Lightning poster presentations (2 mins each)
Poster 1: Eric T. Chang, Peter Ballentine, Ioannis Kymissis, Matei Ciocarlie. Towards Development of a Signal-Dense Multimodal Tactile Finger [PDF]
Poster 2: Max Yang, Yijiong Lin, Alex Church, John Lloyd, Dandan Zhang, David A.W. Barton, Nathan F. Lepora. Generalizable and Robust Tactile Pushing using Sim-to-Real Deep Reinforcement Learning [PDF]
Poster 3: Saekwang Nam, Loong Yi Lee, Max Yang, Naoki Shitanda, Jonathan Rossiter and Nathan Lepora. Vision and Tactile Pose Identification for Picking a Target without Collision [PDF]
Poster 4: Christopher J. Ford, Haoran Li, John Lloyd, Manuel G. Catalano, Matteo Bianchi, Efi Psomopoulou, Nathan F. Lepora. Tactile-Driven Gentle Grasping for Human-Robot Collaborative Tasks [PDF]
Poster 5: Xingshuo Jing, Yongqiang Zhao, Jiaqi Jiang, Boyi Duan, Kun Qian, Shan Luo. Unsupervised Adversarial Domain Adaptation for Sim-to-Real Transfer of Tactile Manipulation Skills [PDF]
Poster 6: Mauro Comi, Alex Church, Kejie Li, Laurence Aitchison, Nathan F. Lepora. Implicit Neural Representation for 3D Shape Reconstruction Using Vision-Based Tactile Sensing [PDF]
Poster 7: Julio Castaño-Amorós and Pablo Gil. Rotational Slippage Prediction from Segmentation of Tactile Images [PDF]
10:30 Coffee break, posters and demos

Session 2: Simulations of tactile sensors
11:30 Shan Luo: Session Intro & Talk “Visuo-Tactile Perception in the Real and Virtual Worlds”
11:50 Michael Yu Wang: Invited talk “Touch and Sense for Learning Dexterous Manipulation”
12:10 Poster session 2: Lightning poster presentations (2 mins each)
Poster 8: Zixi Chen, Shixin Zhang, Yuhao Sun, Shan Luo, Fuchun Sun, and Bin Fang. Plasticine Manipulation Simulation with Optical Tactile Sensing [PDF]
Poster 9: Yijiong Lin, Mauro Comi, Alex Church, Dandan Zhang, Nathan F. Lepora. Attention of Robot Touch: Tactile Saliency Prediction for Robust Sim-to-Real Tactile Control [PDF]
Poster 10: Daniel Gomes and Paolo Paoletti and Shan Luo. Simulation of Curved GelSight Sensors for Sim2Real Learning [PDF]
Poster 11: Guanqun Cao, Jiaqi Jiang, Ningtao Mao, Danushka Bollegala, Min Li, and Shan Luo. Vis2Hap: Vision-based Haptic Rendering by Cross-modal Generation [PDF]
Poster 12: Jiaqi Jiang, Guanqun Cao, Aaron Butterworth, Thanh-Toan Do, and Shan Luo. Vision-Guided Tactile Poking for Transparent Object Grasping [PDF]
Poster 13: Abu Bakar Dawood, Brice Denoun, and Kaspar Althoefer. Optical Tomography-based Soft Sensor Skin [PDF]
Poster 14: Carolina Higuera, Byron Boots, and Mustafa Mukadam. Learning to Read Braille: Bridging the Tactile Reality Gap with Diffusion Models [PDF]

12:30 Lunch break, posters and demos

Session 3: Sim2Real learning for visuo-tactile perception
14:00 Nathan Lepora: Session Intro & Talk “Progress in real, simulated and sim2real optical tactile sensing”
14:20 Katherine Kuchenbecker: Invited talk “Minsight: A Fingertip-Sized Vision-Based Tactile Sensor for Robotic Manipulation”
14:40 Roberto Calandra: Invited talk “What’s next for Vision-based Tactile Sensors?”
15:00 Ted Adelson: Invited talk “Advances in Camera Based Tactile Sensing” Abstract: Camera based tactile sensors such as GelSight are becoming increasingly popular. The classic boxy design is limiting, especially when integrating the sensor into a gripper intended for grasping and manipulation. I will describe a variety of recent progress, including hemispherical and ellipsoidal sensors, finger-like sensors, fin-ray effect sensors, roller sensors, and articulated sensors. These new designs allow high resolution sensing of geometry and force, and in some cases also capture proprioceptive information.
15:20 Rich Walker: Talk & demo from Shadow Robot
15:40 Coffee break, posters and demos

Session 4: Challenges and Outlook
16:30 Wenzhen Yuan: Session Intro & Talk “Bridging simulated and the real-world tactile sensors: challenges and outlook”
16:50 Dieter Fox: Invited talk “Sim2Real for Contact Rich Tasks”
17:10 Group discussion with all speakers
18:00 Finish

Accepted papers

  1. Eric T. Chang, Peter Ballentine, Ioannis Kymissis, Matei Ciocarlie. "Towards Development of a Signal-Dense Multimodal Tactile Finger"
  2. Max Yang, Yijiong Lin, Alex Church, John Lloyd, Dandan Zhang, David A.W. Barton, Nathan F. Lepora, "Generalizing Object Pushing Using Touch: A Sim-to-Real Deep Reinforcement Learning Approach"
  3. Saekwang Nam, Loong Yi Lee, Max Yang, Naoki Shitanda, Jonathan Rossiter, Nathan Lepora, "Vision and Tactile Pose Identification for Picking a Target without Collision"
  4. Christopher J. Ford, Haoran Li, John Lloyd, Manuel G. Catalano, Matteo Bianchi, Efi Psomopoulou, Nathan F. Lepora, "Tactile-Driven Gentle Grasping for Human-Robot Collaborative Tasks"
  5. Xingshuo Jing, Yongqiang Zhao, Jiaqi Jiang, Boyi Duan, Kun Qian, Shan Luo. "Unsupervised Adversarial Domain Adaptation for Sim-to-Real Transfer of Tactile Manipulation Skills"
  6. Mauro Comi, Alex Church, Kejie Li, Laurence Aitchison, Nathan F. Lepora. "Implicit Neural Representation for 3D Shape Reconstruction Using Vision-Based Tactile Sensing"
  7. Julio Castaño-Amorós and Pablo Gil. "Rotational Slippage Prediction from Segmentation of Tactile Images"
  8. Zixi Chen, Shixin Zhang, Yuhao Sun, Shan Luo, Fuchun Sun, and Bin Fang. "Plasticine Manipulation Simulation with Optical Tactile Sensing"
  9. Yijiong Lin, Mauro Comi, Alex Church, Dandan Zhang, Nathan F. Lepora. "Attention of Robot Touch: Tactile Saliency Prediction for Robust Sim-to-Real Tactile Control"
  10. Daniel Gomes and Paolo Paoletti and Shan Luo. "Simulation of Curved GelSight Sensors for Sim2Real Learning"
  11. Guanqun Cao, Jiaqi Jiang, Ningtao Mao, Danushka Bollegala, Min Li, and Shan Luo. "Vis2Hap: Vision-based Haptic Rendering by Cross-modal Generation"
  12. Jiaqi Jiang, Guanqun Cao, Aaron Butterworth, Thanh-Toan Do, and Shan Luo. "Vision-Guided Tactile Poking for Transparent Object Grasping"
  13. Abu Bakar Dawood, Brice Denoun, and Kaspar Althoefer. "Data-Driven Optical Tomography-based Soft Skin Sensor"
  14. Carolina Higuera, Byron Boots, and Mustafa Mukadam. "Learning to Read Braille: Bridging the Tactile Reality Gap with Diffusion Models"

We thank the following IEEE RAS Technical Committees for their support: