ICRA 2024 Workshop on “Robot Embodiment through Visuo-Tactile Perception”
This is the official website for “ViTac 2024: Robot Embodiment through Visuo-Tactile Perception”.
Abstract: This is the fifth ViTac workshop we have organised at ICRA, following the ICRA 2019, 2020, 2021 and 2023 ViTac workshops. It is also a timely opportunity to report on the progress of the Special Issue (now called Special Collection) on Tactile Robotics that we are organising at the IEEE Transactions on Robotics. The recent trend towards embodied intelligence, i.e., intelligence that requires and leverages a robot's physical body, demands rich physical interaction with the surrounding environment, which cannot be achieved without perceiving the environment through vision and tactile sensing. Recent years have seen rapid advances in tactile sensor development, the integration of information from visual and tactile modalities, the use of vision and tactile sensing in agile grasping and manipulation, and simulations that involve both vision and tactile sensing. It is therefore timely to bring together experts and young researchers in the field to discuss these important topics in robotics. This full-day workshop will cover recent advances in visuo-tactile sensing and perception, with the aim of empowering robots with visual and tactile intelligence. It will further foster active collaboration and address the challenges of this important topic and its applications.
In our one-day workshop, we will discuss topics relating to visuo-tactile perception, from the development of tactile sensors, to the integration of vision and touch for robot physical interaction tasks, to the simulation of tactile sensors and Sim2Real learning for visuo-tactile perception. The workshop will bring together experts from a diverse range of disciplines, encompassing engineers, computer scientists, cognitive scientists and sensor developers, to discuss these topics. We will report on the progress of the Special Issue (now called Special Collection) on Tactile Robotics that we are organising at the IEEE Transactions on Robotics and invite the authors of papers accepted to the special issue to present their work at the workshop. As part of the workshop, we will also organise a challenge, ManiSkill-ViTac, to benchmark the use of vision and tactile sensing in robot manipulation tasks.
For past ViTac workshops, please check ViTac2023, ViTac2021, ViTac2020, ViTac2019.
Time: Full day, 17th May 2024
Location: G411 (North area), PACIFICO Yokohama (Hybrid), Yokohama, Japan.
Note: The afternoon session schedule has been updated, please check below.
(Tentative) Program
09:00-09:10 Organisers - Shan Luo: Welcome & updates on TRO Special Collection
Session 1: Development of touch sensors - Hardware intelligence
09:10-09:20 Nathan Lepora & Shan Luo: Session Intro
09:20-09:40 Ravinder Dahiya: “Movement Detection using Self-Powered e-Skin”
09:40-10:00 Antonio Bicchi: “From optic flow to tactile flow. Background and new results”
10:00-10:30 Coffee break, posters and demos
Session 2: Robot embodiment with vision and tactile sensing
10:30-10:40 Wenzhen Yuan & Shan Luo: Session Intro
10:40-11:00 Huaping Liu: “Large-scale tactile perception for interaction and manipulation”
11:00-11:20 Van Anh Ho: “How to make robot ‘feel’ safe?: Embodiment in robot safety with vision-based multimodal sensing”
11:20-12:00 Poster presenters: Lightning poster presentations (2 mins each)
12:00-13:30 Lunch break, posters and demos
The afternoon session will start at 13:30 (30 minutes earlier than planned) because the conference ends at 17:00.
Session 3: Challenges in visual-tactile perception
13:30-13:40 Rui Chen & Hao Su: Session intro
13:40-14:00 Mark Cutkosky: Invited talk
14:00-14:20 Roberto Calandra: “Towards General In-hand Manipulation with Touch”
14:20-14:40 ManiSkill-ViTac Challenge: Conclusion of the challenge and award ceremony
14:40-15:00 ManiSkill-ViTac Challenge: Invited talk from challenge winners
15:00-15:30 Coffee break, posters and demos
Session 4: The future of robot embodiment with visual-tactile perception
15:30-15:40 Kaspar Althoefer & Gordon Cheng: Session intro
15:40-16:00 Perla Maiolino: “Proxy-tactile technology and tactile data generation for robot self-awareness and control”
16:00-16:20 Maria Bauza Villalonga: “Visuo-tactile sensing for precise robotic manipulation with multiple embodiments”
16:20-16:30 Room rearrangement for the group discussion
16:30-17:00 All speakers: Group discussion
17:00 Finish
Accepted papers
- Zeqing Zhang, Guangze Zheng, Xuebo Ji, Guanqi Chen, Ruixing Jia, Wentao Chen, Guanhua Chen, Liangjun Zhang and Jia Pan. "MAE4GM: Visuo-Tactile Learning for Property Estimation of Granular Material using Multimodal Autoencoder"
- Emanuele Aucone, Carmelo Sferrazza, Manuel Gregor, Raffaello D’Andrea, and Stefano Mintchev. "Optical Tactile Sensing for Multi-Contact Interaction on Aerial Robots"
- Osher Azulay and Avishai Sintov. "AllSight: Advancing Optical Tactile Sensing and Sim-to-Real Learning for Dexterous Robotic Manipulation"
- Julia Di, Zdravko Dugonjic, Will Fu, Tingfan Wu, Romeo Mercado, Kevin Sawyer, Victoria Rose Most, Gregg Kammerer, Stefanie Speidel, Richard E. Fan, Geoffrey Sonn, Mark R. Cutkosky, Mike Lambeta, and Roberto Calandra. "DIGIT Pinki: Using Fiber Optic Bundles to Miniaturize Vision-Based Tactile Sensors"
- Yijiong Lin, Alex Church, Max Yang, Haoran Li, John Lloyd, Dandan Zhang, Nathan F. Lepora. "Bi-Touch: Bimanual Tactile Manipulation with Sim-to-Real Deep Reinforcement Learning"
- Hao Zhou, Masahiro Miyazaki and Kazuhiro Shimonomura. "NailTact: Vision-based Tactile Sensing in Both Fingerpad and Nail"
- Chenghua Lu, Nathan F. Lepora. "DexiTac: A Reconfigurable Gripper with Tactile Sensing Ability"
- Giuseppe Vitrani and Michaël Wiertlewski. "Predicting object slippage in robotic grippers using human-inspired tactile processing"
- Boyi Duan, Kun Qian, Yongqiang Zhao, Dongyuan Zhang, Shan Luo. "Feature-level Sim2Real Regression of Tactile Images for Robot Manipulation"
- Haoran Li, Saekwang Nam, Zhenyu Lu, Chenguang Yang, Efi Psomopoulou, Nathan F. Lepora. "BioTacTip: A Soft Biomimetic Optical Tactile Sensor for Efficient 3D Contact Localization and 3D Force Estimation"
- Bo Ai, Stephen Tian, Haochen Shi, Yixuan Wang, Cheston Tan, Yunzhu Li, Jiajun Wu. "RoboPack: Learning Tactile-Informed Dynamics Models for Dense Packing"
- Max Yang, Chenghua Lu, Alex Church, Yijiong Lin, Chris Ford, Haoran Li, Efi Psomopoulou, David A.W. Barton, Nathan F. Lepora. "AnyRotate: Gravity-Invariant In-Hand Rotation with Sim-to-Real Touch"
- Quan Khanh Luu, and Van Anh Ho. "Soft Robotic Link with Controllable Transparency for Vision-based Tactile and Proximity Sensing"
- Takeshi Tomomizu, Quan Khanh Luu, Nhan Huu Nguyen, and Van Anh Ho. "Preliminary Design of Vision-Based Tactile Sensor with Nail and Soft Structures"
- Zhuo Chen, Ni Ou, Jiaqi Jiang and Shan Luo. "Deep Domain Adaptation Regression for Force Calibration of Optical Tactile Sensors"
- Yuni Fuchioka and Masashi Hamaya. "In-Grasp Torque Estimation for Visuotactile Sensors with Tactile Dipole Moments"
- Xuyang Zhang, Jiaqi Jiang, and Shan Luo. "RoTip: A Finger-Shaped Tactile Sensor with Active Rotation"
- Tunwu Li, Xiaolong Li, Zhenyu Lu and Chenguang Yang. "A Tactile Sensor Roller for In-Process Inspection of Composites"
- Jieji Ren, Yueshi Dong, Yuru Gong, Ningbin Zhang, Jiang Zou and Guoying Gu. "Soft Camera-based Tactile Sensor for Compliant Grasping and Manipulation"
- Abdallah Ayad, Adrian Röfer, Nick Heppert, Abhinav Valada. "Imagine2touch: Predictive Tactile Sensing for Robotic Manipulation using Efficient Low-Dimensional Signals"
- Zhongyue Wu and Maciej Wozniak. "Enhanced AR: Integrating Haptic Feedback from Tactile Sensors for Immersive Teleportation"
- Lowiek Van den Stockt, Remko Proesmans and Francis wyffels. "Automatic Calibration for an Open-source Magnetic Tactile Sensor"
- Noah Becker, Erik Gattung, Kay Hansel, Tim Schneider, Yaonan Zhu, Yasuhisa Hasegawa, and Jan Peters. "Integrating Visuo-tactile Sensing with Haptic Feedback for Teleoperated Robot Manipulation"
- Eric T. Chang, Peter Ballentine, Ioannis Kymissis and Matei Ciocarlie. "Development Towards a PVDF-Based Tactile Finger with Distributed Vibration Sensing"
- Toru Lin, Yu Zhang, Qiyang Li, Haozhi Qi, Brent Yi, Sergey Levine, and Jitendra Malik. "Learning Visuotactile Skills with Two Multifingered Hands"
- Daniel Palenicek, Theo Gruner, Tim Schneider, Alina Böhm, Janis Lenz, Inga Pfenning, Eric Krämer and Jan Peters. "Learning Tactile Insertion in the Real World"
- Guillaume Duret, Florence Zara, Jan Peters and Liming Chen. "Toward synthetic data generation for robotic tactile manipulations"
Organisers:
Shan Luo, King’s College London
Nathan Lepora, University of Bristol
Wenzhen Yuan, University of Illinois Urbana-Champaign
Rui Chen, Tsinghua University
Hao Su, University of California San Diego
Kaspar Althoefer, Queen Mary University of London
Gordon Cheng, Technische Universität München
Invited speakers:
Antonio Bicchi, Italian Institute of Technology and the University of Pisa, Italy
Mark Cutkosky, Stanford University, USA
Roberto Calandra, Technische Universität Dresden, Germany
Huaping Liu, Tsinghua University, China
Perla Maiolino, University of Oxford, UK
Ravinder Dahiya, Northeastern University, USA
Van Anh Ho, JAIST, Japan
Maria Bauza Villalonga, DeepMind
Key dates
Posters and live demonstrations will be selected from a call for extended abstracts, with submissions reviewed by the organisers. The authors of the best posters will be invited to give talks at the workshop. All submissions will be reviewed using a single-blind review process. Accepted contributions will be presented as posters during the workshop.
Contributions should be submitted as extended abstracts (max. 2 pages + references) in the IEEE conference paper format. Submissions must be sent as a PDF (<5 MB) to shan.luo@kcl.ac.uk, with [ICRA 2024 Workshop] in the email subject.
Submission deadline: 30th March 2024 (extended from 15th March)
Notification of acceptance: 15th April 2024 (moved from 30th March)
Camera-ready deadline: 30th April 2024 (moved from 15th April)
Workshop day: 17th May 2024
Authors of accepted papers will be invited to publish their extended abstracts in the Springer Nature conference proceedings series Lecture Notes in Computer Science (LNCS).
Topics of Interest:
● Development of tactile sensors for robot tasks
● Simulation of tactile sensors and Sim2Real learning
● Visual-tactile sensing and perception
● The control and design of robotic visuo-tactile systems for human-robot interaction
● Interactive visuo-tactile perception in robot grasping and manipulation
● Cognitive control of movement and sensory-motor representations with vision and touch
● Bio-inspired approaches and cognitive architectures for visuo-tactile perception
● Computational methods for processing vision and touch data in robot learning
We thank the following IEEE RAS Technical Committees for their support:
- Haptics
- Cognitive Robotics
- Human-Robot Interaction & Coordination