ETRI Launches XR Telepresence Tech To Enable Real-Time Handshakes And Lifelike Remote Meetings

By combining haptic gloves, generative AI, and immersive XR environments, ETRI is redefining remote collaboration, turning video calls into lifelike meetings where participants can truly shake hands across distance.

Credit: Electronics and Telecommunications Research Institute (ETRI) 

ETRI Unveils Next-Generation Remote Collaboration Technology

Korean researchers have unveiled next-generation remote collaboration technology that lets users meet face-to-face and even shake hands in real time with their counterparts, as if they were in the same room. The Electronics and Telecommunications Research Institute (ETRI) announced the public introduction of "telepresence augmentation technology for XR environments," which precisely reproduces facial expressions, gazes, and even handshakes.

Immersive Face-to-Face Collaboration

The new technology enables two users in different physical locations to simultaneously participate in a meeting within Virtual Reality (VR) and Augmented Reality (AR) environments, allowing for immersive interaction as if they were sitting face-to-face in a real meeting room. Visitors were able to vividly experience a new form of collaboration by making eye contact and shaking hands with remote participants on-site.

Core Components

The telepresence augmentation technology developed by ETRI consists of two core components: exoskeleton-based active virtual handshake technology and real-time digital human stereoscopic immersion technology.

1. Exoskeleton-Based Virtual Handshake

This tactile feedback system lets users feel not only the movement of their own hands but also the strength and direction of the other person's grip in real time. To accomplish this, the researchers designed and built an exoskeleton-type active XR haptic glove. The gloves go beyond simple vibration to deliver precise tactile feedback, recreating the sensation of a real handshake even in remote environments.
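ETRI has not disclosed the glove's control interface, so the following is only a minimal sketch of the idea of an active (force-rendering) feedback loop: the remote peer's measured per-finger grip forces are clamped to a hypothetical safety ceiling and normalized into actuator commands. The function name, force ceiling, and data layout are all assumptions for illustration.

```python
import numpy as np

MAX_FORCE_N = 15.0  # hypothetical per-finger safety ceiling, in newtons

def handshake_feedback(remote_grip_forces):
    """Map the remote user's measured grip (one 3-D force vector per
    finger, in newtons) to exoskeleton actuator commands on the local
    glove. Returns one scalar per finger in [0, 1], where 1 drives the
    actuator at its force ceiling."""
    commands = []
    for force in remote_grip_forces:
        magnitude = float(np.linalg.norm(force))
        # Clamp so a hard squeeze at one end can never exceed the limit.
        commands.append(min(magnitude, MAX_FORCE_N) / MAX_FORCE_N)
    return commands

# Example: two fingers, the second squeezing harder than the ceiling.
grips = [np.array([0.0, 0.0, 5.0]), np.array([0.0, 0.0, 30.0])]
out = handshake_feedback(grips)  # second command is clamped to 1.0
```

Because the forces (not just an on/off vibration) are transmitted, both the strength and, with full vectors, the direction of the grip survive the round trip, which is what distinguishes an active glove from a vibrotactile one.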

2. Real-Time Digital Human Realism

This technology uses generative AI in a head-mounted display (HMD) environment, employing binocular imaging to enhance the realism and naturalness of digital humans' facial expressions in real time. High-quality avatars have traditionally required substantial system resources; this technology can instead upgrade even low-quality video sources to appear lifelike, enabling natural interaction during remote conferencing.
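ETRI has not published the enhancement model itself, so the sketch below only marks the shape of such a pipeline: a stand-in upsampler takes the place of the generative network, and a crude horizontal shift stands in for true binocular rendering. The function names and the disparity parameter are illustrative, not ETRI's API.

```python
import numpy as np

def enhance_frame(low_res, scale=2):
    """Stand-in for the generative enhancement stage. Plain
    nearest-neighbour upsampling marks where a learned model would
    restore detail in a low-quality avatar frame."""
    return np.repeat(np.repeat(low_res, scale, axis=0), scale, axis=1)

def render_stereo(frame, disparity_px=4):
    """Produce a crude left/right eye pair by shifting the frame
    horizontally, mimicking binocular presentation in an HMD."""
    left = np.roll(frame, disparity_px // 2, axis=1)
    right = np.roll(frame, -(disparity_px // 2), axis=1)
    return left, right

frame = np.random.rand(120, 160, 3)   # low-quality source frame
hi = enhance_frame(frame)             # upsampled to (240, 320, 3)
left, right = render_stereo(hi)       # one view per eye
```

The key point the article makes is the ordering: enhancement happens per frame before the two eye views are presented, so a cheap source stream can still yield a lifelike stereoscopic avatar.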

Demonstration Features

The demonstration system used HMDs and sensors to capture non-verbal cues from the user's hands, feet, gaze, facial expressions, mouth shape, and voice. In particular, 3D Gaussian Splatting (3DGS), an AI-powered 3D background generation technique, was applied to create a stronger sense of space and immersion.
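3DGS represents a scene as many translucent 3-D Gaussians that are depth-sorted and alpha-blended per pixel. The snippet below sketches only that compositing step for a single pixel; projecting the anisotropic covariances to screen space is omitted, and the camera convention (nearer means smaller z) and sample values are illustrative.

```python
import numpy as np

def composite_gaussians(centers, colors, opacities):
    """Blend Gaussians front-to-back for one pixel: each contributes
    opacity * color weighted by the light not yet absorbed by nearer
    Gaussians (the transmittance)."""
    order = np.argsort(centers[:, 2])   # nearest first (camera looks +z)
    out = np.zeros(3)
    transmittance = 1.0
    for i in order:
        a = opacities[i]
        out += transmittance * a * colors[i]
        transmittance *= 1.0 - a
    return out, transmittance

# A red Gaussian in front of a blue one, both half-opaque.
centers = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, 2.0]])
colors = np.array([[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
opacities = np.array([0.5, 0.5])
pixel, t = composite_gaussians(centers, colors, opacities)
# pixel ≈ [0.5, 0, 0.25]; remaining transmittance 0.25
```

Because this blending is simple and parallel, 3DGS scenes can be rendered at the frame rates an HMD demands, which is why it suits real-time background generation here.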

Through this system, visitors experienced eye contact and handshakes with remote participants via avatars, immersing themselves vividly in a collaboration that goes beyond traditional video conferencing.

Expert Insights

Dr. Jung Sung Uk, principal researcher of ETRI's Content Convergence Research Section, said, "This technology will be an important step in shifting the concept of remote collaboration beyond simple video calls to the era of realistic interaction. We will continue to develop it into a future collaboration solution that people can directly experience in various fields such as education, industry, and healthcare."

Outlook

ETRI plans to keep improving remote collaboration quality by integrating XR, AI, haptics, and immersive media. The institute introduced this technology at the ETRI Conference 2025 held last month.

Glossary

  • XR: A technology that fuses the real and virtual worlds to create new user experiences, including VR, AR, and mixed reality.
  • AR: A technique for superimposing three-dimensional virtual images onto real-world environments.
  • VR: A technique for immersing users in three-dimensional virtual environments.
  • 3DGS: A rendering technique that reconstructs realistic 3D scenes from ordinary photographs.

Funding and Partnerships

Research on the exoskeleton-based handshake technology is being conducted jointly with the Korea Advanced Institute of Science and Technology (KAIST), Neofect Co., Ltd., and the University of Southampton (UK) as part of a grant supported by ETRI. The real-time stereoscopic realism technology was developed under the project "Development of photorealistic digital human creation and 30fps realistic rendering technology," part of the "Realistic Content Core Technology Development Project" funded by the Ministry of Science and ICT and the Institute of Information & Communications Technology Planning & Evaluation (IITP).

