By combining haptic gloves, generative AI, and immersive XR environments, ETRI is redefining remote collaboration, turning video calls into lifelike meetings where participants can truly shake hands across distance.

Credit: Electronics and Telecommunications Research Institute (ETRI)
ETRI Unveils Next-Generation Remote Collaboration Technology
Korean researchers have unveiled next-generation remote collaboration technology that enables users to meet face-to-face and even shake hands in real time with their counterparts, as if they were in the same room. The Electronics and Telecommunications Research Institute (ETRI) announced the public introduction of "telepresence augmentation technology for XR environments," which can precisely reproduce facial expressions, gaze, and even handshakes.
Immersive Face-to-Face Collaboration
The new technology enables two users in different physical locations to participate simultaneously in a meeting within Virtual Reality (VR) and Augmented Reality (AR) environments, allowing for immersive interaction as if they were sitting face-to-face in a real meeting room. Visitors on-site were able to experience this new form of collaboration firsthand, making eye contact and shaking hands with remote participants.
Core Components
The remote presence augmentation technology developed by ETRI consists of two core components: exoskeleton-based active virtual handshake technology and real-time digital human stereoscopic immersion technology.
1. Exoskeleton-Based Virtual Handshake
This tactile feedback system enables users to feel not only the movement of their hands but also the strength and direction of another person's grip in real time. To accomplish this, the researchers designed and developed an active-type exoskeleton XR haptic glove. These haptic gloves go beyond simple vibrations to deliver precise tactile experiences, creating the sensation of a real handshake even in remote environments.
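The control idea behind such a glove can be illustrated with a minimal feedback loop: the remote user's grip force and direction are sensed, transmitted, and mapped to a local actuator torque with saturation. The sketch below is a hypothetical simplification; the gain, torque limit, and data structures are assumptions for illustration, not ETRI's actual control code.

```python
from dataclasses import dataclass

@dataclass
class FingerState:
    """Force reading for one finger of the remote user's glove (assumed units)."""
    grip_force: float  # magnitude of the remote grip, in newtons
    direction: float   # signed flexion direction: +1.0 closing, -1.0 opening

def actuator_command(remote: FingerState, max_torque: float = 0.5) -> float:
    """Map a remote grip reading to a local exoskeleton actuator torque.

    Proportional mapping with saturation; GAIN and max_torque are
    illustrative values, not ETRI's control parameters.
    """
    GAIN = 0.01  # N·m of local torque per newton of remote grip (assumed)
    torque = GAIN * remote.grip_force * remote.direction
    # Clamp so the exoskeleton never exceeds its safe torque envelope.
    return max(-max_torque, min(max_torque, torque))
```

A firm 30 N remote grip would command 0.3 N·m locally, while anything stronger saturates at the 0.5 N·m limit; running this loop per finger, per frame, is what lets both sides feel grip strength and direction rather than a generic vibration.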
2. Real-Time Digital Human Realism
This technology utilizes generative AI in a head-mounted display (HMD) environment, employing binocular imaging to enhance the realism and natural facial expressions of digital humans in real-time. Traditionally, high-quality avatars required substantial system resources; however, this technology can even improve low-quality video sources to appear lifelike, enabling natural interaction during remote conferencing.
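The per-frame flow described above — take a low-quality video source, enhance it, then present a binocular pair to the HMD — can be sketched as a small pipeline. In this sketch a naive nearest-neighbour upscale stands in for the generative enhancer, and a simple column shift stands in for binocular rendering; both are placeholders for illustration, not ETRI's method.

```python
def enhance_frame(frame, scale=2):
    """Stand-in for the generative enhancer: nearest-neighbour upscale.
    A real system would run a learned model to restore facial detail."""
    return [[px for px in row for _ in range(scale)]
            for row in frame for _ in range(scale)]

def to_stereo_pair(frame, disparity=1):
    """Crude left/right pair via horizontal column shifts; real binocular
    imaging would reproject the avatar from an estimated depth map."""
    left = [row[disparity:] + row[:disparity] for row in frame]
    right = [row[-disparity:] + row[:-disparity] for row in frame]
    return left, right

def render_avatar(low_res_frame):
    """Per-frame pipeline: enhance the low-quality source, then produce
    the two eye views the HMD needs."""
    return to_stereo_pair(enhance_frame(low_res_frame))
```

The point of the structure is that enhancement happens once per frame before the stereo split, which is what lets a low-quality source still yield a convincing binocular view.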
Demonstration Features
The demonstration system used HMDs and sensors to capture non-verbal cues from the user's hands, feet, gaze, facial expressions, mouth shape, and voice. In particular, 3DGS (3D Gaussian Splatting), an AI-powered 3D background generation technology, was applied to create a more realistic sense of space and immersion.
Through this system, visitors made eye contact and shook hands with remote participants via avatars, experiencing firsthand a form of collaboration that goes beyond traditional video conferencing.
Expert Insights
Dr. Jung Sung Uk, principal researcher of ETRI's Content Convergence Research Section, said, "This technology will be an important step in shifting the concept of remote collaboration beyond simple video calls to the era of realistic interaction. We will continue to develop it into a future collaboration solution that people can directly experience in various fields such as education, industry, and healthcare."
Research Collaboration
ETRI will continue to improve remote collaboration quality by integrating XR, AI, haptics, and immersive media. The technology was introduced at the ETRI Conference 2025, held last month.
Glossary
- XR: A technology that fuses the real and virtual worlds to create new user experiences, including VR, AR, and mixed reality.
- AR: A technique for superimposing three-dimensional virtual images onto real-world environments.
- VR: A technique for immersing users in three-dimensional virtual environments.
- 3DGS (3D Gaussian Splatting): A rendering technique that reconstructs realistic 3D scenes from ordinary photographs.
Funding and Partnerships
Research on the exoskeleton-based handshake technology is being conducted jointly with the Korea Advanced Institute of Science and Technology (KAIST), Neofect Co., Ltd., and the University of Southampton (UK) as part of a grant supported by ETRI. The real-time stereoscopic realism technology was developed under the project "Development of photorealistic digital human creation and 30fps realistic rendering technology," part of the "Realistic Content Core Technology Development Project," funded by the Ministry of Science and ICT and the Institute of Information & Communications Technology Planning & Evaluation (IITP).