
Professor Ko Seung-hwan’s research team from the Department of Mechanical Engineering, College of Engineering, Seoul National University, is garnering global attention for developing innovative artificial skin technology that enables robots to simultaneously detect ambient temperature and pressure, much like humans. This technology efficiently senses thermal and mechanical stimuli with a single ultra-thin sensor, and is expected to become a core driving force for the realization of Physical AI.
The Era of Physical AI: The Importance of Technology Mimicking ‘Human Touch’
Recently, ‘Physical AI’ technology, where robots and artificial intelligence interact with the real physical world, has been rapidly emerging. Physical AI aims to enable robots to go beyond mere calculations, developing the ability to directly see, touch, feel, and make independent judgments about their surrounding environment. In this process, the importance of sensors that can simultaneously detect various tactile information such as temperature and pressure, much like human skin, is growing.
Existing multimodal sensory devices have been implemented by separately arranging multiple sensors or stacking multiple functional layers to mimic the complex stimulus processing of human skin. However, these methods complicate the system structure and make measurement devices bulky. They also suffer from inherent limitations such as slow response speeds due to reactive elements and difficulty in precisely reading multiple stimuli at the same location. Consequently, there has been an urgent need for a new artificial tactile platform that can process complex stimuli as quickly as human skin, using a thin, flexible, single sensory element.
Identification of 20 Object Types with Ultra-Thin Sensor… Innovative Multimodal Artificial Skin
To overcome these technical challenges, Professor Ko Seung-hwan’s team utilized a core-shell nanowire network consisting of a silver (Ag) core and a copper oxide (Cu₂O) shell. They implemented a unique technology that switches between thermal sensing mode (T mode) and mechanical sensing mode (M mode) 16 times per second within a single device. Thanks to this ultra-thin single-layer structure, the developed artificial skin boasts extremely fast response speeds: sub-microseconds for mechanical stimuli and milliseconds for thermal stimuli.
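The time-multiplexing described above can be pictured with a minimal sketch. This is a hypothetical illustration of the scheduling idea only, not the team's actual device firmware: a single sensing element is polled in alternating thermal (T) and mechanical (M) windows, switching 16 times per second, so each mode window lasts 1/16 s = 62.5 ms.

```python
# Hypothetical sketch of time-multiplexed single-sensor readout:
# one element alternates between thermal (T) and mechanical (M) modes.

SWITCH_HZ = 16                 # mode switches per second (from the article)
WINDOW_S = 1.0 / SWITCH_HZ     # duration of each mode window (62.5 ms)

def mode_schedule(duration_s):
    """Return the (start_time, mode) schedule for one readout run."""
    schedule = []
    t = 0.0
    mode = "T"                 # starting mode is an arbitrary choice here
    while t < duration_s:
        schedule.append((round(t, 6), mode))
        mode = "M" if mode == "T" else "T"
        t += WINDOW_S
    return schedule

# One second of operation yields 16 alternating mode windows.
sched = mode_schedule(1.0)
print(len(sched))              # → 16
print(sched[:4])               # → [(0.0, 'T'), (0.0625, 'M'), (0.125, 'T'), (0.1875, 'M')]
```

Because both stimuli are read from the same element at the same location, the two signal streams stay spatially aligned, which is what separately stacked sensors struggle to guarantee.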
Notably, in basic object classification experiments, the research team trained an AI model using ‘interleaved’ signals from the two sensing modes, meaning the two signal types were measured and collected alternately at very short time intervals. As a result, the classification accuracy, which was approximately 65% when using only thermal or only mechanical signals, dramatically improved to 95%. Even with reduced data, a high accuracy of 94.53% was maintained. Furthermore, a fingertip-attachable sensor combined with a wireless measurement board identified 20 types of everyday objects with 83% accuracy, demonstrating its applicability in real-life scenarios. The team also succeeded in manufacturing a multi-array platform that measures thermal and pressure distributions with high resolution comparable to human skin, suggesting scalability beyond single-device performance.
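The "interleaving" idea can be illustrated with a toy example. This is our own construction, not the paper's actual training pipeline: thermal and mechanical samples taken alternately are merged into one ordered feature sequence, so the model sees both stimulus types from the same moment and location rather than either one alone.

```python
# Toy illustration of interleaving two sensing-mode streams into one
# ordered sequence (T0, M0, T1, M1, ...), as used to train the AI model.

def interleave(thermal, mechanical):
    """Merge two equally long sample streams, alternating T and M."""
    assert len(thermal) == len(mechanical)
    features = []
    for t, m in zip(thermal, mechanical):
        features.append(("T", t))
        features.append(("M", m))
    return features

# Hypothetical readings from four alternating mode windows.
thermal_sig = [24.1, 24.3, 25.0, 26.2]   # temperature samples (°C)
mech_sig = [0.02, 0.05, 0.40, 0.38]      # pressure samples (arbitrary units)

seq = interleave(thermal_sig, mech_sig)
print(seq[:3])   # → [('T', 24.1), ('M', 0.02), ('T', 24.3)]
```

Feeding a classifier this combined sequence gives it twice the evidence per time window, which is consistent with the reported jump from roughly 65% (single-mode) to 95% accuracy.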
Opening New Horizons for Future Robots and Human-Machine Interaction
The multimodal artificial sensory device developed in this study is expected to become a core tactile perception technology for future robots across a wide range of fields, including medical assistive devices (prosthetic hands/feet), wearable electronic skin, soft robots, robotic grippers, and human-machine interfaces. Particularly, its ability to process complex stimuli with a single ultra-thin device, without the need for complex stacking of multiple sensors, simplifies systems and ensures high sensory resolution. This makes it a highly promising core technology for next-generation intelligent tactile platforms.
Professor Ko Seung-hwan emphasized the significance of this achievement, stating, “It is highly meaningful because we have, for the first time, implemented a technology that can process both thermal and mechanical stimuli together within a single ultra-thin device, like human skin, without stacking multiple sensors.” This technology is expected to grow into a core technology for Physical AI, endowing robots with human-level tactile perception capabilities. Kim Kwon-kyu and Bang Jun-hyeok, the first authors of the paper, are continuing their research at Apple Inc. in the US and the California Institute of Technology, respectively, further broadening the international impact of this work.
This research by Professor Ko Seung-hwan’s team at Seoul National University stands as an important milestone marking the dawn of the Physical AI era, in which robots perceive and interact with the physical world more precisely. With a delicacy approaching and potentially surpassing human touch, the technology is poised to bring innovation to a wide range of industrial fields, and its future development will be watched closely.
