Wearable strain sensors that detect joint/muscle strain changes are becoming prevalent at human–machine interfaces for full-body motion monitoring. However, most wearable devices offer little customizability to match sensor characteristics to the specific deformation ranges of joints/muscles, resulting in suboptimal performance. Careful wearable strain sensor design is therefore required to achieve user-designated working windows without sacrificing high sensitivity, together with real-time data processing. Herein, wearable Ti3C2Tx MXene sensor modules are fabricated with in-sensor machine learning (ML) models, functioning via either wireless streaming or edge computing, for full-body motion classification and avatar reconstruction. Through topographic design of the piezoresistive nanolayers, the wearable strain sensor modules exhibit ultrahigh sensitivities within working windows that cover all joint deformation ranges. By integrating the wearable sensors with a ML chip, an edge sensor module is fabricated, enabling in-sensor reconstruction of high-precision avatar animations that mimic continuous full-body motions with an average avatar determination error of 3.5 cm, without additional computing devices.
Wearable sensors with edge computing are desired for human motion monitoring. Here, the authors demonstrate a topographic design for wearable MXene sensor modules, with wireless-streaming or in-sensor computing ML models, for avatar reconstruction.