Abstract
Human-machine interfaces (HMIs) have previously relied on a single perception modality, which cannot provide three-dimensional (3D) interaction or convenient, accurate interaction across multiple scenarios. Here, we propose a collaborative interface that combines electrooculography (EOG) and tactile perception for fast and accurate 3D human-machine interaction. The EOG signals are mainly used for fast, convenient, and contactless 2D (XY-axis) interaction, while the tactile sensing interface is mainly used for complex 2D movement control and Z-axis control in 3D interaction. The honeycomb graphene electrodes for EOG signal acquisition and the tactile sensing array are prepared by a laser-induced process. Two pairs of ultrathin, breathable honeycomb graphene electrodes are attached around the eyes to monitor nine different eye movements. A machine learning algorithm is designed to train on and classify the nine eye movements with an average prediction accuracy of 92.6%. Furthermore, an ultrathin (90 μm), stretchable (∼1000%), and flexible tactile sensing interface assembled from a pair of 4 × 4 planar electrode arrays is attached to the arm for 2D movement control and Z-axis interaction, realizing single-point, multipoint, and sliding touch functions. Consequently, the tactile sensing interface can achieve eight-direction control and even more complex movement-trajectory control. Meanwhile, the flexible and ultrathin tactile sensor exhibits an ultrahigh sensitivity of 1.428 kPa⁻¹ in the 0–300 Pa pressure range with long-term response stability and repeatability. Therefore, the collaboration between the EOG and tactile perception interfaces will play an important role in rapid and accurate 3D human-machine interaction.
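The abstract does not state which classifier underlies the 92.6% nine-class eye-movement recognition. The sketch below is only a minimal illustration of such a pipeline, assuming two EOG channels (one horizontal and one vertical electrode pair), fixed-length signal windows, hand-crafted time-domain features, and a scikit-learn support-vector machine; the labels, features, and parameters are hypothetical placeholders, not the authors' published method.

```python
# Hypothetical sketch of a nine-class EOG eye-movement classifier.
# Assumes two EOG channels and simple time-domain features; the real
# feature set and model in the paper are not disclosed in the abstract.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

EYE_MOVEMENTS = ["up", "down", "left", "right", "up-left",
                 "up-right", "down-left", "down-right", "blink"]  # assumed label set

def extract_features(window: np.ndarray) -> np.ndarray:
    """Time-domain features per EOG channel; window has shape (channels, samples)."""
    feats = []
    for ch in window:
        feats += [ch.mean(), ch.std(), ch.max(), ch.min(),
                  float(np.argmax(ch) - np.argmin(ch))]  # sample lag between max and min
    return np.array(feats)

# Placeholder data standing in for segmented EOG windows: 900 trials,
# 2 channels, 1 s at 250 Hz, with random class labels.
rng = np.random.default_rng(0)
X_raw = rng.normal(size=(900, 2, 250))
y = rng.integers(0, len(EYE_MOVEMENTS), 900)

X = np.stack([extract_features(w) for w in X_raw])
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0))
clf.fit(X_train, y_train)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```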
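Similarly, the abstract reports that sliding and pressing on the 4 × 4 electrode array yields eight-direction control plus Z-axis interaction, but does not describe the decoding scheme. The sketch below shows one plausible mapping, assumed for illustration: successive touch coordinates are quantized into eight 45° sectors, and a stationary press is treated as a Z-axis action.

```python
# Hypothetical decoder mapping successive touch points on a 4 x 4 tactile
# array to eight movement directions; a stationary press is treated as a
# Z-axis action. The mapping and thresholds are illustrative assumptions.
import math

DIRECTIONS = ["right", "up-right", "up", "up-left",
              "left", "down-left", "down", "down-right"]

def decode(prev: tuple, curr: tuple) -> str:
    """prev/curr are (row, col) indices of the touched electrode, each in 0..3."""
    drow, dcol = curr[0] - prev[0], curr[1] - prev[1]
    if drow == 0 and dcol == 0:
        return "z-press"                       # no sliding -> Z-axis interaction
    angle = math.atan2(-drow, dcol)            # rows grow downward on the array
    sector = round(angle / (math.pi / 4)) % 8  # quantize into eight 45-degree sectors
    return DIRECTIONS[sector]

# Example: sliding from electrode (2, 1) to (1, 2) is decoded as "up-right".
print(decode((2, 1), (1, 2)))
```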