Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/113278
DC Field | Value | Language
dc.contributor | Department of Applied Physics | en_US
dc.creator | Guo, J | en_US
dc.creator | Guo, F | en_US
dc.creator | Zhao, H | en_US
dc.creator | Yang, H | en_US
dc.creator | Du, X | en_US
dc.creator | Fan, F | en_US
dc.creator | Liu, W | en_US
dc.creator | Zhang, Y | en_US
dc.creator | Tu, D | en_US
dc.creator | Hao, J | en_US
dc.date.accessioned | 2025-06-02T02:27:44Z | -
dc.date.available | 2025-06-02T02:27:44Z | -
dc.identifier.issn | 0935-9648 | en_US
dc.identifier.uri | http://hdl.handle.net/10397/113278 | -
dc.language.iso | en | en_US
dc.publisher | Wiley-VCH Verlag GmbH & Co. KGaA | en_US
dc.subject | In-sensor computing | en_US
dc.subject | Mechanoluminescence | en_US
dc.subject | Mechano-optical artificial synapse | en_US
dc.subject | Photostimulated luminescence | en_US
dc.subject | Visual-tactile perception | en_US
dc.title | In-sensor computing with visual-tactile perception enabled by mechano-optical artificial synapse | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.volume | 37 | en_US
dc.identifier.issue | 14 | en_US
dc.identifier.doi | 10.1002/adma.202419405 | en_US
dcterms.abstract | The in-sensor computing paradigm holds the promise of rapid, low-power signal processing. Constructing crossmodal in-sensor computing systems that emulate human sensory and recognition capabilities has been a persistent pursuit in the development of humanoid robotics. Here, an artificial mechano-optical synapse is reported that implements in-sensor dynamic computing with visual-tactile perception. By employing a mechanoluminescence (ML) material, direct conversion of mechanical signals into light emission is achieved, and the light is transported to an adjacent photostimulated luminescence (PSL) layer without pre- or post-irradiation. The PSL layer acts as both a photon reservoir and a processing unit for in-memory computing. The approach, based on ML coupled with a PSL material, differs from traditional circuit-constrained methods, enabling remote operation and easy accessibility. Individual and synergistic plasticity are systematically investigated under force and light pulses, including paired-pulse facilitation, learning behavior, and short-term and long-term memory. A multisensory neural network is built to process handwritten patterns obtained with a tablet consisting of the device, achieving a recognition accuracy of up to 92.5%. Moreover, material identification is explored based on visual-tactile sensing, with an accuracy rate of 98.6%. This work provides a promising strategy for constructing in-sensor computing systems with crossmodal integration and recognition. | en_US
dcterms.accessRights | embargoed access | en_US
dcterms.bibliographicCitation | Advanced materials, 9 Apr. 2025, v. 37, no. 14, 2419405 | en_US
dcterms.isPartOf | Advanced materials | en_US
dcterms.issued | 2025-04-09 | -
dc.identifier.eissn | 1521-4095 | en_US
dc.identifier.artn | 2419405 | en_US
dc.description.validate | 202506 bcch | en_US
dc.description.oa | Not applicable | en_US
dc.identifier.FolderNumber | a3622 | -
dc.identifier.SubFormID | 50496 | -
dc.description.fundingSource | RGC | en_US
dc.description.fundingSource | Others | en_US
dc.description.fundingText | National Natural Science Foundation of China; Open Fund of State Key Laboratory of Information Photonics and Optical Communications; Fundamental Research Funds for the Central Universities; Shenzhen Science and Technology Program; CUG Scholar Scientific Research Funds at China University of Geosciences (Wuhan) | en_US
dc.description.pubStatus | Published | en_US
dc.date.embargo | 2026-04-09 | en_US
dc.description.oaCategory | Green (AAM) | en_US
Appears in Collections:Journal/Magazine Article
Open Access Information
Status: embargoed access
Embargo End Date: 2026-04-09
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.