Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/118227
Title: Multi-object motion real-time perception enabled by all-in-one optoelectronic memory
Authors: Xie, X
Duan, S
Yang, Y 
Wang, Y
Wang, J
Gu, D
Hu, X
Wang, L
Li, P
Wu, B
Sun, B
Zhou, G
Issue Date: 2026
Source: Laser & photonics reviews, First published: 02 January 2026, Early View, https://doi.org/10.1002/lpor.202502501
Abstract: Motion perception is increasingly crucial in diverse artificial intelligence scenarios, but it is largely limited by the dual operation mode (light set and electrical reset) and the physically separated sensing-processing architecture. We propose an all-in-one optoelectronic memory that provides both positive and negative photoresponse to execute image preprocessing in situ, enabling high-efficiency frame differencing for the perception of multiple moving objects. The positive and negative photoresponse depends strongly on the light intensity, which can alter the structural phase of the fibroin protein. The developed optoelectronic memory exhibits short- and long-term synaptic plasticity under both positive and negative photoresponse, faithfully emulating the human retina in detecting multiple motions in complex environments. Perception of multi-object motion in the real world is demonstrated, achieving an accuracy of 95% and a real-time processing speed of over 30 fps. The all-in-one memory-enabled artificial retina system establishes a significant in-sensor computing architecture for edge dynamic vision perception.
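The frame differencing mentioned in the abstract is a standard motion-detection technique. The sketch below illustrates the generic software version only; it is not the paper's in-sensor implementation, and the function name, threshold, and frame shapes are illustrative assumptions.

```python
# Generic frame differencing for motion detection (illustrative sketch;
# the threshold value and data layout are assumptions, not the paper's device).

def frame_difference(prev_frame, curr_frame, threshold=10):
    """Return a binary motion mask: 1 where a pixel's intensity changed
    by more than `threshold` between two consecutive grayscale frames."""
    return [
        [1 if abs(c - p) > threshold else 0
         for p, c in zip(prev_row, curr_row)]
        for prev_row, curr_row in zip(prev_frame, curr_frame)
    ]

# Two tiny 3x3 grayscale frames in which one bright pixel shifts right.
f0 = [[0, 0, 0],
      [0, 200, 0],
      [0, 0, 0]]
f1 = [[0, 0, 0],
      [0, 0, 200],
      [0, 0, 0]]

mask = frame_difference(f0, f1)
# The mask flags both the vacated pixel and the newly occupied pixel,
# which is how frame differencing localizes moving objects.
```

In the paper's scheme this subtraction is performed physically by the device's opposing photoresponses rather than in software, which is what removes the separate set/reset and off-sensor processing steps.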
Keywords: Fibroin protein
In-sensor computing
Motion perception
Optoelectronic memory
Positive and negative photoresponse
Publisher: Wiley-VCH
Journal: Laser & photonics reviews 
ISSN: 1863-8880
EISSN: 1863-8899
DOI: 10.1002/lpor.202502501
Appears in Collections:Journal/Magazine Article

Open Access Information
Status embargoed access
Embargo End Date 0000-00-00 (to be updated)
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.