Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/116110
Title: High-order dynamics in an ultra-adaptive neuromorphic vision device
Authors: Xu, J; Jiang, B; Wang, W; Guo, Z; Gao, J; Hu, X; Qin, J; Ran, L; Lin, L; Cai, S; Li, Y; Zhou, F
Issue Date: 15-Aug-2025
Source: Nature Nanotechnology, 15 Aug. 2025, v. 20, p. 1419-1430
Abstract: Neuromorphic hardware for artificial general vision intelligence holds the potential to match and surpass biological visual systems by processing complex visual dynamics with high adaptability and efficiency. However, current implementations rely on multiple complementary metal–oxide–semiconductor or neuromorphic elements, leading to significant area and power inefficiencies and system complexity. The underlying challenge is that, to our knowledge, no single electronic device has yet been demonstrated that integrates retina-like and cortex-like spiking and graded neuronal dynamics operable across both the optical and electrical domains. Here we report a single ultra-adaptive neuromorphic vision device (IxTyO1–x–y/CuOx/Pd) whose tailored electronic properties enable uniquely controlled interface and bulk dynamics governed by charged particles, including electrons, oxygen ions and vacancies. The device combines broadband retinal spiking-neuron and non-spiking graded-neuron dynamics with cortical synapse and neuron dynamics, at ultralow power consumption. Its real-time optoelectronic dynamics are elucidated through in situ scanning transmission electron microscopy and validated by technology computer-aided design simulations. An artificial general vision intelligence system built on homogeneous ultra-adaptive neuromorphic vision device arrays adaptively supports both asynchronous event-driven and synchronous frame-driven paradigms for versatile cognitive imaging demands, with a power efficiency of up to 67.89 trillion operations per second per watt and an area efficiency of up to 3.96 mega operations per second per feature size (MOPS/F2).
Keywords: Charged particles
Computer vision
Dynamics
Efficiency
Electron devices
Neural networks
Neurons
Ophthalmology
Transmission electron microscopy
Vision
Complementary metal oxide semiconductors
High-order dynamics
Neuromorphic hardware
Neuromorphic vision
Power efficiency
System complexity
Visual systems
Computer-aided design
Publisher: Nature Publishing Group
Journal: Nature Nanotechnology
ISSN: 1748-3387
EISSN: 1748-3395
DOI: 10.1038/s41565-025-01984-3
Rights: © The Author(s) 2025.
This article is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License, which permits any non-commercial use, sharing, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if you modified the licensed material. You do not have permission under this licence to share adapted material derived from this article or parts of it. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc-nd/4.0/.
The following publication Xu, J., Jiang, B., Wang, W. et al. High-order dynamics in an ultra-adaptive neuromorphic vision device. Nat. Nanotechnol. 20, 1419–1430 (2025) is available at https://doi.org/10.1038/s41565-025-01984-3.
Appears in Collections: Journal/Magazine Article

Open Access Information
Status: open access
File Version: Version of Record
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.