Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/112102
DC Field	Value	Language
dc.contributor	Department of Applied Physics	en_US
dc.creator	Ren, Q	en_US
dc.creator	Zhu, C	en_US
dc.creator	Ma, S	en_US
dc.creator	Wang, Z	en_US
dc.creator	Yan, J	en_US
dc.creator	Wan, T	en_US
dc.creator	Yan, W	en_US
dc.creator	Chai, Y	en_US
dc.date.accessioned	2025-03-27T03:14:32Z	-
dc.date.available	2025-03-27T03:14:32Z	-
dc.identifier.issn	0935-9648	en_US
dc.identifier.uri	http://hdl.handle.net/10397/112102	-
dc.language.iso	en	en_US
dc.publisher	Wiley-VCH Verlag GmbH & Co. KGaA	en_US
dc.rights	© 2024 The Author(s). Advanced Materials published by Wiley-VCH GmbH. This is an open access article under the terms of the Creative Commons Attribution-NonCommercial License (http://creativecommons.org/licenses/by-nc/4.0/), which permits use, distribution and reproduction in any medium, provided the original work is properly cited and is not used for commercial purposes.	en_US
dc.rights	The following publication Q. Ren, C. Zhu, S. Ma, Z. Wang, J. Yan, T. Wan, W. Yan, Y. Chai, Optoelectronic Devices for In-Sensor Computing. Adv. Mater. 2025, 37, 2407476 is available at https://doi.org/10.1002/adma.202407476.	en_US
dc.subject	Data compression	en_US
dc.subject	Data structuring	en_US
dc.subject	In-sensor computing	en_US
dc.subject	Optoelectronic devices	en_US
dc.subject	Switching mechanisms	en_US
dc.title	Optoelectronic devices for in-sensor computing	en_US
dc.type	Journal/Magazine Article	en_US
dc.identifier.volume	37	en_US
dc.identifier.issue	23	en_US
dc.identifier.doi	10.1002/adma.202407476	en_US
dcterms.abstract	The demand for accurate perception of the physical world leads to a dramatic increase in sensory nodes. However, the transmission of massive and unstructured sensory data from sensors to computing units poses great challenges in terms of power efficiency, transmission bandwidth, data storage, time latency, and security. To efficiently process massive sensory data, it is crucial to achieve data compression and structuring at the sensory terminals. In-sensor computing integrates perception, memory, and processing functions within sensors, enabling sensory terminals to perform data compression and data structuring. Here, vision sensors are adopted as an example, and the functions of electronic, optical, and optoelectronic hardware for visual processing are discussed. In particular, hardware implementations of optoelectronic devices for in-sensor visual processing that can compress and structure multidimensional vision information are examined. The underlying resistive switching mechanisms of volatile/nonvolatile optoelectronic devices and their processing operations are explored. Finally, a perspective on the future development of optoelectronic devices for in-sensor computing is provided.	en_US
dcterms.accessRights	open access	en_US
dcterms.bibliographicCitation	Advanced materials, 12 June 2025, v. 37, no. 23, 2407476	en_US
dcterms.isPartOf	Advanced materials	en_US
dcterms.issued	2025-06-12	-
dc.identifier.scopus	2-s2.0-85198507477	-
dc.identifier.eissn	1521-4095	en_US
dc.identifier.artn	2407476	en_US
dc.description.validate	202503 bcch	en_US
dc.description.oa	Version of Record	en_US
dc.identifier.FolderNumber	OA_TA	-
dc.description.fundingSource	RGC	en_US
dc.description.fundingSource	Others	en_US
dc.description.fundingText	MOST National Key Technologies R&D Programme; Hong Kong Polytechnic University; Hong Kong Polytechnic University Shenzhen Research Institute	en_US
dc.description.pubStatus	Published	en_US
dc.description.TA	Wiley (2024)	en_US
dc.description.oaCategory	TA	en_US
Appears in Collections:Journal/Magazine Article
Files in This Item:
File	Size	Format
Ren_Optoelectronic_Devices_In‐Sensor.pdf	5.41 MB	Adobe PDF
Open Access Information
Status: open access
File Version: Version of Record

Page views: 3 (as of Apr 14, 2025)
Scopus citations: 26 (as of Oct 3, 2025)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.