Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/115680
DC Field | Value | Language
dc.contributor | Department of Aeronautical and Aviation Engineering | en_US
dc.creator | Li, Z | en_US
dc.creator | Li, F | en_US
dc.creator | Xu, G | en_US
dc.creator | Li, D | en_US
dc.date.accessioned | 2025-10-20T01:21:56Z | -
dc.date.available | 2025-10-20T01:21:56Z | -
dc.identifier.issn | 1524-9050 | en_US
dc.identifier.uri | http://hdl.handle.net/10397/115680 | -
dc.language.iso | en | en_US
dc.publisher | Institute of Electrical and Electronics Engineers | en_US
dc.subject | Air traffic control | en_US
dc.subject | Eye movements | en_US
dc.subject | Look-but-fail-to-see error | en_US
dc.subject | Peripheral vision | en_US
dc.subject | Visual detection | en_US
dc.title | Beyond the gaze: peripheral vision-aware visual detection failures recognition through LLM-based fixation coordinate-sensitive analysis | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.doi | 10.1109/TITS.2025.3588621 | en_US
dcterms.abstract | Visual detection failures are a critical challenge in air traffic control (ATC), where undetected alerts can compromise operational safety and decision-making. Previous studies have primarily assessed detection failures through target fixation patterns, yet this method struggles to identify the more complex “look-but-fail-to-see” and “see-without-looking” scenarios. This underscores the necessity of exploring peripheral vision mechanisms, where dynamic tracking trajectories could better capture the scope of visual attention. Therefore, this study proposes a classification framework for visual detection by integrating peripheral vision tracking and human attentional states, including detection failures such as peripheral vision neglect and look-but-fail-to-see errors. A hierarchical detection failure recognition framework specific to ATC settings is further developed and validated through an ATC simulation experiment. The framework first employs an Adaptive Symbolic Alert Detection method to identify and annotate ATC-specific alert regions with spatiotemporal uncertainty (achieving 95.24% precision), followed by LLM-based evaluation of operators’ visual attention to these regions to intelligently assign classification labels. Additionally, we introduce a fixation coordinate-sensitive multi-domain feature set that captures spatiotemporal and frequency-domain characteristics across detection types, achieving 93.13% four-class classification accuracy and outperforming both traditional feature sets (83.69%) and single- and dual-domain features (ranging from 76.82% to 90.11% accuracy). These findings demonstrate that our framework effectively captures a broader and more structured range of visual detection failures, providing critical insights for improving the reliability of alert detection in ATC and the design of intelligent, human-centered ATC support systems. | en_US
dcterms.accessRights | embargoed access | en_US
dcterms.bibliographicCitation | IEEE transactions on intelligent transportation systems, Date of Publication: 21 July 2025, Early Access, https://doi.org/10.1109/TITS.2025.3588621 | en_US
dcterms.isPartOf | IEEE transactions on intelligent transportation systems | en_US
dcterms.issued | 2025 | -
dc.identifier.scopus | 2-s2.0-105011711457 | -
dc.identifier.eissn | 1558-0016 | en_US
dc.description.validate | 202510 bcjz | en_US
dc.description.oa | Not applicable | en_US
dc.identifier.SubFormID | G000226/2025-08 | -
dc.description.fundingSource | RGC | en_US
dc.description.fundingSource | Others | en_US
dc.description.fundingText | Hong Kong Polytechnic University Research Centre for Data Science and AI (Grant Number: P0042711); National Natural Science Foundation of China (NSFC) Project (Grant Number: 52405295); Research Grants Council of the Hong Kong Special Administrative Region, China, ECS Project funded in the 2024/25 Exercise (Grant Number: PolyU 25233824) | en_US
dc.description.pubStatus | Early release | en_US
dc.date.embargo | 0000-00-00 (to be updated) | en_US
dc.description.oaCategory | Green (AAM) | en_US
Appears in Collections: Journal/Magazine Article