Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/106861
Title: DITA: DETR with improved queries for end-to-end temporal action detection
Authors: Lu, C 
Mak, MW 
Issue Date: 2024
Source: Neurocomputing, Article 127914 (available online 28 May 2024, in press, journal pre-proof), https://doi.org/10.1016/j.neucom.2024.127914
Abstract: The DEtection TRansformer (DETR), with its elegant architecture and set prediction, has revolutionized object detection. However, DETR-like models have yet to achieve comparable success in temporal action detection (TAD). To address this gap, we introduce a series of improvements to the original DETR, proposing a new DETR-based model for TAD that achieves competitive performance relative to conventional TAD methods. Specifically, we adapt advanced techniques from DETR variants used in object detection, including deformable attention, denoising training, and selective query recollection. Furthermore, we propose several new techniques aimed at enhancing detection precision and model convergence speed, such as geographic query grouping and learnable proposals. Leveraging these innovations, we introduce a new model called DETR with Improved queries for Temporal Action Detection (DITA). DITA not only adheres to DETR’s elegant design philosophy but is also competitive with state-of-the-art action detection models. Remarkably, it is the first TAD model to achieve an mAP over 70% on THUMOS14, outperforming the previous best DETR variant by 13.5 percentage points.
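To make the query-based design concrete, the following is a minimal, hypothetical PyTorch sketch of the general idea behind learnable proposals in a DETR-style TAD head: each decoder query is paired with a learnable (center, width) proposal on the normalized time axis, which the decoder refines into (start, end) segment predictions plus class logits. All names and hyperparameters here (TinyDITAHead, num_queries=40, layer counts) are illustrative assumptions, not the authors' implementation, and the paper's other components (deformable attention, denoising training, selective query recollection, query grouping) are omitted.

    import torch
    import torch.nn as nn

    class TinyDITAHead(nn.Module):
        """Hypothetical DETR-style TAD head with learnable 1-D proposals."""
        def __init__(self, num_queries=40, d_model=256, num_classes=20):
            super().__init__()
            # Learnable content queries and learnable proposal logits (center, width).
            self.query_embed = nn.Embedding(num_queries, d_model)
            self.proposal_logits = nn.Embedding(num_queries, 2)
            layer = nn.TransformerDecoderLayer(d_model, nhead=8, batch_first=True)
            self.decoder = nn.TransformerDecoder(layer, num_layers=2)
            self.class_head = nn.Linear(d_model, num_classes + 1)  # +1 for "no action"
            self.delta_head = nn.Linear(d_model, 2)  # proposal refinement in logit space

        def forward(self, features):
            # features: (batch, T, d_model) clip-level video features.
            b = features.size(0)
            queries = self.query_embed.weight.unsqueeze(0).expand(b, -1, -1)
            hs = self.decoder(queries, features)  # (batch, num_queries, d_model)
            # Refine the learnable proposals, then map to [0, 1] via sigmoid.
            refined = (self.proposal_logits.weight + self.delta_head(hs)).sigmoid()
            center, width = refined.unbind(-1)
            # Convert (center, width) to (start, end); clamping to [0, 1] omitted.
            segments = torch.stack([center - width / 2, center + width / 2], dim=-1)
            return self.class_head(hs), segments

    # Usage: two clips, 96 time steps, 256-d features.
    head = TinyDITAHead()
    logits, segments = head(torch.randn(2, 96, 256))
    print(logits.shape, segments.shape)  # (2, 40, 21) and (2, 40, 2)

Storing proposals as logits and refining them additively in logit space keeps every prediction inside the normalized clip after the sigmoid; in the full model, the vanilla transformer decoder used here would be replaced by the paper's improved attention and training schemes.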
Keywords: Action recognition
Intelligent video system
Temporal action detection
Publisher: Elsevier BV
Journal: Neurocomputing 
ISSN: 0925-2312
EISSN: 1872-8286
DOI: 10.1016/j.neucom.2024.127914
Appears in Collections: Journal/Magazine Article

Open Access Information
Status: Embargoed access
Embargo end date: 0000-00-00 (to be updated)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.