Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/106861
DC Field | Value | Language |
---|---|---|
dc.contributor | Department of Electrical and Electronic Engineering | - |
dc.creator | Lu, C | en_US |
dc.creator | Mak, MW | en_US |
dc.date.accessioned | 2024-06-06T06:06:01Z | - |
dc.date.available | 2024-06-06T06:06:01Z | - |
dc.identifier.issn | 0925-2312 | en_US |
dc.identifier.uri | http://hdl.handle.net/10397/106861 | - |
dc.language.iso | en | en_US |
dc.publisher | Elsevier BV | en_US |
dc.subject | Action recognition | en_US |
dc.subject | Intelligent video system | en_US |
dc.subject | Temporal action detection | en_US |
dc.title | DITA: DETR with improved queries for end-to-end temporal action detection | en_US |
dc.type | Journal/Magazine Article | en_US |
dc.identifier.doi | 10.1016/j.neucom.2024.127914 | en_US |
dcterms.abstract | The DEtection TRansformer (DETR), with its elegant architecture and set prediction, has revolutionized object detection. However, DETR-like models have yet to achieve comparable success in temporal action detection (TAD). To address this gap, we introduce a series of improvements to the original DETR, proposing a new DETR-based model for TAD that achieves competitive performance relative to conventional TAD methods. Specifically, we adapt advanced techniques from DETR variants used in object detection, including deformable attention, denoising training, and selective query recollection. Furthermore, we propose several new techniques aimed at enhancing detection precision and model convergence speed, such as geographic query grouping and learnable proposals. Leveraging these innovations, we introduce a new model called DETR with Improved queries for Temporal Action Detection (DITA). DITA not only adheres to DETR’s elegant design philosophy but is also competitive with state-of-the-art action detection models. Remarkably, it is the first TAD model to achieve an mAP over 70% on THUMOS14, outperforming the previous best DETR variant by 13.5 percentage points. | - |
dcterms.accessRights | embargoed access | en_US |
dcterms.bibliographicCitation | Neurocomputing. Available online 28 May 2024, In Press, Journal Pre-proof, 127914, https://doi.org/10.1016/j.neucom.2024.127914 | en_US |
dcterms.isPartOf | Neurocomputing | en_US |
dcterms.issued | 2024 | - |
dc.identifier.eissn | 1872-8286 | en_US |
dc.identifier.artn | 127914 | en_US |
dc.description.validate | 202406 bcch | - |
dc.identifier.FolderNumber | a2778 | - |
dc.identifier.SubFormID | 48310 | - |
dc.description.fundingSource | Self-funded | en_US |
dc.description.pubStatus | Early release | en_US |
dc.date.embargo | 0000-00-00 (to be updated) | en_US |
dc.description.oaCategory | Green (AAM) | en_US |
Appears in Collections: | Journal/Magazine Article |