Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/110791
| DC Field | Value | Language |
|---|---|---|
| dc.contributor | Department of Electrical and Electronic Engineering | en_US |
| dc.creator | Jin, Z | en_US |
| dc.creator | Tu, Y | en_US |
| dc.creator | Gan, CX | en_US |
| dc.creator | Mak, MW | en_US |
| dc.creator | Lee, KA | en_US |
| dc.date.accessioned | 2025-02-04T07:11:10Z | - |
| dc.date.available | 2025-02-04T07:11:10Z | - |
| dc.identifier.issn | 0925-2312 | en_US |
| dc.identifier.uri | http://hdl.handle.net/10397/110791 | - |
| dc.language.iso | en | en_US |
| dc.publisher | Elsevier BV | en_US |
| dc.rights | © 2025 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/). | en_US |
| dc.rights | The following publication Jin, Z., Tu, Y., Gan, C.-X., Mak, M.-W., & Lee, K.-A. (2025). Adversarially adaptive temperatures for decoupled knowledge distillation with applications to speaker verification. Neurocomputing, 624, 129481 is available at https://doi.org/10.1016/j.neucom.2025.129481. | en_US |
| dc.subject | Adaptive temperature | en_US |
| dc.subject | Adversarial learning | en_US |
| dc.subject | Knowledge distillation | en_US |
| dc.subject | Speaker verification | en_US |
| dc.title | Adversarially adaptive temperatures for decoupled knowledge distillation with applications to speaker verification | en_US |
| dc.type | Journal/Magazine Article | en_US |
| dc.identifier.volume | 624 | en_US |
| dc.identifier.doi | 10.1016/j.neucom.2025.129481 | en_US |
| dcterms.abstract | Knowledge Distillation (KD) aims to transfer knowledge from a high-capacity teacher model to a lightweight student model, thereby enabling the student model to attain a level of performance that would be unattainable through conventional training methods. In conventional KD, the loss function’s temperature, which controls the smoothness of the class distributions, is fixed. We argue that distribution smoothness is critical to the transfer of knowledge and propose an adversarial adaptive temperature module that sets the temperature dynamically during training to enhance the student’s performance. Using the concept of decoupled knowledge distillation (DKD), we separate the Kullback–Leibler (KL) divergence into a target-class term and a non-target-class term. However, unlike DKD, we adversarially update the temperature coefficients of the target and non-target classes to maximize the distillation loss (see the illustrative sketch after this record table). We name our method Adversarially Adaptive Temperature for DKD (AAT-DKD). Our approach demonstrates improvements over KD methods across three test sets of VoxCeleb1 for two student models (x-vector and ECAPA-TDNN). Specifically, compared to traditional KD and DKD, our method achieves reductions of 17.78% and 11.90% in EER, respectively, using ECAPA-TDNN speaker embeddings. Moreover, our method performs well on CN-Celeb and VoxSRC21, further highlighting its robustness and effectiveness across different datasets. | en_US |
| dcterms.accessRights | open access | en_US |
| dcterms.bibliographicCitation | Neurocomputing, 1 Apr. 2025, v. 624, 129481 | en_US |
| dcterms.isPartOf | Neurocomputing | en_US |
| dcterms.issued | 2025-04-01 | - |
| dc.identifier.scopus | 2-s2.0-85216017012 | - |
| dc.identifier.eissn | 1872-8286 | en_US |
| dc.identifier.artn | 129481 | en_US |
| dc.description.validate | 202502 bcch | en_US |
| dc.description.oa | Version of Record | en_US |
| dc.identifier.FolderNumber | OA_TA, a3641 | - |
| dc.identifier.SubFormID | 50553 | - |
| dc.description.fundingSource | RGC | en_US |
| dc.description.pubStatus | Published | en_US |
| dc.description.TA | Elsevier (2025) | en_US |
| dc.description.oaCategory | TA | en_US |
Appears in Collections: Journal/Magazine Article
Files in This Item:
| File | Description | Size | Format |
|---|---|---|---|
| 1-s2.0-S0925231225001535-main.pdf | | 2.15 MB | Adobe PDF |
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.



