Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/114289
DC Field | Value | Language |
---|---|---|
dc.contributor | Department of Data Science and Artificial Intelligence | en_US |
dc.creator | Yang, P | en_US |
dc.creator | Wang, Q | en_US |
dc.creator | Huang, Z | en_US |
dc.creator | Liu, T | en_US |
dc.creator | Zhang, C | en_US |
dc.creator | Han, B | en_US |
dc.date.accessioned | 2025-07-22T01:34:18Z | - |
dc.date.available | 2025-07-22T01:34:18Z | - |
dc.identifier.uri | http://hdl.handle.net/10397/114289 | - |
dc.language.iso | en | en_US |
dc.rights | Copyright 2025 by the author(s). | en_US |
dc.rights | Posted with permission of the author. | en_US |
dc.rights | The following publication Yang, P., Wang, Q., Huang, Z., Liu, T., Zhang, C., & Han, B. (2025). Exploring Criteria of Loss Reweighting to Enhance LLM Unlearning. In Forty-second International Conference on Machine Learning 2025 is available at https://icml.cc/virtual/2025/poster/44163. | en_US |
dc.title | Exploring criteria of loss reweighting to enhance LLM unlearning | en_US |
dc.type | Other Conference Contributions | en_US |
dcterms.abstract | Loss reweighting has shown significant benefits for machine unlearning with large language models (LLMs). However, its exact functionality remains unclear and the optimal strategy is an open question, impeding the understanding and improvement of existing methodologies. In this paper, we identify two distinct goals of loss reweighting, namely, Saturation and Importance: the former indicates that insufficiently optimized data should be emphasized, while the latter stresses critical data that are most influential for loss minimization. To study their usefulness, we design specific reweighting strategies for each goal and evaluate their respective effects on unlearning. We conduct extensive empirical analyses on well-established benchmarks and summarize some important observations as follows: (i) Saturation enhances efficacy more than importance-based reweighting, and their combination can yield additional improvements. (ii) Saturation typically allocates lower weights to data with lower likelihoods, whereas importance-based reweighting does the opposite. (iii) The efficacy of unlearning is also largely influenced by the smoothness and granularity of the weight distributions. Based on these findings, we propose SatImp, a simple reweighting method that combines the advantages of both saturation and importance. Empirical results on extensive datasets validate the efficacy of our method, potentially bridging existing research gaps and indicating directions for future research. Our code is available at https://github.com/tmlr-group/SatImp. | en_US |
dcterms.accessRights | open access | en_US |
dcterms.bibliographicCitation | In ICML 2025: Forty-Second International Conference on Machine Learning, Vancouver Convention Center, July 13-19, 2025 [Poster], https://icml.cc/virtual/2025/poster/44163 | en_US |
dcterms.issued | 2025 | - |
dc.relation.conference | International Conference on Machine Learning [ICML] | en_US |
dc.description.validate | 202507 bcch | en_US |
dc.description.oa | Other Version | en_US |
dc.identifier.FolderNumber | a3947 | - |
dc.identifier.SubFormID | 51801 | - |
dc.description.fundingSource | RGC | en_US |
dc.description.fundingSource | Others | en_US |
dc.description.fundingText | NSFC General Program No. 62376235 | en_US |
dc.description.fundingText | Guangdong Basic and Applied Basic Research Foundation Nos. 2022A1515011652 and 2024A1515012399 | en_US |
dc.description.fundingText | HKBU Faculty Niche Research Areas No. RC-FNRA-IG/22-23/SCI/04 | en_US |
dc.description.fundingText | HKBU CSD Departmental Incentive Scheme | en_US |
dc.description.fundingText | Australian Research Council projects: FT220100318, DP220102121, LP220100527, LP220200949, and IC190100031 | en_US |
dc.description.oaCategory | Copyright retained by author | en_US |
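The abstract above contrasts two reweighting criteria for LLM unlearning: saturation, which emphasizes forget-set data the model still predicts with high likelihood (i.e., data insufficiently unlearned), and importance, which emphasizes the data most influential for loss minimization and, per the abstract, tends to upweight low-likelihood data. Below is a minimal, illustrative sketch of how such per-token weights might be combined in a gradient-ascent unlearning loss. The function name, the likelihood- and loss-based weight proxies, the multiplicative combination, and the alpha/beta hyperparameters are all assumptions for illustration, not the authors' SatImp implementation; see https://github.com/tmlr-group/SatImp for the official code.

```python
# Illustrative sketch only: the weight proxies and their combination are
# assumptions, not the authors' SatImp implementation.
import torch
import torch.nn.functional as F

def satimp_style_loss(logits, labels, alpha=1.0, beta=1.0, eps=1e-8):
    """Weighted unlearning loss for one batch of forget-set tokens.

    logits: (batch, seq, vocab) model outputs on the forget set.
    labels: (batch, seq) target token ids.
    """
    # Per-token negative log-likelihood of the data to be unlearned.
    nll = F.cross_entropy(
        logits.reshape(-1, logits.size(-1)),
        labels.reshape(-1),
        reduction="none",
    ).reshape(labels.shape)
    likelihood = torch.exp(-nll)  # p(token), in (0, 1]

    # Saturation-style weight: tokens the model still predicts well
    # (high likelihood, hence insufficiently unlearned) get larger weight,
    # matching the abstract's observation that saturation assigns lower
    # weights to low-likelihood data.
    w_sat = likelihood ** alpha

    # Importance-style weight: approximated here by normalized loss
    # magnitude, so low-likelihood tokens get larger weight (the opposite
    # trend, as the abstract notes). This proxy is an assumption.
    w_imp = (nll / (nll.max() + eps)) ** beta

    # Combine both criteria (a simple multiplicative choice, also an
    # assumption) and normalize per sequence.
    w = w_sat * w_imp
    w = w / (w.sum(dim=-1, keepdim=True) + eps)

    # Gradient ascent on the forget set: minimizing the negative weighted
    # NLL increases the loss on the data to be forgotten.
    return -(w.detach() * nll).sum(dim=-1).mean()
```

Detaching the weights keeps them from receiving gradients, so reweighting only rescales the per-token unlearning signal rather than becoming an optimization target itself.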
Appears in Collections: Conference Paper
Files in This Item:
File | Description | Size | Format
---|---|---|---
ICML25.916_Exploring_Criteria_of_Loss.pdf | | 3.09 MB | Adobe PDF
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.