Please use this identifier to cite or link to this item: http://hdl.handle.net/10397/117078
DC Field | Value | Language
dc.contributor | Department of Electrical and Electronic Engineering | -
dc.creator | Xue, Q | en_US
dc.creator | Ye, Q | en_US
dc.creator | Hu, H | en_US
dc.creator | Lou, J | en_US
dc.creator | Li, J | en_US
dc.creator | Fang, C | en_US
dc.creator | Shi, J | en_US
dc.date.accessioned | 2026-02-02T03:52:54Z | -
dc.date.available | 2026-02-02T03:52:54Z | -
dc.identifier.issn | 1545-5971 | en_US
dc.identifier.uri | http://hdl.handle.net/10397/117078 | -
dc.language.iso | en | en_US
dc.publisher | Institute of Electrical and Electronics Engineers | en_US
dc.rights | © 2025 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | en_US
dc.rights | The following publication Q. Xue et al., 'LabelDP Leaks Privacy — A Tightened Correlation-Aware Privacy Model for Labeled Training Data,' in IEEE Transactions on Dependable and Secure Computing, vol. 23, no. 1, pp. 491-506, Jan.-Feb. 2026 is available at https://doi.org/10.1109/TDSC.2025.3607786. | en_US
dc.subject | Joint distribution estimation | en_US
dc.subject | Labeled data collection | en_US
dc.subject | Local differential privacy | en_US
dc.title | LabelDP leaks privacy - a tightened correlation-aware privacy model for labeled training data | en_US
dc.type | Journal/Magazine Article | en_US
dc.identifier.spage | 491 | en_US
dc.identifier.epage | 506 | en_US
dc.identifier.volume | 23 | en_US
dc.identifier.issue | 1 | en_US
dc.identifier.doi | 10.1109/TDSC.2025.3607786 | en_US
dcterms.abstract | It is well understood that the accuracy of machine learning models heavily depends on the amount of training data collected from individuals. However, collecting such sensitive information exposes users to privacy risks. Recently, differential privacy (DP) has emerged as a rigorous privacy model for sensitive data collection. When applying DP to training data collection, a common practice to improve utility is to sanitize labels while leaving attribute values unperturbed, known as label differential privacy (LabelDP). In this paper, we point out that LabelDP can hardly guarantee the expected privacy on labels due to the correlation between attributes and labels. To address this leakage, we propose a stronger privacy model, correlation-aware label local differential privacy (CLLDP), which protects each individual user by taking the correlations between attributes and labels into account. Under CLLDP, we propose a perturbation protocol, k heads response (kHR), to estimate the joint probability distribution of attributes and labels. This distribution can serve a variety of machine learning tasks, such as Naïve Bayes and decision-tree classification, both of which are illustrated in this paper. Through extensive experiments, we demonstrate the strong privacy guarantee of CLLDP and its effectiveness in real-life machine learning tasks. | -
dcterms.accessRights | open access | en_US
dcterms.bibliographicCitation | IEEE transactions on dependable and secure computing, Jan.-Feb. 2026, v. 23, no. 1, p. 491-506 | en_US
dcterms.isPartOf | IEEE transactions on dependable and secure computing | en_US
dcterms.issued | 2026-01 | -
dc.identifier.scopus | 2-s2.0-105015393369 | -
dc.identifier.eissn | 1941-0018 | en_US
dc.description.validate | 202602 bcjz | -
dc.description.oa | Accepted Manuscript | en_US
dc.identifier.SubFormID | G000773/2025-10 | -
dc.description.fundingSource | RGC | en_US
dc.description.fundingSource | Others | en_US
dc.description.fundingText | This work was supported in part by the National Natural Science Foundation of China under Grant 62372122, Grant 92270123, and Grant 62302214, in part by the Research Grants Council under Grant 15208923 and Grant 25207224, and in part by the Innovation and Technology Fund under Grant GHP/392/22GD, Hong Kong SAR, China. | en_US
dc.description.pubStatus | Published | en_US
dc.description.oaCategory | Green (AAM) | en_US
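The abstract above centers on estimating the joint distribution of attributes and labels under local differential privacy. As an illustrative sketch only — this record does not specify the paper's kHR protocol — the following applies the standard k-ary generalized randomized response (GRR), a common LDP building block, to flattened (attribute, label) pairs and then recovers an unbiased joint-frequency estimate. The domain sizes `A` and `L`, the budget `eps`, and all variable names are assumptions for this toy example:

```python
import numpy as np

def grr_perturb(value, k, epsilon, rng):
    """k-ary (generalized) randomized response: keep the true value with
    probability p = e^eps / (e^eps + k - 1), else report a uniform other value."""
    p = np.exp(epsilon) / (np.exp(epsilon) + k - 1)
    if rng.random() < p:
        return value
    other = int(rng.integers(0, k - 1))          # uniform over the k-1 other values
    return other if other < value else other + 1

def grr_estimate(reports, k, epsilon):
    """Unbiased frequency estimate from GRR-perturbed reports."""
    n = len(reports)
    p = np.exp(epsilon) / (np.exp(epsilon) + k - 1)
    q = (1 - p) / (k - 1)
    counts = np.bincount(reports, minlength=k)
    return (counts / n - q) / (p - q)            # debias: E[count/n] = f*p + (1-f)*q

rng = np.random.default_rng(0)
A, L = 4, 2                                      # toy attribute / label domain sizes
k, eps = A * L, 2.0                              # treat each (attr, label) pair as one symbol
true = rng.integers(0, k, size=50_000)           # synthetic joint values, roughly uniform
reports = np.array([grr_perturb(v, k, eps, rng) for v in true])
joint = grr_estimate(reports, k, eps).reshape(A, L)   # estimated P(attr, label) table
```

The estimated `joint` table could then feed distribution-based learners such as Naïve Bayes (marginalize for P(label), condition for P(attr | label)), which is the kind of downstream use the abstract describes.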
Appears in Collections:Journal/Magazine Article
Files in This Item:
File | Description | Size | Format
Xue_LabelDP_Leaks_Privacy.pdf | Pre-Published version | 1.29 MB | Adobe PDF
Open Access Information
Status: open access
File Version: Final Accepted Manuscript
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.