Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/95579
DC Field | Value | Language |
---|---|---|
dc.contributor | Department of Electronic and Information Engineering | en_US |
dc.creator | Zheng, H | en_US |
dc.creator | Hu, H | en_US |
dc.creator | Han, Z | en_US |
dc.date.accessioned | 2022-09-22T06:13:57Z | - |
dc.date.available | 2022-09-22T06:13:57Z | - |
dc.identifier.issn | 1541-1672 | en_US |
dc.identifier.uri | http://hdl.handle.net/10397/95579 | - |
dc.language.iso | en | en_US |
dc.publisher | Institute of Electrical and Electronics Engineers | en_US |
dc.rights | © 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. | en_US |
dc.rights | The following publication H. Zheng, H. Hu and Z. Han, "Preserving User Privacy for Machine Learning: Local Differential Privacy or Federated Machine Learning?," in IEEE Intelligent Systems, vol. 35, no. 4, pp. 5-14, 1 July-Aug. 2020 is available at https://doi.org/10.1109/MIS.2020.3010335. | en_US |
dc.subject | Federated machine learning | en_US |
dc.subject | Local differential privacy | en_US |
dc.title | Preserving user privacy for machine learning: local differential privacy or federated machine learning? | en_US |
dc.type | Journal/Magazine Article | en_US |
dc.identifier.spage | 5 | en_US |
dc.identifier.epage | 14 | en_US |
dc.identifier.volume | 35 | en_US |
dc.identifier.issue | 4 | en_US |
dc.identifier.doi | 10.1109/MIS.2020.3010335 | en_US |
dcterms.abstract | The growing number of mobile and IoT devices has nourished many intelligent applications. In order to produce high-quality machine learning models, they constantly access and collect rich personal data such as photos, browsing history, and text messages. However, direct access to personal data has raised increasing public concerns about privacy risks and security breaches. To address these concerns, there are two emerging solutions to privacy-preserving machine learning, namely local differential privacy and federated machine learning. The former is a distributed data collection strategy where each client perturbs data locally before submitting to the server, whereas the latter is a distributed machine learning strategy to train models on mobile devices locally and merge their output (e.g., parameter updates of a model) through a control protocol. In this article, we conduct a comparative study on the efficiency and privacy of both solutions. Our results show that in a standard population and domain setting, both can achieve an optimal misclassification rate lower than 20% and federated machine learning generally performs better at the cost of higher client CPU usage. Nonetheless, local differential privacy can benefit more from a larger client population (> 1k). As for privacy guarantee, local differential privacy also has flexible control over the data leakage. | en_US |
dcterms.accessRights | open access | en_US |
dcterms.bibliographicCitation | IEEE intelligent systems, July-Aug. 2020, v. 35, no. 4, 9144394, p. 5-14 | en_US |
dcterms.isPartOf | IEEE intelligent systems | en_US |
dcterms.issued | 2020-07 | - |
dc.identifier.scopus | 2-s2.0-85089291888 | - |
dc.identifier.eissn | 1941-1294 | en_US |
dc.identifier.artn | 9144394 | en_US |
dc.description.validate | 202209_bcww | en_US |
dc.description.oa | Accepted Manuscript | en_US |
dc.identifier.FolderNumber | EIE-0187 | - |
dc.description.fundingSource | RGC | en_US |
dc.description.pubStatus | Published | en_US |
dc.identifier.OPUS | 27668326 | - |
dc.description.oaCategory | Green (AAM) | en_US |
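The abstract above describes local differential privacy as each client perturbing its data before submission, so the server only ever sees noisy reports. As a minimal illustration of that idea (not the authors' actual mechanism, which is detailed in the paper itself), the classic randomized-response scheme for a single private bit can be sketched as follows; the function names and the epsilon parameterization here are generic textbook choices, not taken from the article:

```python
import math
import random

def randomized_response(bit: int, epsilon: float) -> int:
    """Report a private bit under epsilon-local differential privacy.

    With probability e^eps / (e^eps + 1) the true bit is sent;
    otherwise it is flipped, so the server cannot learn the true
    value with certainty from any single report.
    """
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return bit if random.random() < p_truth else 1 - bit

def estimate_mean(reports: list, epsilon: float) -> float:
    """Server-side unbiased estimate of the true mean of the bits.

    Inverts E[report] = p * mean + (1 - p) * (1 - mean),
    where p is the truth-telling probability.
    """
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    observed = sum(reports) / len(reports)
    return (observed + p - 1.0) / (2.0 * p - 1.0)
```

This also hints at why the abstract notes that local differential privacy benefits from a larger client population (> 1k): the debiased estimate only becomes accurate once the flipping noise averages out over many reports.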
Appears in Collections: | Journal/Magazine Article |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
Zheng_Preserving_User_Privacy.pdf | Pre-Published version | 1.54 MB | Adobe PDF | View/Open |
Page views: 61 (last week: 0, last month: 0), as of Oct 13, 2024
Downloads: 344, as of Oct 13, 2024
SCOPUS™ citations: 62, as of Oct 17, 2024
Web of Science™ citations: 48, as of Oct 10, 2024
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.