Please use this identifier to cite or link to this item:
http://hdl.handle.net/10397/88201
DC Field | Value | Language |
---|---|---|
dc.contributor | Department of Applied Mathematics | en_US |
dc.creator | Qin, H | en_US |
dc.creator | Guo, X | en_US |
dc.date.accessioned | 2020-09-23T08:02:19Z | - |
dc.date.available | 2020-09-23T08:02:19Z | - |
dc.identifier.issn | 0219-5305 | en_US |
dc.identifier.uri | http://hdl.handle.net/10397/88201 | - |
dc.description | Title of accepted manuscript "On semi-supervised learning with summary statistics" | en_US |
dc.language.iso | en | en_US |
dc.publisher | World Scientific | en_US |
dc.rights | Electronic version of an article published as Analysis and Applications, vol. 17, no. 5, 2019, p. 837-851, https://doi.org/10.1142/S0219530519400037, © World Scientific Publishing Company, https://www.worldscientific.com/toc/aa/17/05 | en_US |
dc.subject | Distributed learning | en_US |
dc.subject | Semi-supervised learning | en_US |
dc.subject | Empirical features | en_US |
dc.subject | Summary statistics | en_US |
dc.subject | Privacy protection | en_US |
dc.title | Semi-supervised learning with summary statistics | en_US |
dc.type | Journal/Magazine Article | en_US |
dc.identifier.spage | 837 | en_US |
dc.identifier.epage | 851 | en_US |
dc.identifier.volume | 17 | en_US |
dc.identifier.issue | 5 | en_US |
dc.identifier.doi | 10.1142/S0219530519400037 | en_US |
dcterms.abstract | Nowadays, the extensive collection and analysis of data is stimulating widespread privacy concerns, and is therefore increasing tension between potential data sources and researchers. A privacy-friendly learning framework can help ease this tension and free up more data for research. We propose a new algorithm, LESS (Learning with Empirical feature-based Summary statistics from Semi-supervised data), which uses only summary statistics, rather than raw data, for regression learning. The selection of empirical features serves as a trade-off between prediction accuracy and privacy protection. We show that LESS achieves the minimax optimal rate of convergence in terms of the labeled sample size. LESS extends naturally to applications where data are held separately by different sources. Compared with the existing literature on distributed learning, LESS removes the restriction of a minimum sample size on individual data sources. | en_US |
dcterms.accessRights | open access | en_US |
dcterms.bibliographicCitation | Analysis and applications, Sept. 2019, v. 17, no. 5, p. 837-851 | en_US |
dcterms.isPartOf | Analysis and applications | en_US |
dcterms.issued | 2019-09 | - |
dc.identifier.eissn | 1793-6861 | en_US |
dc.description.validate | 202009 bcrc | en_US |
dc.description.oa | Accepted Manuscript | en_US |
dc.identifier.FolderNumber | a0481-n02 | en_US |
dc.description.pubStatus | Published | en_US |
dc.description.oaCategory | Green (AAM) | en_US |
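The abstract above describes learning a regression function from summary statistics over empirical features rather than from raw data. The sketch below is not the paper's LESS algorithm; it only illustrates the general idea under simple assumptions (a fixed, publicly shared set of Gaussian feature anchors, and ridge regression as the learner): each source transmits the aggregates Φᵀ Φ and Φᵀ y instead of its samples, and the aggregator solves the normal equations from those summaries alone.

```python
import numpy as np

# Hedged illustration (NOT the paper's LESS algorithm): ridge regression
# over a fixed empirical feature map, fitted from summary statistics only.
# Each source shares S_k = Phi_k^T Phi_k and t_k = Phi_k^T y_k,
# never its raw samples (x, y).

rng = np.random.default_rng(0)

def features(x, centers, width=0.2):
    # Empirical (Gaussian) features anchored at shared, public centers.
    return np.exp(-(x[:, None] - centers[None, :]) ** 2 / (2 * width ** 2))

centers = np.linspace(0.0, 1.0, 8)   # shared feature anchors (assumption)
lam = 1e-3                           # ridge regularization strength

# Two sources of very different sizes; the small one poses no problem
# because only the pooled summaries matter, not per-source sample counts.
S_total = np.zeros((len(centers), len(centers)))
t_total = np.zeros(len(centers))
n_total = 0
for n_k in (5, 40):
    x = rng.uniform(0, 1, n_k)
    y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(n_k)
    Phi = features(x, centers)
    S_total += Phi.T @ Phi           # summary statistic 1
    t_total += Phi.T @ y             # summary statistic 2
    n_total += n_k

# The aggregator sees only (S_total, t_total, n_total).
w = np.linalg.solve(S_total + lam * n_total * np.eye(len(centers)), t_total)

x_test = np.linspace(0, 1, 200)
pred = features(x_test, centers) @ w
err = np.mean((pred - np.sin(2 * np.pi * x_test)) ** 2)
print(f"mean squared error on the sine target: {err:.4f}")
```

Because ΦᵀΦ and Φᵀy are additive across sources, the pooled fit is identical to what a centralized learner would compute on the union of the raw data, which is why no minimum per-source sample size is needed in this toy setting.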
Appears in Collections: | Journal/Magazine Article |
Files in This Item:
File | Description | Size | Format | |
---|---|---|---|---|
QinGuoRev1.pdf | Pre-Published version | 731.19 kB | Adobe PDF | View/Open |
Page views: 205 (last week: 2; last month: 2), as of Mar 31, 2025
Downloads: 67, as of Mar 31, 2025
SCOPUS™ citations: 2, as of Jul 4, 2024
Web of Science™ citations: 2, as of Oct 10, 2024
Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.