Title: Adaptive thresholding for multi-label SVM classification with application to protein subcellular localization prediction
Authors: Wan, SB
Mak, MW 
Kung, SY
Keywords: Adaptive thresholding
Gene Ontology
Multi-label SVM
Multi-label classification
Protein subcellular localization
Issue Date: 2013
Publisher: IEEE
Source: 2013 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 26-31 May 2013, Vancouver, BC, p. 3547-3551
Abstract: Multi-label classification has received increasing attention in computational proteomics, especially in protein subcellular localization. Many existing multi-label protein predictors suffer from over-prediction because they use a fixed decision threshold to determine the number of labels to which a query protein should be assigned. To address this problem, this paper proposes an adaptive thresholding scheme for multi-label support vector machine (SVM) classifiers. Specifically, each one-vs-rest SVM has an adaptive threshold that is a fraction of the maximum score of the one-vs-rest SVMs in the classifier. Therefore, the number of class labels of the query protein depends on the confidence of the SVMs in the classification. This scheme is integrated into our recently proposed subcellular localization predictor that uses the frequency of occurrences of gene-ontology terms as feature vectors and one-vs-rest SVMs as classifiers. Experimental results on two recent datasets suggest that the scheme can effectively avoid both over-prediction and under-prediction, resulting in performance significantly better than other gene-ontology based subcellular localization predictors.
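The adaptive thresholding idea in the abstract can be sketched as follows: each one-vs-rest SVM produces a score for the query protein, the threshold is a fraction of the maximum score, and every class whose score reaches that threshold is assigned. This is a minimal illustrative sketch, not the paper's implementation: the fraction value (0.5 here), the function name, and the guard that always keeps the top-scoring class (needed when all scores are negative) are assumptions.

```python
import numpy as np

def adaptive_threshold_predict(svm_scores, fraction=0.5):
    """Multi-label decision via an adaptive threshold.

    Assigns every class whose one-vs-rest SVM score is at least
    `fraction` of the maximum score, so the number of predicted
    labels grows or shrinks with the classifier's confidence
    instead of being fixed in advance.
    """
    scores = np.asarray(svm_scores, dtype=float)
    top = int(scores.argmax())          # highest-scoring class
    threshold = fraction * scores[top]  # adaptive, not fixed
    labels = set(np.flatnonzero(scores >= threshold).tolist())
    labels.add(top)  # assumption: at least one label is always assigned
    return sorted(labels)

# A confident single-location protein yields one label; a protein
# with several comparably high scores yields several labels.
print(adaptive_threshold_predict([1.5, 0.1, 0.2]))   # one dominant score
print(adaptive_threshold_predict([2.0, 1.2, 0.3]))   # two strong scores
```

Because the threshold tracks the maximum score, a query with one dominant SVM score gets a single label (avoiding over-prediction), while a query with several comparable scores gets multiple labels (avoiding under-prediction).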
ISSN: 1520-6149
DOI: 10.1109/ICASSP.2013.6638318
Appears in Collections:Conference Paper
