Title: Feature-Frequency-Adaptive On-line Training for Fast and Accurate Natural Language Processing
Authors: Sun, X
Li, WJ 
Wang, HF
Lu, Q 
Issue Date: 2014
Publisher: MIT Press
Source: Computational Linguistics, 2014, v. 40, no. 3, p. 563-586
Journal: Computational Linguistics
Abstract: Training speed and accuracy are two major concerns of large-scale natural language processing systems. Typically, we need to make a tradeoff between speed and accuracy. It is trivial to improve the training speed by sacrificing accuracy, or to improve the accuracy by sacrificing speed. Nevertheless, it is nontrivial to improve the training speed and the accuracy at the same time, which is the target of this work. To reach this target, we present a new training method, feature-frequency-adaptive on-line training, for fast and accurate training of natural language processing systems. It is based on the core idea that higher-frequency features should have a learning rate that decays faster. Theoretical analysis shows that the proposed method is convergent with a fast convergence rate. Experiments are conducted on well-known benchmark tasks, including named entity recognition, word segmentation, phrase chunking, and sentiment analysis. These tasks comprise three structured classification tasks and one non-structured classification task, with binary features and real-valued features, respectively. Experimental results demonstrate that the proposed method is faster and at the same time more accurate than existing methods, achieving state-of-the-art scores on tasks with different characteristics.
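The abstract's core idea — features that occur more often get a learning rate that decays faster — can be sketched as follows. This is an illustrative stand-in, not the paper's exact update rule: the class name and the simple 1/(1 + count) per-feature schedule are assumptions made for the example.

```python
from collections import defaultdict

class FrequencyAdaptiveSGD:
    """Online linear learner with per-feature decaying learning rates.

    Each feature keeps its own update count; the effective rate
    eta0 / (1 + count) therefore decays faster for frequent features.
    (Hypothetical schedule, for illustration only.)
    """

    def __init__(self, eta0=0.1):
        self.eta0 = eta0
        self.w = defaultdict(float)    # feature weights
        self.count = defaultdict(int)  # per-feature update counts

    def predict(self, features):
        # features: dict mapping feature name -> value
        return sum(self.w[k] * v for k, v in features.items())

    def update(self, features, gradient):
        # gradient: scalar gradient of the loss w.r.t. the score
        for k, v in features.items():
            self.count[k] += 1
            # Frequent features have larger counts, hence smaller steps.
            eta_k = self.eta0 / (1.0 + self.count[k])
            self.w[k] -= eta_k * gradient * v
```

A global decaying rate would shrink steps for rare features just as quickly as for common ones; keying the decay to each feature's own frequency lets rarely seen features keep learning at a high rate while heavily updated ones stabilize.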
ISSN: 0891-2017 (print)
1530-9312 (online)
DOI: 10.1162/COLI_a_00193
Appears in Collections:Journal/Magazine Article

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.