Feature-frequency-adaptive on-line training for fast and accurate natural language processing

Xu Sun, Wenjie Li, Houfeng Wang, Qin Lu

Research output: Journal article › Academic research › peer-review

12 Citations (Scopus)


Training speed and accuracy are two major concerns of large-scale natural language processing systems. Typically, we need to make a tradeoff between speed and accuracy. It is trivial to improve the training speed by sacrificing accuracy, or to improve the accuracy by sacrificing speed. Nevertheless, it is nontrivial to improve the training speed and the accuracy at the same time, which is the target of this work. To reach this target, we present a new training method, feature-frequency-adaptive on-line training, for fast and accurate training of natural language processing systems. It is based on the core idea that higher-frequency features should have a learning rate that decays faster. Theoretical analysis shows that the proposed method is convergent with a fast convergence rate. Experiments are conducted on well-known benchmark tasks, including named entity recognition, word segmentation, phrase chunking, and sentiment analysis. These tasks consist of three structured classification tasks and one non-structured classification task, with binary features and real-valued features, respectively. Experimental results demonstrate that the proposed method is faster and at the same time more accurate than existing methods, achieving state-of-the-art scores on tasks with different characteristics.
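The core idea — that a feature's learning rate should decay faster the more often the feature is observed — can be illustrated with a minimal sketch. The function below is a hypothetical per-feature SGD update, not the paper's exact update rule: each feature keeps an observation count, and its effective learning rate shrinks geometrically with that count, so frequent features stabilize quickly while rare features keep learning at a higher rate.

```python
import numpy as np

def freq_adaptive_update(w, grad, counts, eta0=0.1, alpha=0.9):
    """One sparse SGD step with frequency-decayed per-feature learning rates.

    Illustrative sketch only (eta0, alpha, and the geometric decay
    schedule are assumptions, not the authors' exact formulation).
    """
    for j in np.nonzero(grad)[0]:
        # Learning rate decays with how often feature j has fired.
        eta_j = eta0 * (alpha ** counts[j])
        w[j] -= eta_j * grad[j]
        counts[j] += 1
    return w, counts

# Usage: after two updates on the same feature, its step size has decayed.
w = np.zeros(3)
counts = np.zeros(3, dtype=int)
grad = np.array([1.0, 0.0, 2.0])
w, counts = freq_adaptive_update(w, grad, counts)   # w[0] steps by -0.1
w, counts = freq_adaptive_update(w, grad, counts)   # w[0] steps by -0.09
```

Because the decay is tracked per feature rather than globally, sparse rare features are not starved by a schedule driven mostly by frequent features — the intuition behind the method's combined speed and accuracy gains.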
Original language: English
Pages (from-to): 563-586
Number of pages: 24
Journal: Computational Linguistics
Issue number: 3
Publication status: Published - 1 Jan 2014

ASJC Scopus subject areas

  • Language and Linguistics
  • Linguistics and Language
  • Computer Science Applications
  • Artificial Intelligence

