Latent structured perceptrons for large-scale learning with hidden information

Xu Sun, Takuya Matsuzaki, Wenjie Li

Research output: Journal article › Academic research › peer-review

6 Citations (Scopus)

Abstract

Many real-world data mining problems contain hidden information (e.g., unobservable latent dependencies). We propose a perceptron-style method, the latent structured perceptron, for fast discriminative learning of structured classification with hidden information. We also give a theoretical analysis and demonstrate good convergence properties of the proposed method. Our method extends the perceptron algorithm to learning tasks with hidden information, which can hardly be captured by traditional models. It relies on Viterbi decoding over latent variables, combined with simple additive updates. We perform experiments on one synthetic data set and two real-world structured classification tasks. Compared to conventional nonlatent models (e.g., conditional random fields, structured perceptrons), our method is more accurate on the real-world tasks. Compared to existing heavier probabilistic latent-variable models (e.g., latent conditional random fields), our method lowers the training cost significantly (almost an order of magnitude faster) while achieving comparable or even superior classification accuracy. In addition, the experiments demonstrate that the proposed method scales well to large problems.
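The abstract describes the learning rule only in words (Viterbi decoding over latent variables plus simple additive updates). The sketch below is one plausible, minimal reading of a latent perceptron update, not the authors' implementation: the feature map `phi`, the toy data, and the function `argmax_yh` are all illustrative assumptions, and the argmax over labels and latent assignments is done by brute-force enumeration instead of Viterbi decoding for readability.

```python
# Illustrative sketch of a latent perceptron update (assumed, simplified form;
# not the paper's code). Exhaustive search stands in for Viterbi decoding.
from collections import defaultdict
from itertools import product


def phi(x, y, h):
    """Toy sparse feature map over (token, label, latent) indicators."""
    feats = defaultdict(float)
    for tok, lab, lat in zip(x, y, h):
        feats[(tok, lab, lat)] += 1.0
        feats[(lab, lat)] += 1.0
    return feats


def score(w, feats):
    return sum(w[f] * v for f, v in feats.items())


def argmax_yh(w, x, labels, latents, gold_y=None):
    """Best (y, h) under the current weights; if gold_y is given, search only
    over latent assignments consistent with the gold label sequence."""
    best, best_s = None, float("-inf")
    y_space = [tuple(gold_y)] if gold_y is not None else list(product(labels, repeat=len(x)))
    for y in y_space:
        for h in product(latents, repeat=len(x)):
            s = score(w, phi(x, y, h))
            if s > best_s:
                best, best_s = (y, h), s
    return best


def latent_perceptron(data, labels, latents, epochs=5):
    w = defaultdict(float)
    for _ in range(epochs):
        for x, y in data:
            # Best latent assignment for the gold label sequence.
            _, h_gold = argmax_yh(w, x, labels, latents, gold_y=y)
            # Best jointly predicted (label, latent) sequence.
            y_hat, h_hat = argmax_yh(w, x, labels, latents)
            if y_hat != tuple(y):
                # Simple additive update: toward gold, away from the prediction.
                for f, v in phi(x, y, h_gold).items():
                    w[f] += v
                for f, v in phi(x, y_hat, h_hat).items():
                    w[f] -= v
    return w


if __name__ == "__main__":
    train = [(("a", "b"), ("X", "Y")), (("b", "a"), ("Y", "X"))]
    w = latent_perceptron(train, labels=("X", "Y"), latents=(0, 1))
    print(argmax_yh(w, ("a", "b"), ("X", "Y"), (0, 1))[0])
```

In the actual method, the two argmax steps would be computed by Viterbi decoding over the latent-augmented label space, which is what keeps training fast at scale.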
Original language: English
Article number: 6226404
Pages (from-to): 2063-2075
Number of pages: 13
Journal: IEEE Transactions on Knowledge and Data Engineering
Volume: 25
Issue number: 9
DOIs
Publication status: Published - 8 Aug 2013

Keywords

  • Convergence analysis
  • Hidden information
  • Large-scale learning
  • Latent variable
  • Structured perceptron

ASJC Scopus subject areas

  • Information Systems
  • Computer Science Applications
  • Computational Theory and Mathematics
