Abstract
Many real-world data mining problems contain hidden information (e.g., unobservable latent dependencies). We propose a perceptron-style method, the latent structured perceptron, for fast discriminative learning of structured classification with hidden information. We also give a theoretical analysis and demonstrate the good convergence properties of the proposed method. Our method extends the perceptron algorithm to learning tasks with hidden information, which can hardly be captured by traditional models. It relies on Viterbi decoding over latent variables, combined with simple additive updates. We perform experiments on one synthetic data set and two real-world structured classification tasks. Compared to conventional nonlatent models (e.g., conditional random fields, structured perceptrons), our method is more accurate on real-world tasks. Compared to existing heavyweight probabilistic latent-variable models (e.g., latent conditional random fields), our method lowers the training cost significantly (by almost an order of magnitude) yet achieves comparable or even superior classification accuracy. In addition, experiments demonstrate that the proposed method scales well to large-scale problems.
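The abstract's one-line description of the algorithm (decoding over latent variables combined with simple additive perceptron updates) can be made concrete with a short sketch. The code below is an illustrative reading of that description, not the authors' implementation: `feature_fn`, `labels`, and `latents` are hypothetical stand-ins, and an exhaustive argmax substitutes for the Viterbi decoding the paper uses over structured outputs.

```python
from collections import defaultdict
from itertools import product

def score(w, feats):
    """Dot product between the weight vector and a sparse feature dict."""
    return sum(w[k] * v for k, v in feats.items())

def train_latent_perceptron(data, feature_fn, labels, latents, epochs=10):
    """Perceptron-style additive updates over (label, latent) structures.

    data       : list of (x, y) pairs with observed labels y
    feature_fn : maps (x, y, h) to a sparse feature dict {name: value}
    labels     : candidate output structures
    latents    : candidate latent assignments (hidden information)
    """
    w = defaultdict(float)
    for _ in range(epochs):
        for x, y in data:
            # Decode: best (label, latent) pair under current weights.
            # (Exhaustive search here; the paper performs Viterbi
            # decoding over latent variables instead.)
            y_hat, h_hat = max(product(labels, latents),
                               key=lambda yh: score(w, feature_fn(x, *yh)))
            # Best latent assignment consistent with the gold label.
            h_star = max(latents,
                         key=lambda h: score(w, feature_fn(x, y, h)))
            if y_hat != y:
                # Simple additive update toward the gold structure.
                for k, v in feature_fn(x, y, h_star).items():
                    w[k] += v
                for k, v in feature_fn(x, y_hat, h_hat).items():
                    w[k] -= v
    return w

# Toy usage (hypothetical): binary label, binary latent "mode" per example.
data = [((1.0, 2.0), 0), ((3.0, 0.5), 1)]
def feats(x, y, h):
    return {(i, y, h): v for i, v in enumerate(x)}
w = train_latent_perceptron(data, feats, labels=[0, 1], latents=[0, 1])
```

The key design point suggested by the abstract is that no probabilities are computed: unlike latent CRFs, which marginalize over latent variables, this update only needs two decoding calls per example, which is where the reported order-of-magnitude training speedup would come from.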
| Original language | English |
| --- | --- |
| Article number | 6226404 |
| Pages (from-to) | 2063-2075 |
| Number of pages | 13 |
| Journal | IEEE Transactions on Knowledge and Data Engineering |
| Volume | 25 |
| Issue number | 9 |
| DOIs | |
| Publication status | Published - 8 Aug 2013 |
Keywords
- Convergence analysis
- Hidden information
- Large-scale learning
- Latent variable
- Structured perceptron
ASJC Scopus subject areas
- Information Systems
- Computer Science Applications
- Computational Theory and Mathematics