Abstract
Archetypal analysis (AA) is an unsupervised learning method for exploratory data analysis. One major challenge that limits the applicability of AA in practice is the inherent computational complexity of the existing algorithms. In this paper, we provide a novel approximation approach to partially address this issue. Utilizing probabilistic ideas from high-dimensional geometry, we introduce two preprocessing techniques to reduce the dimension and representation cardinality of the data, respectively. We prove that, provided the data are approximately embedded in a low-dimensional linear subspace and the convex hull of the corresponding representations is well approximated by a polytope with a few vertices, our method can effectively reduce the scaling of AA. Moreover, the solution of the reduced problem is near-optimal in terms of prediction errors. Our approach can be combined with other acceleration techniques to further mitigate the intrinsic complexity of AA. We demonstrate the usefulness of our results by applying our method to summarize several moderately large-scale datasets.
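The two preprocessing steps described in the abstract can be illustrated with a short sketch. This is not the paper's algorithm: a plain Gaussian random projection stands in for the randomized SVD, and an exact convex hull (via Qhull) stands in for the approximate polytope with few vertices; all names, sizes, and the synthetic data are illustrative assumptions.

```python
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(0)

# Synthetic data: n points lying near a d-dimensional subspace of R^D,
# matching the paper's low-dimensional embedding assumption.
n, D, d = 2000, 50, 2
basis = rng.standard_normal((d, D))
X = rng.uniform(size=(n, d)) @ basis + 0.01 * rng.standard_normal((n, D))

# Step 1 (dimension reduction): a Gaussian random projection to k dimensions,
# a hedged stand-in for the randomized SVD used in the paper.
k = 3
P = rng.standard_normal((D, k)) / np.sqrt(k)
Y = X @ P

# Step 2 (cardinality reduction): keep only the vertices of the convex hull
# of the projected points; AA would then be fit on this much smaller set
# instead of all n points. The paper uses an approximate hull; here we
# compute the exact one for simplicity.
hull = ConvexHull(Y)
reduced = Y[hull.vertices]

print(X.shape, Y.shape, reduced.shape)
```

Since archetypes are extreme points, they lie on (or near) the convex hull of the projected data, which is why discarding interior points loses little information while shrinking the problem size substantially.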
| Original language | English |
|---|---|
| Pages (from-to) | 466-493 |
| Number of pages | 28 |
| Journal | Information and Inference: A Journal of the IMA |
| Volume | 12 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - 12 May 2022 |
| Externally published | Yes |
Keywords
- alternating minimization
- approximate convex hulls
- archetypal analysis
- dimensionality reduction
- random projections
- randomized SVD