Probabilistic Methods for Approximate Archetypal Analysis

Ruijian Han, Braxton Osting, Dong Wang, Yiming Xu

Research output: Journal article publication › Journal article › Academic research › peer-review

Abstract

Archetypal analysis (AA) is an unsupervised learning method for exploratory data analysis. One major challenge that limits the applicability of AA in practice is the inherent computational complexity of the existing algorithms. In this paper, we provide a novel approximation approach to partially address this issue. Utilizing probabilistic ideas from high-dimensional geometry, we introduce two preprocessing techniques to reduce the dimension and the representation cardinality of the data, respectively. We prove that, provided the data are approximately embedded in a low-dimensional linear subspace and the convex hull of the corresponding representations is well approximated by a polytope with a few vertices, our method can effectively reduce the scaling of AA. Moreover, the solution of the reduced problem is near-optimal in terms of prediction errors. Our approach can be combined with other acceleration techniques to further mitigate the intrinsic complexity of AA. We demonstrate the usefulness of our results by applying our method to summarize several moderately large-scale datasets.
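
To make the described pipeline concrete, below is a minimal Python sketch of the kind of approximation scheme the abstract outlines: randomized SVD for dimension reduction, extreme points along random directions as an approximate convex hull to shrink the representation cardinality, and a basic alternating-minimization archetypal analysis on the reduced data. This is not the authors' implementation; the function names, the random-direction hull heuristic, the projected-gradient solver, and all parameter values are illustrative assumptions.

import numpy as np
from sklearn.utils.extmath import randomized_svd


def reduce_dimension(X, rank):
    """Step 1 (assumed): rank-dimensional coordinates of the rows of X via randomized SVD."""
    U, s, Vt = randomized_svd(X, n_components=rank, random_state=0)
    return U * s  # n_samples x rank


def approximate_hull_vertices(Y, n_directions=200, seed=0):
    """Step 2 (assumed): indices of points extreme along random directions,
    a cheap proxy for the vertices of the convex hull of the rows of Y."""
    rng = np.random.default_rng(seed)
    directions = rng.standard_normal((n_directions, Y.shape[1]))
    return np.unique(np.argmax(Y @ directions.T, axis=0))


def project_to_simplex(v):
    """Euclidean projection of a vector onto the probability simplex."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u) - 1.0
    rho = np.nonzero(u * (np.arange(len(v)) + 1) > css)[0][-1]
    return np.maximum(v - css[rho] / (rho + 1.0), 0.0)


def archetypal_analysis(Y, k, n_iter=300, seed=0):
    """Step 3 (assumed): alternating projected-gradient descent for
    min ||Y - A @ B @ Y||_F^2 with the rows of A and B on the simplex."""
    rng = np.random.default_rng(seed)
    n = Y.shape[0]
    A = np.full((n, k), 1.0 / k)            # convex weights of archetypes for each point
    B = rng.dirichlet(np.ones(n), size=k)   # archetypes as convex combinations of data points
    for _ in range(n_iter):
        Z = B @ Y                            # current archetypes, k x dim
        step_A = 1.0 / (2.0 * np.linalg.norm(Z, 2) ** 2 + 1e-12)
        A -= step_A * 2.0 * (A @ Z - Y) @ Z.T
        A = np.apply_along_axis(project_to_simplex, 1, A)
        step_B = 1.0 / (2.0 * (np.linalg.norm(A, 2) * np.linalg.norm(Y, 2)) ** 2 + 1e-12)
        B -= step_B * 2.0 * A.T @ (A @ (B @ Y) - Y) @ Y.T
        B = np.apply_along_axis(project_to_simplex, 1, B)
    return A, B


if __name__ == "__main__":
    # Synthetic data that is approximately embedded in a low-dimensional subspace.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((5000, 20)) @ rng.standard_normal((20, 400))
    X += 0.01 * rng.standard_normal(X.shape)
    Y = reduce_dimension(X, rank=20)                  # dimension reduction
    idx = approximate_hull_vertices(Y)                # representation cardinality reduction
    A_sub, B_sub = archetypal_analysis(Y[idx], k=5)   # AA on the reduced problem
    archetypes = B_sub @ Y[idx]                       # k archetypes in reduced coordinates
    print("kept", len(idx), "of", X.shape[0], "points; archetypes shape:", archetypes.shape)

In this sketch, the convex hull of the reduced representations is approximated by the points that maximize random linear functionals, and the archetypal analysis is then fit only on those candidate vertices, which is the sense in which the reduced problem is smaller than the original one.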
Original language: English
Pages (from-to): 466-493
Number of pages: 28
Journal: Information and Inference: A Journal of the IMA
Volume: 12
Issue number: 1
DOIs
Publication status: Published - 12 May 2022
Externally published: Yes

Keywords

  • alternating minimization
  • approximate convex hulls
  • archetypal analysis
  • dimensionality reduction
  • random projections
  • randomized SVD
