Abstract
Motivated by a statistical learning theoretic treatment of principal component analysis, we are concerned with the set of points in ℝ^d that lie within a certain distance of a k-dimensional affine subspace. We prove that the VC dimension of the class of such sets is within a constant factor of (k+1)(d-k+1), and then discuss the distribution of eigenvalues of a data covariance matrix, using our bounds on the VC dimension together with Vapnik's statistical learning theory. In the course of the proof of the upper bound, we give a simple proof of Warren's bound on the number of sign sequences of real polynomials.
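The sets studied here are "tubes" of radius r around k-dimensional affine subspaces of ℝ^d. As a minimal sketch of the membership test for one such set (the function name and parameterization below are illustrative, not taken from the paper): a point x belongs to the set iff the norm of its residual after orthogonal projection onto the subspace is at most r.

```python
import numpy as np

def within_distance(x, p, B, r):
    """Test whether point x lies within distance r of the k-dimensional
    affine subspace {p + B t : t in R^k}, where B is a d x k matrix
    whose columns are an orthonormal basis of the subspace's direction.
    """
    v = x - p
    # Orthogonal projection of v onto the direction space of the subspace.
    proj = B @ (B.T @ v)
    # The distance from x to the subspace is the norm of the residual.
    return np.linalg.norm(v - proj) <= r

# Example: the radius-1 tube around a line (k = 1) in R^3 (d = 3).
p = np.zeros(3)                      # a point on the line
B = np.array([[1.0], [0.0], [0.0]])  # orthonormal direction vector
print(within_distance(np.array([0.5, 0.9, 0.0]), p, B, 1.0))  # True
print(within_distance(np.array([0.0, 2.0, 0.0]), p, B, 1.0))  # False
```

The class whose VC dimension the paper bounds is obtained by ranging over all choices of the subspace and the radius; for fixed k and d this gives roughly (k+1)(d-k+1) degrees of freedom, matching the order of the bound.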
| Original language | English |
| --- | --- |
| Pages (from-to) | 589-598 |
| Number of pages | 10 |
| Journal | Discrete and Computational Geometry |
| Volume | 44 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - 2010 |
Keywords
- Principal component analysis
- VC dimensions
- Warren's bound
ASJC Scopus subject areas
- Theoretical Computer Science
- Geometry and Topology
- Discrete Mathematics and Combinatorics
- Computational Theory and Mathematics