## Abstract

In the PAC-learning model, the Vapnik-Chervonenkis (VC) dimension plays a key role in estimating the polynomial-sample learnability of a class of {0, 1}-valued functions. For classes of {0, ..., N}-valued functions, the notion has been generalized in various ways. This paper investigates the complexity of computing the VC-dimension and three generalized dimensions: the VC^{*}-dimension, the Ψ_{*}-dimension, and the Ψ_{G}-dimension. For each dimension, we consider the decision problem of determining, for a given matrix representing a class F of functions and an integer K, whether the dimension of F is greater than K. We prove that (1) the VC^{*}-dimension and Ψ_{G}-dimension problems, which include the original VC-dimension problem as a special case, are both polynomial-time reducible to the satisfiability problem of length J with O(log^{2} J) variables, (2) for every constant C, the satisfiability problem in conjunctive normal form with m clauses and C log^{2} m variables is polynomial-time reducible to the VC-dimension problem, and (3) the Ψ_{*}-dimension problem is NP-complete.
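The decision problem above can be illustrated for the basic {0, 1}-valued case. Below is a minimal brute-force sketch (not the paper's algorithm): the class F is given as a matrix whose rows are functions and whose columns are domain points, a column set S is shattered if every 0/1 pattern on S occurs in some row, and the VC-dimension is the size of the largest shattered set. The function names are illustrative, not from the paper.

```python
from itertools import combinations

def vc_dimension(F):
    """Brute-force VC-dimension of a class of {0,1}-valued functions.

    F is a list of rows; row i lists function f_i's values on the
    domain points (columns). A column set S is shattered if all
    2**|S| patterns on S appear among the rows restricted to S.
    """
    n = len(F[0]) if F else 0
    best = 0
    for k in range(1, n + 1):
        shattered = False
        for S in combinations(range(n), k):
            patterns = {tuple(row[j] for j in S) for row in F}
            if len(patterns) == 2 ** k:
                shattered = True
                break
        if shattered:
            best = k
        else:
            # If no k-set is shattered, no superset can be either.
            break
    return best

def vc_dimension_exceeds(F, K):
    """The decision problem studied in the paper: is VC-dim(F) > K?"""
    return vc_dimension(F) > K
```

Note the exponential cost in the number of columns; the paper's reductions to satisfiability with O(log^{2} J) variables explain why this problem is unlikely to be NP-hard yet also unlikely to be in P by naive enumeration alone.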

| Field | Value |
|---|---|
| Original language | English |
| Pages (from-to) | 129-144 |
| Number of pages | 16 |
| Journal | Theoretical Computer Science |
| Volume | 137 |
| Issue number | 1 |
| DOIs | |
| Publication status | Published - 1995 Jan 9 |
| Externally published | Yes |

## ASJC Scopus subject areas

- Theoretical Computer Science
- Computer Science(all)