Complexity of computing generalized VC-dimensions

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)


In the PAC-learning model, the Vapnik-Chervonenkis (VC) dimension plays a key role in estimating the polynomial-sample learnability of a class of binary functions. For classes of {0,…, m}-valued functions, the notion has been generalized in various ways. This paper investigates the complexity of computing some of these generalized VC-dimensions: the VC*-dimension, the Ψ*-dimension, and the ΨG-dimension. For each dimension, we consider the decision problem of determining, for a given matrix representing a class F of functions and an integer K, whether the dimension of F is greater than K. We prove that the VC*-dimension problem is polynomial-time reducible to the satisfiability problem of length J with O(log² J) variables, which includes the original VC-dimension problem as a special case. We also show that the ΨG-dimension problem remains reducible to the satisfiability problem of length J with O(log² J) variables, while the Ψ*-dimension problem becomes NP-complete.
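To make the decision problem concrete, the following is a minimal brute-force sketch (not the paper's SAT reduction) for the ordinary VC-dimension case: the class F is given as a 0/1 matrix whose rows are functions and whose columns are domain points, and we ask whether VC-dim(F) > K by searching for a set of K+1 points that F shatters. Function names here are illustrative, not from the paper.

```python
from itertools import combinations

def shatters(F, cols):
    """Check whether the class F (rows of a 0/1 matrix) shatters the
    points indexed by cols, i.e. realizes all 2^|cols| patterns there."""
    patterns = {tuple(row[c] for c in cols) for row in F}
    return len(patterns) == 2 ** len(cols)

def vc_dimension_exceeds(F, K):
    """Decision problem: is VC-dim(F) > K?  Brute force over all
    (K+1)-subsets of the n domain points (columns of the matrix)."""
    n = len(F[0])
    return any(shatters(F, cols) for cols in combinations(range(n), K + 1))

# All four {0,1}-valued functions on 2 points shatter both points,
# so this class has VC-dimension exactly 2.
F = [(0, 0), (0, 1), (1, 0), (1, 1)]
```

This naive search inspects on the order of n^(K+1)·|F| entries; the paper's contribution is that the problem instead reduces in polynomial time to a satisfiability instance of length J with only O(log² J) variables.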

Original language: English
Title of host publication: Machine Learning
Subtitle of host publication: ECML 1994 - European Conference on Machine Learning, Proceedings
Editors: Francesco Bergadano, Luc De Raedt
Publisher: Springer Verlag
Number of pages: 4
ISBN (Print): 9783540578680
Publication status: Published - 1994
Externally published: Yes
Event: European Conference on Machine Learning, ECML 1994 - Catania, Italy
Duration: 1994 Apr 6 - 1994 Apr 8

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 784 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Other: European Conference on Machine Learning, ECML 1994

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science(all)

