TY - GEN
T1 - Complexity of computing generalized VC-dimensions
AU - Shinohara, Ayumi
N1 - Publisher Copyright:
© Springer-Verlag Berlin Heidelberg 1994.
PY - 1994
Y1 - 1994
N2 - In the PAC-learning model, the Vapnik-Chervonenkis (VC) dimension plays a key role in estimating the polynomial-sample learnability of a class of binary functions. For a class of {0,…, m}-valued functions, the notion has been generalized in various ways. This paper investigates the complexity of computing some of these generalized VC-dimensions: the VC*-dimension, Ψ*-dimension, and ΨG-dimension. For each dimension, we consider the decision problem of determining, for a given matrix representing a class F of functions and an integer K, whether the dimension of F is greater than K. We prove that the VC*-dimension problem is polynomial-time reducible to the satisfiability problem of length J with O(log² J) variables, which includes the original VC-dimension problem as a special case. We also show that the ΨG-dimension problem is still reducible to the satisfiability problem of length J with O(log² J) variables, while the Ψ*-dimension problem becomes NP-complete.
AB - In the PAC-learning model, the Vapnik-Chervonenkis (VC) dimension plays a key role in estimating the polynomial-sample learnability of a class of binary functions. For a class of {0,…, m}-valued functions, the notion has been generalized in various ways. This paper investigates the complexity of computing some of these generalized VC-dimensions: the VC*-dimension, Ψ*-dimension, and ΨG-dimension. For each dimension, we consider the decision problem of determining, for a given matrix representing a class F of functions and an integer K, whether the dimension of F is greater than K. We prove that the VC*-dimension problem is polynomial-time reducible to the satisfiability problem of length J with O(log² J) variables, which includes the original VC-dimension problem as a special case. We also show that the ΨG-dimension problem is still reducible to the satisfiability problem of length J with O(log² J) variables, while the Ψ*-dimension problem becomes NP-complete.
UR - http://www.scopus.com/inward/record.url?scp=0347816633&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=0347816633&partnerID=8YFLogxK
U2 - 10.1007/3-540-57868-4_87
DO - 10.1007/3-540-57868-4_87
M3 - Conference contribution
AN - SCOPUS:0347816633
SN - 9783540578680
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 415
EP - 418
BT - Machine Learning
A2 - Bergadano, Francesco
A2 - De Raedt, Luc
PB - Springer Verlag
T2 - European Conference on Machine Learning, ECML 1994
Y2 - 6 April 1994 through 8 April 1994
ER -