The Vapnik-Chervonenkis (VC) dimension is known to be a crucial measure of polynomial-sample learnability in the PAC-learning model. This paper investigates the complexity of computing the VC-dimension of a concept class over a finite learning domain. We consider a decision problem, called the discrete VC-dimension problem, which is: given a matrix representing a concept class F and an integer K, determine whether the VC-dimension of F is greater than K. We prove that (1) the discrete VC-dimension problem is polynomial-time reducible to the satisfiability problem of length J with O(log^2 J) variables, and (2) for every constant C, the satisfiability problem in conjunctive normal form with m clauses and C log^2 m variables is polynomial-time reducible to the discrete VC-dimension problem. These results can be interpreted as saying that, in a sense, the problem is "complete" for the class of n^{O(log n)}-time computable sets.
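To make the decision problem concrete, the following sketch computes the VC-dimension of a concept class given as a 0/1 matrix by brute force and answers the "greater than K" question. The function names and the row/column encoding (rows as concepts, columns as domain points) are illustrative assumptions, not from the paper; the exhaustive search runs in time exponential in the domain size, which is consistent with the abstract's point that no polynomial-time algorithm is known.

```python
from itertools import combinations

def vc_dimension(matrix):
    """Brute-force VC-dimension of a concept class F given as a 0/1 matrix.

    Rows are concepts, columns are points of the finite learning domain.
    A set S of columns is shattered if the rows restricted to S realize
    all 2^|S| possible 0/1 patterns. The VC-dimension is the largest
    size of a shattered set. Illustrative only: exponential time.
    """
    if not matrix:
        return 0
    n = len(matrix[0])  # size of the learning domain
    dim = 0
    for k in range(1, n + 1):
        found = False
        for cols in combinations(range(n), k):
            # Collect the distinct patterns the concepts induce on cols.
            patterns = {tuple(row[c] for c in cols) for row in matrix}
            if len(patterns) == 2 ** k:  # cols is shattered
                dim = k
                found = True
                break
        if not found:
            # Shattering is monotone: if no k-set is shattered,
            # no larger set can be shattered either.
            break
    return dim

def discrete_vc_dimension(matrix, K):
    """Decision version from the abstract: is VC-dim(F) > K?"""
    return vc_dimension(matrix) > K
```

For example, the full power set over two points has VC-dimension 2, while the class of singletons (plus the empty set) over three points has VC-dimension 1, since no pair of points can realize the pattern (1, 1).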