This paper presents a new algorithm for robust subspace learning (RSL), i.e., the estimation of linear subspace parameters from a set of data points in the presence of outliers (and missing data). The algorithm is derived using the variational Bayes (VB) method, a Bayesian generalization of the EM algorithm. To derive the algorithm and to compare it with existing ones, we present two formulations of the EM algorithm for RSL. One yields a variant of the IRLS algorithm, the standard algorithm for RSL. The other extends Roweis's EM formulation of PCA and yields a robust version of the alternated least squares (ALS) algorithm. This ALS-based algorithm can handle only a certain type of outliers (termed vector-wise outliers); the VB method resolves this limitation, resulting in the proposed algorithm. Experimental results on synthetic data show that the proposed algorithm outperforms the IRLS algorithm in both convergence behavior and computational time.
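To fix the setting, the following is a minimal sketch of the RSL problem and an IRLS-style baseline of the kind the abstract refers to: a subspace is fit by alternating a weighted PCA step with a reweighting that downweights points by their residual norm, so that vector-wise outliers lose influence. This is an illustrative baseline under assumed data dimensions and weights, not the paper's VB algorithm; all names and parameter values here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: points near a 2-D subspace of R^5, plus vector-wise outliers.
d, k, n = 5, 2, 200
basis = np.linalg.qr(rng.normal(size=(d, k)))[0]          # true orthonormal basis
X = basis @ rng.normal(size=(k, n)) + 0.01 * rng.normal(size=(d, n))
X[:, :20] = 5.0 * rng.normal(size=(d, 20))                # 10% outlier columns


def irls_subspace(X, k, n_iter=30, eps=1e-6):
    """IRLS-style robust subspace fit: repeat a weighted PCA step,
    with weights ~ 1/residual so outlying points are downweighted."""
    w = np.ones(X.shape[1])
    for _ in range(n_iter):
        # Weighted mean and weighted PCA (SVD of the reweighted data).
        mu = (X * w).sum(axis=1) / w.sum()
        Xc = X - mu[:, None]
        U, _, _ = np.linalg.svd(Xc * np.sqrt(w), full_matrices=False)
        B = U[:, :k]
        # Residual norm of each point after projection onto the subspace.
        r = np.linalg.norm(Xc - B @ (B.T @ Xc), axis=0)
        w = 1.0 / np.maximum(r, eps)                      # robust (L1-like) weights
    return mu, B


mu, B = irls_subspace(X, k)
# If the fit is robust, the estimated subspace should contain the true basis:
err = np.linalg.norm(basis - B @ (B.T @ basis))
```

The reweighting step corresponds to minimizing the sum of (unsquared) residual norms rather than a least-squares objective, which is what gives the method its robustness to vector-wise outliers.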