The entropy of normalized partial distortions, referred to as the partial distortion entropy, is a useful metric for online evaluation of the optimality of a vector quantization codebook. However, updating the partial distortion entropy to track changes in the codebook is computationally expensive, and naive recomputation considerably increases the cost of codebook design, especially for large codebooks. This paper presents a novel scheme for updating the partial distortion entropy without fully recalculating it. The proposed scheme therefore requires much less computation than full recalculation, and its cost per update is constant irrespective of the codebook size. Experimental results clearly show that the proposed scheme substantially reduces the computational cost of vector quantization codebook design with the partial distortion entropy.
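The following sketch illustrates one way such a constant-time update can be realized; it is a hypothetical illustration, not the paper's actual scheme. Writing the partial distortion entropy as H = log D - S/D, where D is the total distortion and S is the sum of d_i log d_i over the partial distortions d_i, a change to a single partial distortion d_k updates D and S, and hence H, in O(1), independent of the codebook size:

```python
import math

class PartialDistortionEntropy:
    """Incrementally maintained entropy of normalized partial distortions.

    Hypothetical sketch (not the paper's exact scheme): with partial
    distortions d_i and total D = sum(d_i), the entropy
        H = -sum((d_i / D) * log(d_i / D))
    can be rewritten as H = log(D) - S / D with S = sum(d_i * log(d_i)),
    so changing a single d_k updates H in constant time.
    """

    def __init__(self, distortions):
        self.d = list(distortions)
        self.total = sum(self.d)
        # Running sum S; zero-valued terms contribute nothing (0 * log 0 := 0).
        self.s = sum(x * math.log(x) for x in self.d if x > 0.0)

    def entropy(self):
        if self.total <= 0.0:
            return 0.0
        return math.log(self.total) - self.s / self.total

    def update(self, k, new_value):
        """Replace the k-th partial distortion, adjusting D and S in O(1)."""
        old = self.d[k]
        if old > 0.0:
            self.s -= old * math.log(old)
        if new_value > 0.0:
            self.s += new_value * math.log(new_value)
        self.total += new_value - old
        self.d[k] = new_value
```

After each codebook modification, calling `update` for the affected partition keeps the entropy current at constant cost, whereas a naive approach would recompute all normalized terms from scratch.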