Memory-efficient word embedding vectors

Jun Suzuki, Masaaki Nagata

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)


Word embedding is a technique that enables computers to identify semantic relationships between words. Word embedding vectors allow computers to make guesses that resemble human intuition or common sense. This article introduces a method for reducing the memory consumption of word embedding vectors, a fundamental component of many language-processing systems, while preserving their ability to capture semantic relationships, a property that is essential when this technique is applied to real-world systems.
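The abstract does not specify the method used in the article, but the general idea of shrinking embedding memory while preserving semantic similarity can be illustrated with a common technique: quantizing float32 vectors to int8 with a per-vector scale, which cuts storage to one quarter while keeping cosine similarities between words nearly unchanged. The vocabulary, dimensions, and function names below are purely illustrative assumptions, not the paper's method.

```python
import numpy as np

# Toy embeddings for a hypothetical 3-word vocabulary (illustrative only).
rng = np.random.default_rng(0)
vocab = ["king", "queen", "apple"]
emb = rng.normal(size=(len(vocab), 64)).astype(np.float32)

def quantize(vectors):
    """Map each float32 vector to int8 plus a per-vector scale factor."""
    scale = np.abs(vectors).max(axis=1, keepdims=True) / 127.0
    q = np.round(vectors / scale).astype(np.int8)
    return q, scale.astype(np.float32)

def dequantize(q, scale):
    """Recover approximate float32 vectors from the int8 representation."""
    return q.astype(np.float32) * scale

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

q, scale = quantize(emb)
approx = dequantize(q, scale)

# int8 storage is 4x smaller than float32 (ignoring the small scale array),
# yet pairwise similarities are only slightly perturbed.
print("float32 bytes:", emb.nbytes, "int8 bytes:", q.nbytes)
for i in (1, 2):
    print(vocab[0], vocab[i], cosine(emb[0], emb[i]), cosine(approx[0], approx[i]))
```

On these toy vectors the quantized similarities stay within a few hundredths of the originals; real systems often combine such quantization with other compression techniques, which is the kind of trade-off the article addresses.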

Original language: English
Journal: NTT Technical Review
Issue number: 11
Publication status: Published - 2017 Nov 1
Externally published: Yes


Keywords

  • Deep learning
  • Natural language processing
  • Word embeddings

ASJC Scopus subject areas

  • Computer Science Applications
  • Computer Networks and Communications
  • Electrical and Electronic Engineering


