Fast optimization of language model weight and insertion penalty from n-best candidates

Akinori Ito, Masaki Kohda, Shozo Makino

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)

Abstract

A method for preselecting n-best candidates is described that accelerates n-best-based optimization of the language model weight and insertion penalty. The optimum candidate among the n-best, viewed as a function of these two parameters, always corresponds to a point on the surface of a polyhedron, so the candidate list can be reduced to only those surface points. This preselection reduces the number of n-best candidates by more than 90% and, without changing the optimization result, makes the optimization process about 9 times faster under the 100-best condition and about 28 times faster under the 1,000-best condition.
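The key observation is that each hypothesis has a total score that is linear in the language model weight and insertion penalty, so only a small subset of candidates can ever be optimal. The paper's algorithm identifies that subset exactly via the surface of a polyhedron (a convex hull construction); the sketch below is a simplified, hypothetical illustration of the same idea, filtering candidates by checking which ones win on a coarse parameter grid. All scores, parameter ranges, and the `preselect` helper are invented for illustration and are not the authors' exact algorithm.

```python
import itertools
import random

def combined_score(cand, lm_weight, ins_penalty):
    """Total score of one hypothesis: acoustic log-likelihood plus
    weighted LM log-probability minus a per-word insertion penalty."""
    acoustic, lm_logprob, n_words = cand
    return acoustic + lm_weight * lm_logprob - ins_penalty * n_words

def preselect(cands, lm_weights, ins_penalties):
    """Keep only candidates that are optimal for at least one parameter
    pair on a coarse grid; the rest can be dropped before a fine search.
    (The paper's polyhedron-surface method does this exactly, with no grid.)"""
    survivors = set()
    for w, p in itertools.product(lm_weights, ins_penalties):
        best = max(range(len(cands)),
                   key=lambda i: combined_score(cands[i], w, p))
        survivors.add(best)
    return sorted(survivors)

random.seed(0)
# Hypothetical 100-best list: (acoustic score, LM log-prob, word count)
nbest = [(-random.uniform(900, 1100), -random.uniform(40, 60),
          random.randint(8, 14)) for _ in range(100)]
coarse_weights = [4, 8, 12, 16]
coarse_penalties = [0, 5, 10, 15]
kept = preselect(nbest, coarse_weights, coarse_penalties)
print(len(nbest), "candidates reduced to", len(kept))
```

Because each grid point contributes at most one winner, the surviving set is at most as large as the grid, which is why a fine-grained optimization restricted to the survivors runs many times faster than one over the full n-best list.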

Original language: English
Pages (from-to): 384-387
Number of pages: 4
Journal: Acoustical Science and Technology
Volume: 26
Issue number: 4
DOIs
Publication status: Published - 2005 Jul

Keywords

  • Insertion penalty
  • Language model weight
  • N-best hypothesis
  • Optimization

ASJC Scopus subject areas

  • Acoustics and Ultrasonics

