Numerosity reduction for resource constrained learning

Khamisi Kalegele, Hideyuki Takahashi, Johan Sveholm, Kazuto Sasai, Gen Kitagata, Tetsuo Kinoshita

Research output: Contribution to journal › Article › peer-review

Abstract

When coupling data mining (DM) and learning agents, one crucial challenge is the need for the Knowledge Extraction (KE) process to be lightweight enough that even resource-constrained agents (e.g., limited memory or CPU) are able to extract knowledge. We propose the Stratified Ordered Selection (SOS) method for achieving lightweight KE through dynamic numerosity reduction of training examples. SOS allows agents to retrieve different-sized training subsets based on available resources. The method employs ranking-based subset selection using a novel Level Order (LO) ranking scheme. We show the representativeness of subsets selected using the proposed method, its tolerance to noise, and its ability to preserve KE performance across different reduction levels. Compared to subset selection methods of the same category, the proposed method offers the best trade-off among cost, reduction, and the ability to preserve performance.
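The abstract describes stratification of the training set by class and rank-ordered selection within each stratum under a resource budget. A minimal sketch of that general idea is given below; the scoring function (distance to the class centroid) and the function and parameter names are hypothetical stand-ins, not the paper's actual Level Order (LO) ranking scheme, which the abstract does not detail.

```python
# Illustrative sketch of ranking-based stratified subset selection.
# The centroid-distance scoring below is a hypothetical stand-in for
# the paper's Level Order (LO) ranking scheme.
from collections import defaultdict
import math

def stratified_ordered_selection(examples, labels, budget):
    """Return up to `budget` example indices, drawn proportionally
    from each class stratum in rank order."""
    # Group example indices by class label (stratification).
    strata = defaultdict(list)
    for i, y in enumerate(labels):
        strata[y].append(i)

    def centroid(idxs):
        dim = len(examples[0])
        return [sum(examples[i][d] for i in idxs) / len(idxs)
                for d in range(dim)]

    selected = []
    for y, idxs in strata.items():
        # Rank the stratum: closest to the class centroid first
        # (stand-in ranking; not the LO scheme).
        c = centroid(idxs)
        ranked = sorted(idxs, key=lambda i: math.dist(examples[i], c))
        # Give the stratum a share of the budget proportional to its size,
        # so class proportions are roughly preserved after reduction.
        share = max(1, round(budget * len(idxs) / len(labels)))
        selected.extend(ranked[:share])
    return selected[:budget]

# Usage: a smaller `budget` models an agent with tighter resources.
X = [[0, 0], [0, 1], [1, 0], [5, 5], [5, 6], [6, 5]]
y = [0, 0, 0, 1, 1, 1]
subset = stratified_ordered_selection(X, y, budget=4)
```

Because the budget is a parameter, the same stored ranking can serve agents with different resource levels, which matches the abstract's point about retrieving different-sized subsets on demand.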

Original language: English
Pages (from-to): 329-341
Number of pages: 13
Journal: Journal of Information Processing
Volume: 21
Issue number: 2
DOIs
Publication status: Published - 2013 Apr

Keywords

  • Agent learning
  • Data reduction
  • Instance ranking
  • Machine learning

ASJC Scopus subject areas

  • Computer Science (all)

