Customer behavior prediction system by large scale data fusion in a retail service

Ishigaki Tsukasa, Takeshi Takenaka, Yoichi Motomura

Research output: Contribution to journal › Article

8 Citations (Scopus)

Abstract

This paper describes computational customer behavior modeling with a Bayesian network built over appropriate categories. The categories are generated by heterogeneous data fusion of ID-POS data and customers' questionnaire responses about their lifestyles. We propose a latent class model that extends the PLSI model; in the proposed model, customers and items are classified probabilistically into latent lifestyle categories and latent item categories. We show that the proposed model outperforms k-means and PLSI in terms of category mining. We then construct a Bayesian network model that includes the customer and item categories together with the situations and conditions of purchases. Based on that network structure, we can systematically identify knowledge useful for sustainable services. In retail services, knowledge management through point-of-sales data mining is integral to maintaining and improving productivity. The method provides knowledge based on ID-POS data for efficient customer relationship management, and it is applicable to marketing support, service modeling, and decision making in various business fields, including retail and other service industries.

Original language: English
Pages (from-to): 670-681
Number of pages: 12
Journal: Transactions of the Japanese Society for Artificial Intelligence
Volume: 26
Issue number: 6
DOIs
Publication status: Published - 2011 Oct 24

Keywords

  • Bayesian network
  • ID-POS data
  • Large scale data modeling
  • Latent class model
  • Service engineering

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
