Long-term CPU load prediction system for scheduling of distributed processes and its implementation

Yoshihiro Sugaya, Hiroshi Tatsumi, Mitiharu Kobayashi, Hirotomo Aso

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

3 Citations (Scopus)

Abstract

Distributed processing environments are composed of many heterogeneous computers, and distributed parallel processes must be scheduled in an appropriate manner. For such scheduling, predicting the execution load of a process is effective for exploiting the resources of the environment. We propose long-term load prediction methods that refer to properties of processes and to runtime predictions. Since the appropriate prediction method differs according to the situation, we propose a prediction module selection scheme that uses a neural network to select an appropriate prediction method according to the state of the changing CPU load. We also discuss the implementation of a long-term CPU load prediction system, which provides information, including load predictions, to schedulers, system administrators, and users.

Original language: English
Title of host publication: Proceedings - 22nd International Conference on Advanced Information Networking and Applications, AINA 2008
Pages: 971-977
Number of pages: 7
DOIs
Publication status: Published - 2008 Sep 1
Event: 22nd International Conference on Advanced Information Networking and Applications, AINA 2008 - Gino-wan, Okinawa, Japan
Duration: 2008 Mar 25 - 2008 Mar 28

Publication series

Name: Proceedings - International Conference on Advanced Information Networking and Applications, AINA
ISSN (Print): 1550-445X

Other

Other: 22nd International Conference on Advanced Information Networking and Applications, AINA 2008
Country: Japan
City: Gino-wan, Okinawa
Period: 08/3/25 - 08/3/28

ASJC Scopus subject areas

  • Engineering (all)

