We organized a competition entitled “the dialogue system live competition,” in which an audience, consisting mainly of researchers in the dialogue community, watched and evaluated live dialogues conducted between users and dialogue systems. The motivation behind the event was to cultivate state-of-the-art techniques in dialogue systems and to enable the dialogue community to share the problems faced by current dialogue systems. The competition consisted of two parts: a preliminary selection and a live event. In the preliminary selection, eleven systems were evaluated via crowdsourcing. Three systems proceeded to the live event, where they performed dialogues with designated speakers and were evaluated by the audience. This paper describes the design and procedure of the competition, the results of the preliminary selection and the live event, and the problems we identified from the event.