Gamification has emerged as a pedagogical tool in recent years, and numerous studies report positive outcomes from games applied in educational environments. However, researchers rarely discuss the gamification development process, and little work analyzes gamification products in terms of usability, game elements, and related criteria. In addition, in most previous studies students are the end users and are involved only in the final test, providing data on motivation, engagement, or learning outcomes. Consequently, the following questions remain unaddressed: How can the effectiveness of a gamification product in education be evaluated? What do students learn when they create and critique gamification products? This paper proposes a peer-based critique process built on peer-developed gamification products. We expect this process to provide valuable feedback from an end-user perspective and its outcomes to help answer the questions above. The present study extends a previous research cycle in which end users (students) developed gamification products to help students learn challenging concepts in industrial engineering courses. We selected four final gamification products for further evaluation: "Avengers", "Bake-off-453", "Gulf games", and "DungeoNIOSH". These games are intended to teach the concepts of "Discrete probability distributions", "Gulf of evaluation vs. Gulf of execution", "Interaction effects", and "NIOSH Lifting equation". The first two are basic concepts in statistics, and the last two relate to the human factors/ergonomics domain. In this study, two student teams conducted a critique of these gamification products as their Capstone project.
After matching the teams with the game products that interested them, the peer-based critique consists of three main steps: first, critiquing the gamification product from a game perspective, using metrics such as the types of game elements included, interactivity, motivation, and engagement level; second, evaluating the gamification product from an educational perspective, focusing on learning effectiveness; and finally, critiquing the gamification product against usability guidelines and principles. Student teams were instructed to specify each criterion and to cover all three aspects. At the end of this paper, two case studies present the final critique criteria developed by the student teams. Most importantly, we collect valuable insights from the end users: what they learn from the critiquing process and what lessons we can draw from their feedback. These insights provide meaningful information for evaluating gamification products designed to enhance engineering concept learning.
Original language: English (US)
Journal: ASEE Annual Conference and Exposition, Conference Proceedings
State: Published - Jun 24 2017
Event: 124th ASEE Annual Conference and Exposition - Columbus, United States
Duration: Jun 25 2017 → Jun 28 2017