Peer-based gamification products critiquing: Two case studies in engineering education

Jingwen Li, Eunsik Kim, Alec M. Schultis, Andrew Joseph Kapfer, Jimmy Lin, Peter A. Yake, Domenic M. Erjavec, Benjamin Dabat, Ling Rothrock

Research output: Contribution to journal › Article

Abstract

Gamification has emerged as a pedagogical tool over the past few years, and numerous studies report positive outcomes from games applied in educational environments. However, researchers rarely discuss the gamification development process, and little work has been done to analyze gamification products in terms of usability, game elements, and related criteria. In addition, in most previous studies, students are the end users and are involved only in the final test, providing data on motivation, engagement, or learning outcomes. Consequently, the following questions remain unaddressed: How can the effectiveness of a gamification product in education be evaluated? What do students learn when they create and critique gamification products? This paper proposes a peer-based critique process built around peer-developed game products. We expect such a process to provide valuable feedback from an end-user perspective, and that its outcomes will help answer the questions above. The present study extends a previous research cycle in which end users (students) developed gamification products to help students learn challenging concepts in industrial engineering courses. We selected four final gamification products for further evaluation: "Avengers", "Bake-off-453", "Gulf games", and "DungeoNIOSH". These games are intended to teach the concepts of "Discrete probability distributions", "Gulf of evaluation vs. Gulf of execution", "Interaction effects", and "NIOSH Lifting equation". "Discrete probability distributions" and "Interaction effects" are basic concepts in statistics, while the other two belong to the human factors/ergonomics domain. In this study, two student teams critiqued these gamification products as their capstone project.
The peer-based critique consists of three main steps, carried out after matching each team with the game products of interest: first, critiquing each gamification product from a game perspective, using metrics such as the types of game elements included, interactivity, motivation, and engagement level; second, evaluating the products from an educational perspective, focusing on learning effectiveness; and finally, critiquing the products against usability guidelines and principles. Student teams were instructed to specify each criterion and to cover all three aspects. At the end of this paper, two case studies are presented, showing the final critique criteria developed by the student teams. Most importantly, we collect valuable insights from the end users: what they learn from the critiquing process, and what lessons we can draw from their feedback. These insights provide meaningful information for evaluating gamification products designed to enhance the learning of engineering concepts.

Original language: English (US)
Journal: ASEE Annual Conference and Exposition, Conference Proceedings
Volume: 2017-June
State: Published - Jun 24 2017

Fingerprint

Engineering education
Students
Education
Feedback
Industrial engineering
Ergonomics
Human engineering
Probability distributions
Statistics

All Science Journal Classification (ASJC) codes

  • Engineering (all)

Cite this

Li, J., Kim, E., Schultis, A. M., Kapfer, A. J., Lin, J., Yake, P. A., Erjavec, D. M., Dabat, B., & Rothrock, L. (2017). Peer-based gamification products critiquing: Two case studies in engineering education. ASEE Annual Conference and Exposition, Conference Proceedings, Vol. 2017-June.
ISSN: 2153-5965

