Performance assessment in elementary engineering: Evaluating student (RTP)

Cathy P. Lachapelle, Christine Cunningham

Research output: Contribution to journal › Conference article

1 Citation (Scopus)

Abstract

With the new emphasis on engineering practices and engineering design (NGSS Lead States, 2013), teachers of and researchers studying K-12 engineering need to find ways to measure students' developing engineering skills. To efficiently measure student learning of engineering practices, there is a need for a tool to capture student performances in a way that readily affords evaluation. The problem we pursue in this paper is how to accomplish this measurement. In this paper, we present a performance assessment instrument and coding rubric. We calculate interrater reliability for coders and present descriptive statistics for student scores to demonstrate the utility of the instrument for distinguishing a range of performances. We conduct an exploratory factor analysis to examine the internal structure of the unit and calculate internal consistency reliability. To build a case for validity for use of the assessment to measure student learning of engineering practices, we compare video of 30 students working on design challenges in their student groups, collected from 10 of the participating classrooms, to the same students' performance on the assessment. This also informs the use and limits of utility of the written performance assessment for measuring elementary students' engineering skills and understanding-in-use. Finally, we describe the time needed to score the assessments, and discuss its utility for larger-scale research studies.

Original language: English (US)
Journal: ASEE Annual Conference and Exposition, Conference Proceedings
Volume: 2016-June
State: Published - Jun 26 2016
Event: 123rd ASEE Annual Conference and Exposition - New Orleans, United States
Duration: Jun 26 2016 - Jun 29 2016


All Science Journal Classification (ASJC) codes

  • Engineering (all)

Cite this

@article{3f0bcec839ff4ad29244f018a4f4c2a9,
title = "Performance assessment in elementary engineering: Evaluating student (RTP)",
abstract = "With the new emphasis on engineering practices and engineering design (NGSS Lead States, 2013), teachers of and researchers studying K-12 engineering need to find ways to measure students' developing engineering skills. To efficiently measure student learning of engineering practices, there is a need for a tool to capture student performances in a way that readily affords evaluation. The problem we pursue in this paper is how to accomplish this measurement. In this paper, we present a performance assessment instrument and coding rubric. We calculate interrater reliability for coders and present descriptive statistics for student scores to demonstrate the utility of the instrument for distinguishing a range of performances. We conduct an exploratory factor analysis to examine the internal structure of the unit and calculate internal consistency reliability. To build a case for validity for use of the assessment to measure student learning of engineering practices, we compare video of 30 students working on design challenges in their student groups, collected from 10 of the participating classrooms, to the same students' performance on the assessment. This also informs the use and limits of utility of the written performance assessment for measuring elementary students' engineering skills and understanding-in-use. Finally, we describe the time needed to score the assessments, and discuss its utility for larger-scale research studies.",
author = "Lachapelle, {Cathy P.} and Christine Cunningham",
year = "2016",
month = "6",
day = "26",
language = "English (US)",
volume = "2016-June",
journal = "ASEE Annual Conference and Exposition, Conference Proceedings",
issn = "2153-5965",
}

Performance assessment in elementary engineering: Evaluating student (RTP). / Lachapelle, Cathy P.; Cunningham, Christine.

In: ASEE Annual Conference and Exposition, Conference Proceedings, Vol. 2016-June, 26.06.2016.

Research output: Contribution to journal › Conference article

TY - JOUR

T1 - Performance assessment in elementary engineering

T2 - Evaluating student (RTP)

AU - Lachapelle, Cathy P.

AU - Cunningham, Christine

PY - 2016/6/26

Y1 - 2016/6/26

N2 - With the new emphasis on engineering practices and engineering design (NGSS Lead States, 2013), teachers of and researchers studying K-12 engineering need to find ways to measure students' developing engineering skills. To efficiently measure student learning of engineering practices, there is a need for a tool to capture student performances in a way that readily affords evaluation. The problem we pursue in this paper is how to accomplish this measurement. In this paper, we present a performance assessment instrument and coding rubric. We calculate interrater reliability for coders and present descriptive statistics for student scores to demonstrate the utility of the instrument for distinguishing a range of performances. We conduct an exploratory factor analysis to examine the internal structure of the unit and calculate internal consistency reliability. To build a case for validity for use of the assessment to measure student learning of engineering practices, we compare video of 30 students working on design challenges in their student groups, collected from 10 of the participating classrooms, to the same students' performance on the assessment. This also informs the use and limits of utility of the written performance assessment for measuring elementary students' engineering skills and understanding-in-use. Finally, we describe the time needed to score the assessments, and discuss its utility for larger-scale research studies.

AB - With the new emphasis on engineering practices and engineering design (NGSS Lead States, 2013), teachers of and researchers studying K-12 engineering need to find ways to measure students' developing engineering skills. To efficiently measure student learning of engineering practices, there is a need for a tool to capture student performances in a way that readily affords evaluation. The problem we pursue in this paper is how to accomplish this measurement. In this paper, we present a performance assessment instrument and coding rubric. We calculate interrater reliability for coders and present descriptive statistics for student scores to demonstrate the utility of the instrument for distinguishing a range of performances. We conduct an exploratory factor analysis to examine the internal structure of the unit and calculate internal consistency reliability. To build a case for validity for use of the assessment to measure student learning of engineering practices, we compare video of 30 students working on design challenges in their student groups, collected from 10 of the participating classrooms, to the same students' performance on the assessment. This also informs the use and limits of utility of the written performance assessment for measuring elementary students' engineering skills and understanding-in-use. Finally, we describe the time needed to score the assessments, and discuss its utility for larger-scale research studies.

UR - http://www.scopus.com/inward/record.url?scp=84983372834&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=84983372834&partnerID=8YFLogxK

M3 - Conference article

AN - SCOPUS:84983372834

VL - 2016-June

JO - ASEE Annual Conference and Exposition, Conference Proceedings

JF - ASEE Annual Conference and Exposition, Conference Proceedings

SN - 2153-5965

ER -