Psychometric analysis of residence and MOOC assessments

Eric Loken, Zita Oravecz, Conrad S. Tucker, Fridolin Jakob Linder

Research output: Contribution to journal › Conference article

Abstract

Undergraduate STEM programs face the daunting challenge of managing instruction and assessment for classes that enroll thousands of students per year, and the bulk of student assessment often comes down to multiple-choice tests. Instructors try to monitor reliability metrics and diagnostics for item quality, but there is rarely a more formal evaluation of the psychometric properties of these assessments. College assessment strategies seem to be dominated by a common-sense view of testing that is largely unconcerned with precision of measurement. We see an opportunity to improve undergraduate science instruction by incorporating more rigorous measurement models for testing and using them to support instructional goals and assessment. We apply item response theory to analyze tests from two undergraduate STEM classes: a resident-instruction physics class and a Massive Open Online Course (MOOC) in geography. We evaluate whether the tests are equally informative across levels of student proficiency, and we demonstrate how precision could be improved with adaptive testing. We find that the measurement precision of multiple-choice tests appears to be greatest in the lower half of the class distribution, a property that has consequences for the assessment of mastery and for evaluating testing interventions.
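The two quantities the abstract leans on, test information and the gain from adaptive item selection, follow from standard item response theory. The sketch below is a minimal illustration under a two-parameter logistic (2PL) model, not a reconstruction of the authors' analysis: the item parameters are invented, and the names p_correct, item_information, and next_item are ours. For a 2PL item with discrimination a and difficulty b, the Fisher information at ability theta is a^2 * p * (1 - p), so a test whose difficulties sit below the class mean concentrates its information, and hence its precision, in the lower half of the proficiency distribution.

import numpy as np

def p_correct(theta, a, b):
    # 2PL probability of a correct response at ability theta
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def item_information(theta, a, b):
    # Fisher information of one 2PL item: a^2 * p * (1 - p)
    p = p_correct(theta, a, b)
    return a ** 2 * p * (1.0 - p)

# Invented item parameters (not from the paper): a 30-item exam whose
# difficulties sit below the class mean, i.e. a relatively easy test.
rng = np.random.default_rng(0)
a_params = rng.uniform(0.8, 2.0, size=30)   # discriminations
b_params = rng.normal(-0.7, 0.8, size=30)   # difficulties

thetas = np.linspace(-3.0, 3.0, 121)
test_info = sum(item_information(thetas, a, b)
                for a, b in zip(a_params, b_params))
sem = 1.0 / np.sqrt(test_info)              # standard error of measurement

for t in (-2.0, -1.0, 0.0, 1.0, 2.0):
    i = int(np.argmin(np.abs(thetas - t)))
    print(f"theta = {t:+.1f}   info = {test_info[i]:6.2f}   SEM = {sem[i]:.3f}")

def next_item(theta_hat, administered):
    # Greedy adaptive-testing rule: administer the unused item with
    # maximum information at the current ability estimate.
    info = item_information(theta_hat, a_params, b_params)
    for j in administered:
        info[j] = -np.inf
    return int(np.argmax(info))

print("first adaptive pick at theta_hat = +1.5:", next_item(1.5, set()))

With the difficulties centered below zero, the printed table shows information peaking at negative theta and the standard error growing for stronger students, the pattern the abstract reports. The greedy next_item rule illustrates how adaptive testing restores precision: always administering the item that is most informative at the current ability estimate recovers measurement quality exactly where a fixed easy test lacks it.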

Original language: English (US)
Journal: ASEE Annual Conference and Exposition, Conference Proceedings
ISSN: 2153-5965
Volume: 122nd ASEE Annual Conference and Exposition: Making Value for Society
Issue number: 122nd ASEE Annual Conference and Exposition: Making Value for Society
State: Published - Jan 1 2015
Event: 2015 122nd ASEE Annual Conference and Exposition - Seattle, United States
Duration: Jun 14, 2015 - Jun 17, 2015
Scopus record: http://www.scopus.com/inward/record.url?scp=84941995575&partnerID=8YFLogxK

All Science Journal Classification (ASJC) codes

  • Engineering (all)

Cite this

Loken, E., Oravecz, Z., Tucker, C. S., & Linder, F. J. (2015). Psychometric analysis of residence and MOOC assessments. ASEE Annual Conference and Exposition, Conference Proceedings, 122nd ASEE Annual Conference and Exposition: Making Value for Society.