Work in progress - Comparing assessment tools in computer science education: Empirical analysis

Pavel Azalov, Stephen Azaloff, Fani Zlatarova

Research output: Contribution to journal › Conference article › peer-review

3 Scopus citations

Abstract

The paper discusses the use of multiple-choice and open-ended constructed-response questions for measuring the performance of undergraduate students in computer science courses. A goal of this research is to investigate whether "guessing" plays a significant role in multiple-choice answering. An experiment was conducted at two academic institutions over a two-semester period, involving students in four introductory programming classes. The quizzes consisted of pairs of similar questions, where each pair contained a multiple-choice and an open-ended constructed-response question of equal difficulty and weight.

Original language: English (US)
Pages (from-to): F2G-18 - F2G-19
Journal: Proceedings - Frontiers in Education Conference, FIE
Volume: 2
State: Published - Dec 1 2004
Event: 34th Annual Frontiers in Education: Expanding Educational Opportunities Through Partnerships and Distance Learning - Conference Proceedings, FIE - Savannah, GA, United States
Duration: Oct 20 2004 - Oct 23 2004

All Science Journal Classification (ASJC) codes

  • Software
  • Education
  • Computer Science Applications
