Interrater agreement on the Benton Visual Retention Test

G. E. Swan, E. Morrison, Paul Eslinger

Research output: Contribution to journal › Article

7 Citations (Scopus)

Abstract

To address overlapping needs in clinical neuropsychology and epidemiology of the elderly, we report an in-depth analysis of interrater scoring for a test commonly used to assess visual memory in older persons. Benton Visual Retention Test protocols (Form C, Administration A) from 277 community-dwelling male participants (M=65.2 years) in two ongoing cardiovascular epidemiologic studies were scored independently by two trained raters. Interrater reliabilities, calculated as intraclass correlations, were .963 and .974 for total number of correct reproductions and total number of errors, respectively. Interrater agreements on categorical determinations of the presence or absence of 6 error codes on 10 separate designs were evaluated using kappa measures of agreement. Kappa values for each of the 10 designs ranged between .780 and .930. Kappa values for each of the error codes ranged from a high of .976 for omissions to a moderate .737 for size errors. Kappa values by error code type within each of the 10 designs revealed particular problem areas for misplacements on design 9 and size errors on design 10 (kappas as low as .440 and .480, respectively). Suggestions for improving the accuracy of scoring are presented.
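The abstract reports Cohen's kappa for two raters' present/absent judgments of each error code. As an illustrative sketch (using hypothetical ratings, not the study's data), kappa compares observed agreement against the agreement two raters would reach by chance given their marginal rating frequencies:

```python
# Illustrative computation of Cohen's kappa for two raters making
# binary present/absent judgments. The data below are hypothetical,
# not taken from the study.

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two equal-length sequences of categorical ratings."""
    assert len(rater1) == len(rater2) and len(rater1) > 0
    n = len(rater1)
    categories = set(rater1) | set(rater2)
    # Observed agreement: proportion of items on which the raters match.
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Chance agreement: product of each rater's marginal frequency,
    # summed over categories.
    p_e = sum(
        (list(rater1).count(c) / n) * (list(rater2).count(c) / n)
        for c in categories
    )
    return (p_o - p_e) / (1 - p_e)

# Hypothetical presence/absence codes (1 = error scored, 0 = not scored)
# for one error type across 20 protocols.
r1 = [1, 0, 0, 1, 0, 1, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0, 1, 0]
r2 = [1, 0, 0, 1, 0, 1, 0, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 1, 1, 0]
print(round(cohens_kappa(r1, r2), 3))  # prints 0.792
```

Values near 1 indicate near-perfect agreement (like the .976 for omissions), while values in the .4-.5 range (the misplacement and size-error trouble spots) indicate only moderate agreement beyond chance.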

Original language: English (US)
Pages (from-to): 37-44
Number of pages: 8
Journal: Clinical Neuropsychologist
Volume: 4
Issue number: 1
DOI: 10.1080/13854049008401495
State: Published - Jan 1 1990

All Science Journal Classification (ASJC) codes

  • Neuropsychology and Physiological Psychology
  • Developmental and Educational Psychology
  • Clinical Psychology
  • Arts and Humanities (miscellaneous)
  • Psychiatry and Mental Health

Cite this

Swan, G. E.; Morrison, E.; Eslinger, Paul. Interrater agreement on the Benton Visual Retention Test. In: Clinical Neuropsychologist. 1990; Vol. 4, No. 1, pp. 37-44.
@article{f4e7e40a63354fb793a3d1c1970a89f9,
title = "Interrater agreement on the {Benton Visual Retention Test}",
author = "Swan, {G. E.} and E. Morrison and Paul Eslinger",
year = "1990",
month = "1",
day = "1",
doi = "10.1080/13854049008401495",
language = "English (US)",
volume = "4",
pages = "37--44",
journal = "Clinical Neuropsychologist",
issn = "0920-1637",
publisher = "Swets & Zeitlinger",
number = "1",
}


TY - JOUR

T1 - Interrater agreement on the Benton Visual Retention Test

AU - Swan, G. E.

AU - Morrison, E.

AU - Eslinger, Paul

PY - 1990/1/1

Y1 - 1990/1/1

AB - To address overlapping needs in clinical neuropsychology and epidemiology of the elderly, we report an in-depth analysis of interrater scoring for a test commonly used to assess visual memory in older persons. Benton Visual Retention Test protocols (Form C, Administration A) from 277 community-dwelling male participants (M=65.2 years) in two ongoing cardiovascular epidemiologic studies were scored independently by two trained raters. Interrater reliabilities, calculated as intraclass correlations, were .963 and .974 for total number of correct reproductions and total number of errors, respectively. Interrater agreements on categorical determinations of the presence or absence of 6 error codes on 10 separate designs were evaluated using kappa measures of agreement. Kappa values for each of the 10 designs ranged between .780 and .930. Kappa values for each of the error codes ranged from a high of .976 for omissions to a moderate .737 for size errors. Kappa values by error code type within each of the 10 designs revealed particular problem areas for misplacements on design 9 and size errors on design 10 (kappa being as low as .440 and .480 respectively). Suggestions for improving the accuracy of scoring are presented.

UR - http://www.scopus.com/inward/record.url?scp=0025308764&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=0025308764&partnerID=8YFLogxK

U2 - 10.1080/13854049008401495

DO - 10.1080/13854049008401495

M3 - Article

VL - 4

SP - 37

EP - 44

JO - Clinical Neuropsychologist

JF - Clinical Neuropsychologist

SN - 0920-1637

IS - 1

ER -