Abstract

Researchers, clinicians, and other professionals are increasingly in need of cost-effective, evidence-based programs and practices. However, these individuals may lack the time and, for some, the required expertise to search for and identify such interventions. To address this concern, several online registers that list or categorize programs according to their empirical evidence of effectiveness have been established. Although these registers are designed to simplify the task of selecting evidence-based interventions, the use of distinct review processes and standards by each register creates discrepancies in final program classifications, which can pose a challenge for users. The present case study highlights three programs that have been evaluated by more than one register and have received similar or different classifications. Reasons for inconsistencies are discussed, and several recommendations for evaluating organizations and register users are provided to enhance the functionality and ease of use of online program registers.

Original language: English (US)
Article number: 101676
Journal: Evaluation and Program Planning
Volume: 76
DOI: 10.1016/j.evalprogplan.2019.101676
State: Published - Oct 1 2019

All Science Journal Classification (ASJC) codes

  • Business and International Management
  • Social Psychology
  • Geography, Planning and Development
  • Strategy and Management
  • Public Health, Environmental and Occupational Health

Cite this

@article{dbecfd8f2d8b46759be93116658ef699,
title = "Similarities and differences in program registers: A case study",
author = "Zack, {Melissa Kareen} and Jennifer Karre and Jonathan Olson and Perkins, {Daniel Francis}",
year = "2019",
month = "10",
day = "1",
doi = "10.1016/j.evalprogplan.2019.101676",
language = "English (US)",
volume = "76",
journal = "Evaluation and Program Planning",
issn = "0149-7189",
publisher = "Elsevier Limited",
}

Similarities and differences in program registers: A case study. / Zack, Melissa Kareen; Karre, Jennifer; Olson, Jonathan; Perkins, Daniel Francis.

In: Evaluation and Program Planning, Vol. 76, 101676, 01.10.2019.

Research output: Contribution to journal › Article

TY - JOUR

T1 - Similarities and differences in program registers

T2 - A case study

AU - Zack, Melissa Kareen

AU - Karre, Jennifer

AU - Olson, Jonathan

AU - Perkins, Daniel Francis

PY - 2019/10/1

Y1 - 2019/10/1

UR - http://www.scopus.com/inward/record.url?scp=85067686722&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85067686722&partnerID=8YFLogxK

U2 - 10.1016/j.evalprogplan.2019.101676

DO - 10.1016/j.evalprogplan.2019.101676

M3 - Article

C2 - 31252374

AN - SCOPUS:85067686722

VL - 76

JO - Evaluation and Program Planning

JF - Evaluation and Program Planning

SN - 0149-7189

M1 - 101676

ER -