Updating clinical knowledge: An evaluation of current information alerting services

Scott M. Strayer, Allen F. Shaughnessy, Kenneth S. Yew, Mark Stephens, David C. Slawson

Research output: Contribution to journal › Article

7 Citations (Scopus)

Abstract

Purpose: Clinicians are overwhelmed by the sheer magnitude of new clinical information that is available on a daily basis. Despite the availability of information tools for finding this information and for updating clinical knowledge, no study has examined the quality of current information alerting services. Methods: We developed a 7-item checklist based on the principles of evidence-based medicine and assessed content validity with experts and face validity with practicing clinicians and clinician researchers. A list of clinical information updating tools (push tools) was generated in a systematic fashion and the checklist was used to rate the quality of these tools by two independent raters. Prior to rating all instruments, the raters were trained to achieve good agreement (>80%) by applying the checklist to two sets of three randomly selected tools. Descriptive statistics were used to describe the quality of the identified tools and inter-rater reliability was assessed using Intraclass Correlation (ICC). Results: Eighteen tools were identified using our systematic search. The average quality of these tools was 2.72 (range 0-7). Only two tools met all suggested criteria for quality. Inter-rater reliability for the 7-item checklist was .82 (ICC). Conclusions: We developed a checklist that can be used to reliably assess the quality of clinical information updating tools. We found many shortcomings in currently available clinical knowledge updating tools. Ideally, these tools will evolve in the direction of applying basic evidence-based medicine principles to new medical information in order to increase their usefulness to clinicians.
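The abstract reports inter-rater reliability as an intraclass correlation of .82. As a rough illustration of how such a figure is computed, below is a minimal Python sketch of a two-way random-effects, single-measure ICC, i.e. ICC(2,1), a common choice for two independent raters scoring the same set of tools. The record does not state which ICC model the authors used, and the ratings below are hypothetical, not the study's data.

```python
def icc2_1(scores):
    """ICC(2,1): two-way random effects, absolute agreement, single measure.

    scores: list of rows, one row per rated tool, one column per rater.
    """
    n = len(scores)        # number of subjects (tools rated)
    k = len(scores[0])     # number of raters
    grand = sum(sum(row) for row in scores) / (n * k)
    row_means = [sum(row) / k for row in scores]
    col_means = [sum(row[j] for row in scores) / n for j in range(k)]

    # Two-way ANOVA decomposition of the ratings matrix
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for row in scores for x in row)
    ss_error = ss_total - ss_rows - ss_cols

    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_error = ss_error / ((n - 1) * (k - 1))

    return (ms_rows - ms_error) / (
        ms_rows + (k - 1) * ms_error + k * (ms_cols - ms_error) / n
    )

# Hypothetical checklist scores (0-7 scale) from two raters for five tools
ratings = [[2, 3], [0, 0], [5, 4], [7, 7], [1, 2]]
print(round(icc2_1(ratings), 2))
```

With perfectly agreeing raters the statistic is 1.0; disagreement between raters inflates the error mean square and pulls the value down, which is why rater training to >80% agreement (as described in the Methods) matters before scoring the full set of tools.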

Original language: English (US)
Pages (from-to): 824-831
Number of pages: 8
Journal: International Journal of Medical Informatics
Volume: 79
Issue number: 12
DOI: 10.1016/j.ijmedinf.2010.08.004
State: Published - Dec 1 2010

Fingerprint

  • Information Services
  • Checklist
  • Evidence-Based Medicine
  • Reproducibility of Results
  • Research Personnel

All Science Journal Classification (ASJC) codes

  • Health Informatics

Cite this

Strayer, Scott M. ; Shaughnessy, Allen F. ; Yew, Kenneth S. ; Stephens, Mark ; Slawson, David C. / Updating clinical knowledge : An evaluation of current information alerting services. In: International Journal of Medical Informatics. 2010 ; Vol. 79, No. 12. pp. 824-831.
@article{9a583e9223a5415987a7711f7c0ba46a,
title = "Updating clinical knowledge: An evaluation of current information alerting services",
abstract = "Purpose: Clinicians are overwhelmed by the sheer magnitude of new clinical information that is available on a daily basis. Despite the availability of information tools for finding this information and for updating clinical knowledge, no study has examined the quality of current information alerting services. Methods: We developed a 7-item checklist based on the principles of evidence-based medicine and assessed content validity with experts and face validity with practicing clinicians and clinician researchers. A list of clinical information updating tools (push tools) was generated in a systematic fashion and the checklist was used to rate the quality of these tools by two independent raters. Prior to rating all instruments, the raters were trained to achieve good agreement (>80{\%}) by applying the checklist to two sets of three randomly selected tools. Descriptive statistics were used to describe the quality of the identified tools and inter-rater reliability was assessed using Intraclass Correlation (ICC). Results: Eighteen tools were identified using our systematic search. The average quality of these tools was 2.72 (range 0-7). Only two tools met all suggested criteria for quality. Inter-rater reliability for the 7-item checklist was .82 (ICC). Conclusions: We developed a checklist that can be used to reliably assess the quality of clinical information updating tools. We found many shortcomings in currently available clinical knowledge updating tools. Ideally, these tools will evolve in the direction of applying basic evidence-based medicine principles to new medical information in order to increase their usefulness to clinicians.",
author = "Strayer, {Scott M.} and Shaughnessy, {Allen F.} and Yew, {Kenneth S.} and Mark Stephens and Slawson, {David C.}",
year = "2010",
month = "12",
day = "1",
doi = "10.1016/j.ijmedinf.2010.08.004",
language = "English (US)",
volume = "79",
pages = "824--831",
journal = "International Journal of Medical Informatics",
issn = "1386-5056",
publisher = "Elsevier Ireland Ltd",
number = "12",
}


TY - JOUR

T1 - Updating clinical knowledge

T2 - An evaluation of current information alerting services

AU - Strayer, Scott M.

AU - Shaughnessy, Allen F.

AU - Yew, Kenneth S.

AU - Stephens, Mark

AU - Slawson, David C.

PY - 2010/12/1

Y1 - 2010/12/1

AB - Purpose: Clinicians are overwhelmed by the sheer magnitude of new clinical information that is available on a daily basis. Despite the availability of information tools for finding this information and for updating clinical knowledge, no study has examined the quality of current information alerting services. Methods: We developed a 7-item checklist based on the principles of evidence-based medicine and assessed content validity with experts and face validity with practicing clinicians and clinician researchers. A list of clinical information updating tools (push tools) was generated in a systematic fashion and the checklist was used to rate the quality of these tools by two independent raters. Prior to rating all instruments, the raters were trained to achieve good agreement (>80%) by applying the checklist to two sets of three randomly selected tools. Descriptive statistics were used to describe the quality of the identified tools and inter-rater reliability was assessed using Intraclass Correlation (ICC). Results: Eighteen tools were identified using our systematic search. The average quality of these tools was 2.72 (range 0-7). Only two tools met all suggested criteria for quality. Inter-rater reliability for the 7-item checklist was .82 (ICC). Conclusions: We developed a checklist that can be used to reliably assess the quality of clinical information updating tools. We found many shortcomings in currently available clinical knowledge updating tools. Ideally, these tools will evolve in the direction of applying basic evidence-based medicine principles to new medical information in order to increase their usefulness to clinicians.

UR - http://www.scopus.com/inward/record.url?scp=78650187268&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=78650187268&partnerID=8YFLogxK

U2 - 10.1016/j.ijmedinf.2010.08.004

DO - 10.1016/j.ijmedinf.2010.08.004

M3 - Article

C2 - 20951081

AN - SCOPUS:78650187268

VL - 79

SP - 824

EP - 831

JO - International Journal of Medical Informatics

JF - International Journal of Medical Informatics

SN - 1386-5056

IS - 12

ER -