Purpose: Clinicians are overwhelmed by the sheer volume of new clinical information published daily. Despite the availability of tools for finding this information and for updating clinical knowledge, no study has examined the quality of current information alerting services.

Methods: We developed a 7-item checklist based on the principles of evidence-based medicine and assessed its content validity with experts and its face validity with practicing clinicians and clinician researchers. A list of clinical information updating tools (push tools) was generated systematically, and two independent raters used the checklist to rate the quality of these tools. Before rating the full set of tools, the raters were trained to good agreement (>80%) by applying the checklist to two sets of three randomly selected tools. Descriptive statistics were used to summarize the quality of the identified tools, and inter-rater reliability was assessed using the intraclass correlation coefficient (ICC).

Results: Our systematic search identified eighteen tools. The average quality score was 2.72 (range 0-7), and only two tools met all suggested quality criteria. Inter-rater reliability for the 7-item checklist was 0.82 (ICC).

Conclusions: We developed a checklist that can be used to reliably assess the quality of clinical information updating tools, and we found many shortcomings in those currently available. Ideally, these tools will evolve toward applying basic evidence-based medicine principles to new medical information, increasing their usefulness to clinicians.
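The inter-rater reliability statistic reported above can be computed from the two raters' checklist scores. A minimal sketch follows, assuming the common ICC(2,1) form (two-way random effects, absolute agreement, single rater); the function name and the toy scores are illustrative, not the study's data.

```python
def icc_2_1(scores):
    """ICC(2,1) for a subjects-by-raters score table.

    scores: list of rows, one row per rated tool, one column per rater.
    Illustrative implementation of the standard two-way ANOVA formulation.
    """
    n = len(scores)        # number of subjects (tools)
    k = len(scores[0])     # number of raters
    grand = sum(sum(row) for row in scores) / (n * k)
    row_means = [sum(row) / k for row in scores]
    col_means = [sum(scores[i][j] for i in range(n)) / n for j in range(k)]

    # Two-way ANOVA sums of squares
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for row in scores for x in row)
    ss_err = ss_total - ss_rows - ss_cols

    ms_r = ss_rows / (n - 1)              # between-subjects mean square
    ms_c = ss_cols / (k - 1)              # between-raters mean square
    ms_e = ss_err / ((n - 1) * (k - 1))   # residual mean square

    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# Hypothetical 0-7 checklist scores from two raters for five tools:
ratings = [[0, 1], [2, 2], [4, 3], [6, 6], [7, 7]]
print(round(icc_2_1(ratings), 2))
```

High agreement between the raters, as in the toy data, yields an ICC near 1; an ICC of 0.82, as reported, indicates good but imperfect agreement.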