Comparison of onsite versus online chart reviews as part of the American College of Radiation Oncology accreditation program

Jaroslaw T. Hepel, Dwight E. Heron, Arno J. Mundt, Catheryn Yashar, Steven Feigenberg, Gordon Koltis, William F. Regine, Dheerendra Prasad, Shilpen Patel, Navesh Sharma, Mary Hebert, Norman Wallis, Michael Kuettel

Research output: Contribution to journal › Review article

Abstract

Purpose: Accreditation based on peer review of professional standards of care is essential to ensuring quality and safety in the administration of radiation therapy. Traditionally, medical chart reviews have been performed during a physical onsite visit. The American College of Radiation Oncology Accreditation Program has remodeled its process so that electronic charts are reviewed remotely.

Methods: Twenty-eight radiation oncology practices undergoing accreditation had three charts per practice undergo both onsite and online review. Onsite review was performed by a single reviewer for each practice. Online review was performed by one or more disease site-specific reviewers for each practice. Onsite and online reviews were blinded and scored on a 100-point scale across 20 categories. A score below 75 was failing, and a score of 75 to 79 was marginal. Any failed chart underwent rereview by a disease site team leader.

Results: Eighty-four charts underwent both onsite and online review. Mean scores were 86.0 points for charts reviewed onsite and 86.9 points for charts reviewed online; the difference was not statistically significant (P = .43). Of the charts reviewed, 21% had a marginal (n = 8) or failing (n = 10) score. There was no difference between onsite and online reviews in the rate of failing charts (P = .48) or of combined marginal and failing charts (P = .13).

Conclusion: The American College of Radiation Oncology accreditation process of online chart review yields review scores and rates of failing scores comparable to those of traditional onsite review. Moreover, the online process holds less potential for bias because it uses multiple reviewers per practice, and it allows greater oversight through rereview by a disease site team leader.
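The abstract reports paired comparisons (mean chart scores and P values) without naming the statistical tests used. The sketch below is a minimal, assumed illustration of how such a paired comparison could be computed: a paired t-test for the 100-point scores and McNemar's test for the pass/fail classifications. Both the choice of tests and the score arrays are assumptions for illustration only and are not taken from the study.

```python
# Hedged sketch: the tests (paired t-test, McNemar) and the scores below are
# illustrative assumptions, NOT the study's actual data or stated methods.
import numpy as np
from scipy.stats import ttest_rel
from statsmodels.stats.contingency_tables import mcnemar

# Illustrative paired scores (0-100 scale) for the same charts under each review mode
onsite = np.array([88, 92, 74, 81, 95, 78, 86, 90])
online = np.array([90, 91, 76, 80, 96, 75, 88, 89])

# Paired comparison of mean chart scores
t_stat, p_scores = ttest_rel(onsite, online)
print(f"Mean onsite {onsite.mean():.1f} vs online {online.mean():.1f}, P = {p_scores:.2f}")

# Failing threshold from the abstract: score < 75
onsite_fail = onsite < 75
online_fail = online < 75

# 2x2 agreement table of pass/fail status; McNemar's test uses the discordant cells
table = np.array([
    [np.sum(~onsite_fail & ~online_fail), np.sum(~onsite_fail & online_fail)],
    [np.sum(onsite_fail & ~online_fail), np.sum(onsite_fail & online_fail)],
])
p_fail = mcnemar(table, exact=True).pvalue
print(f"Failing-rate comparison (McNemar), P = {p_fail:.2f}")
```

With 84 paired charts, as in the study, either a paired t-test or a nonparametric alternative (e.g., Wilcoxon signed-rank) would be a reasonable choice for the score comparison; the abstract does not specify which was used.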

Original language: English (US)
Pages (from-to): e516-e520
Journal: Journal of Oncology Practice
Volume: 13
Issue number: 5
DOI: https://doi.org/10.1200/JOP.2016.015230
State: Published - May 1, 2017

Fingerprint

Radiation Oncology
Accreditation
Peer Review
Standard of Care
Radiotherapy
Safety

All Science Journal Classification (ASJC) codes

  • Oncology
  • Oncology (nursing)
  • Health Policy

Cite this

Hepel, J. T., Heron, D. E., Mundt, A. J., Yashar, C., Feigenberg, S., Koltis, G., ... Kuettel, M. (2017). Comparison of onsite versus online chart reviews as part of the American College of Radiation Oncology accreditation program. Journal of Oncology Practice, 13(5), e516-e520. https://doi.org/10.1200/JOP.2016.015230
Hepel, Jaroslaw T.; Heron, Dwight E.; Mundt, Arno J.; Yashar, Catheryn; Feigenberg, Steven; Koltis, Gordon; Regine, William F.; Prasad, Dheerendra; Patel, Shilpen; Sharma, Navesh; Hebert, Mary; Wallis, Norman; Kuettel, Michael. / Comparison of onsite versus online chart reviews as part of the American College of Radiation Oncology accreditation program. In: Journal of Oncology Practice. 2017; Vol. 13, No. 5. pp. e516-e520.
@article{f59410e713a24a289930089055521bb5,
title = "Comparison of onsite versus online chart reviews as part of the American college of radiation oncology accreditation program",
abstract = "Purpose Accreditation based on peer review of professional standards of care is essential in ensuring quality and safety in administration of radiation therapy. Traditionally, medical chart reviews have been performed by a physical onsite visit. The American College of Radiation Oncology Accreditation Program has remodeled its process whereby electronic charts are reviewed remotely. Methods Twenty-eight radiation oncology practices undergoing accreditation had three charts per practice undergo both onsite and online review. Onsite review was performed by a single reviewer for each practice. Online review consisted of one or more disease site-specific reviewers for each practice. Onsite and online reviews were blinded and scored on a 100-point scale on the basis of 20 categories. A score of less than 75 was failing, and a score of 75 to 79 was marginal. Any failed charts underwent rereview by a disease site team leader. Results Eighty-four charts underwent both onsite and online review. The mean scores were 86.0 and 86.9 points for charts reviewed onsite and online, respectively. Comparison of onsite and online reviews revealed no statistical difference in chart scores (P = .43). Of charts reviewed, 21{\%} had a marginal (n = 8) or failing (n = 10) score. There was no difference in failing charts (P = .48) or combined marginal and failing charts (P = .13) comparing onsite and online reviews. Conclusion The American College of Radiation Oncology accreditation process of online chart review results in comparable review scores andrate of failing scorescompared with traditional onsite review. However, the modern online process holds less potential for bias by using multiple reviewers per practice and allows for greater oversight via disease site team leader rereview.",
author = "Hepel, {Jaroslaw T.} and Heron, {Dwight E.} and Mundt, {Arno J.} and Catheryn Yashar and Steven Feigenberg and Gordon Koltis and Regine, {William F.} and Dheerendra Prasad and Shilpen Patel and Navesh Sharma and Mary Hebert and Norman Wallis and Michael Kuettel",
year = "2017",
month = "5",
day = "1",
doi = "10.1200/JOP.2016.015230",
language = "English (US)",
volume = "13",
pages = "e516--e520",
journal = "Journal of Oncology Practice",
issn = "1554-7477",
publisher = "American Society of Clinical Oncology",
number = "5",

}

Hepel, JT, Heron, DE, Mundt, AJ, Yashar, C, Feigenberg, S, Koltis, G, Regine, WF, Prasad, D, Patel, S, Sharma, N, Hebert, M, Wallis, N & Kuettel, M 2017, 'Comparison of onsite versus online chart reviews as part of the American College of Radiation Oncology accreditation program', Journal of Oncology Practice, vol. 13, no. 5, pp. e516-e520. https://doi.org/10.1200/JOP.2016.015230

Comparison of onsite versus online chart reviews as part of the American College of Radiation Oncology accreditation program. / Hepel, Jaroslaw T.; Heron, Dwight E.; Mundt, Arno J.; Yashar, Catheryn; Feigenberg, Steven; Koltis, Gordon; Regine, William F.; Prasad, Dheerendra; Patel, Shilpen; Sharma, Navesh; Hebert, Mary; Wallis, Norman; Kuettel, Michael.

In: Journal of Oncology Practice, Vol. 13, No. 5, 01.05.2017, pp. e516-e520.

Research output: Contribution to journal › Review article

TY - JOUR

T1 - Comparison of onsite versus online chart reviews as part of the American College of Radiation Oncology accreditation program

AU - Hepel, Jaroslaw T.

AU - Heron, Dwight E.

AU - Mundt, Arno J.

AU - Yashar, Catheryn

AU - Feigenberg, Steven

AU - Koltis, Gordon

AU - Regine, William F.

AU - Prasad, Dheerendra

AU - Patel, Shilpen

AU - Sharma, Navesh

AU - Hebert, Mary

AU - Wallis, Norman

AU - Kuettel, Michael

PY - 2017/5/1

Y1 - 2017/5/1

N2 - Purpose Accreditation based on peer review of professional standards of care is essential in ensuring quality and safety in administration of radiation therapy. Traditionally, medical chart reviews have been performed by a physical onsite visit. The American College of Radiation Oncology Accreditation Program has remodeled its process whereby electronic charts are reviewed remotely. Methods Twenty-eight radiation oncology practices undergoing accreditation had three charts per practice undergo both onsite and online review. Onsite review was performed by a single reviewer for each practice. Online review consisted of one or more disease site-specific reviewers for each practice. Onsite and online reviews were blinded and scored on a 100-point scale on the basis of 20 categories. A score of less than 75 was failing, and a score of 75 to 79 was marginal. Any failed charts underwent rereview by a disease site team leader. Results Eighty-four charts underwent both onsite and online review. The mean scores were 86.0 and 86.9 points for charts reviewed onsite and online, respectively. Comparison of onsite and online reviews revealed no statistical difference in chart scores (P = .43). Of charts reviewed, 21% had a marginal (n = 8) or failing (n = 10) score. There was no difference in failing charts (P = .48) or combined marginal and failing charts (P = .13) comparing onsite and online reviews. Conclusion The American College of Radiation Oncology accreditation process of online chart review results in comparable review scores and rate of failing scores compared with traditional onsite review. However, the modern online process holds less potential for bias by using multiple reviewers per practice and allows for greater oversight via disease site team leader rereview.

AB - Purpose Accreditation based on peer review of professional standards of care is essential in ensuring quality and safety in administration of radiation therapy. Traditionally, medical chart reviews have been performed by a physical onsite visit. The American College of Radiation Oncology Accreditation Program has remodeled its process whereby electronic charts are reviewed remotely. Methods Twenty-eight radiation oncology practices undergoing accreditation had three charts per practice undergo both onsite and online review. Onsite review was performed by a single reviewer for each practice. Online review consisted of one or more disease site-specific reviewers for each practice. Onsite and online reviews were blinded and scored on a 100-point scale on the basis of 20 categories. A score of less than 75 was failing, and a score of 75 to 79 was marginal. Any failed charts underwent rereview by a disease site team leader. Results Eighty-four charts underwent both onsite and online review. The mean scores were 86.0 and 86.9 points for charts reviewed onsite and online, respectively. Comparison of onsite and online reviews revealed no statistical difference in chart scores (P = .43). Of charts reviewed, 21% had a marginal (n = 8) or failing (n = 10) score. There was no difference in failing charts (P = .48) or combined marginal and failing charts (P = .13) comparing onsite and online reviews. Conclusion The American College of Radiation Oncology accreditation process of online chart review results in comparable review scores and rate of failing scores compared with traditional onsite review. However, the modern online process holds less potential for bias by using multiple reviewers per practice and allows for greater oversight via disease site team leader rereview.

UR - http://www.scopus.com/inward/record.url?scp=85026823734&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85026823734&partnerID=8YFLogxK

U2 - 10.1200/JOP.2016.015230

DO - 10.1200/JOP.2016.015230

M3 - Review article

C2 - 28301278

AN - SCOPUS:85026823734

VL - 13

SP - e516

EP - e520

JO - Journal of Oncology Practice

JF - Journal of Oncology Practice

SN - 1554-7477

IS - 5

ER -