The presence of radiological features on chest radiographs: How well do clinicians agree?
Edwards M., Lawson Z., Morris S., Evans A., Harrison S., Isaac R., Crocker J., Powell C.
Aim: To compare levels of agreement amongst paediatric clinicians with those amongst consultant paediatric radiologists when interpreting chest radiographs (CXRs).

Materials and methods: Four paediatric radiologists used picture archiving and communication system (PACS) workstations to evaluate, independently in each of 30 CXRs, the presence of five radiological features of infection. The radiographs were obtained over 1 year (2008) from children aged 6 months to <16 years with fever and signs of respiratory distress. The same CXRs were interpreted a second time by the paediatric radiologists, and by 21 clinicians with varying levels of experience, using the Web 1000 viewing system and a projector. Intra- and interobserver agreement within groups, split by grade and specialty, were analysed using free-marginal multi-rater kappa.

Results: Normal CXRs were identified consistently by all 25 participants. The four paediatric radiologists showed high levels of intraobserver agreement between methods (kappa scores between 0.53 and 1.00) and of interobserver agreement for each method (kappa scores between 0.67 and 0.96 for PACS assessment). The 21 clinicians showed varying levels of agreement, with kappa scores from 0.21 to 0.89.

Conclusion: Paediatric radiologists showed high levels of agreement for all features. In general, the clinicians had lower levels of agreement than the radiologists. This study highlights the need for improved training of clinicians in interpreting CXRs, and for timely reporting of CXRs by radiologists, to allow appropriate patient management. © 2012 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
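The free-marginal multi-rater kappa used in the methods (often called Randolph's kappa) can be sketched as follows; the function name and data layout below are illustrative assumptions, not taken from the study. For N cases rated by n raters into k categories, observed agreement P_o is derived from the per-category rater counts for each case, and chance agreement is fixed at P_e = 1/k (the "free-marginal" assumption, appropriate when raters are not forced to assign a set number of cases to each category):

```python
def free_marginal_kappa(ratings, k):
    """Randolph's free-marginal multi-rater kappa.

    ratings: list of cases; each case is a list of length k giving the
             number of raters who assigned that case to each category.
             Every case must have the same total number of raters.
    k:       number of rating categories.
    """
    N = len(ratings)              # number of cases
    n = sum(ratings[0])           # raters per case (assumed constant)
    # Observed agreement: proportion of agreeing rater pairs per case.
    Po = sum(sum(c * c for c in case) - n for case in ratings) / (N * n * (n - 1))
    Pe = 1.0 / k                  # free-marginal chance agreement
    return (Po - Pe) / (1 - Pe)

# Example: 4 raters, 2 categories (feature present / absent).
# Perfect agreement on both cases gives kappa = 1.0.
print(free_marginal_kappa([[4, 0], [0, 4]], 2))
```

With a 4-raters/2-categories split of [2, 2] on every case, P_o falls to 1/3 and the kappa goes negative, illustrating how the statistic penalises agreement no better than chance.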