
Inter-rater agreement in the triage of calls to a paediatric interhospital transfer service

Introduction

As a result of the centralisation of PICU services in the United Kingdom, transfer of critically ill children has become common over the past decade. Retrieval services frequently receive multiple requests simultaneously, so a tool to prioritise the urgency of these requests would be beneficial. Our aim was to develop such a tool and assess its inter-rater repeatability.

Methods

The tool was developed by three senior medical staff of the South Thames Retrieval Service (operating from the PICU at Evelina Children's Hospital, London with 1,000 calls per annum from 24 district general hospitals, resulting in 600 retrievals). A modified Delphi method was used, which comprised an iterative process including a literature review, knowledge of the underlying conditions and a review of retrievals performed by the service over the previous 7 years (n = 3,669). Inter-rater agreement was assessed using the weighted kappa statistic, and was measured between various pairings of junior and senior medical staff (n = 28 combinations) on 50 retrieval episodes.
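The abstract does not state which weighting scheme was used for the kappa statistic; as an illustration only, a linearly weighted kappa for one pair of raters could be computed as below (function name and weighting choice are ours, not the authors'):

```python
from collections import Counter

def weighted_kappa(rater_a, rater_b, n_categories):
    """Linearly weighted kappa for two raters scoring the same episodes.

    Scores are integers in [0, n_categories - 1]. Illustrative sketch:
    the abstract does not specify the weighting scheme actually used.
    """
    n = len(rater_a)
    # Observed joint counts and per-rater marginal counts.
    obs = Counter(zip(rater_a, rater_b))
    marg_a = Counter(rater_a)
    marg_b = Counter(rater_b)
    k = n_categories
    po = pe = 0.0  # observed and chance-expected weighted disagreement
    for i in range(k):
        for j in range(k):
            # Disagreement weight: 0 on the diagonal, growing with distance.
            w = abs(i - j) / (k - 1)
            po += w * obs.get((i, j), 0) / n
            pe += w * (marg_a.get(i, 0) / n) * (marg_b.get(j, 0) / n)
    # With disagreement weights, kappa = 1 - observed/expected disagreement.
    return 1.0 - po / pe
```

In the study, such a statistic would be computed for each of the 28 junior/senior rater pairings over the 50 retrieval episodes.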

Results

The final tool comprised five categories (three levels of severity each), allowing a range of scores from 0 to 15 (Figure 1). Three levels of urgency were defined: semi-urgent (score <8), urgent (score 8–10) and immediate (score >10). Overall, the tool showed good to very good inter-rater agreement (kappa scores ranging from 0.65 to 0.88; Figure 2). There were no obvious differences between levels of staff seniority.
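The urgency bands reported above translate directly into a threshold rule. A minimal sketch (the function name is ours; the thresholds are those stated in the Results):

```python
def urgency_band(score: int) -> str:
    """Map a total triage score (0-15) to an urgency band.

    Thresholds follow the Results section: semi-urgent (<8),
    urgent (8-10), immediate (>10). Illustrative only.
    """
    if not 0 <= score <= 15:
        raise ValueError("score must be between 0 and 15")
    if score < 8:
        return "semi-urgent"
    if score <= 10:
        return "urgent"
    return "immediate"
```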

Figure 1

Figure 2

Conclusion

The score showed acceptable agreement, fulfilling the first step of validation.


Cite this article

Riphagen, S., Alasnag, M., Hanna, S. et al. Inter-rater agreement in the triage of calls to a paediatric interhospital transfer service. Crit Care 11, P440 (2007). https://0-doi-org.brum.beds.ac.uk/10.1186/cc5600


Keywords

  • Kappa Statistic
  • Weighted Kappa
  • Delphi Method
  • Staff Seniority
  • Acceptable Agreement