Kah Long Aw
Queen's University Belfast, UK
Scientific Tracks Abstracts: Gen Med
To determine the interrater variability for TIA diagnostic agreement among expert clinicians (neurologists/stroke physicians), administrative data, and non-specialists, we performed a meta-analysis of studies published between January 1984 and January 2019, searching MEDLINE, EMBASE, and PubMed. Two reviewers independently screened for eligible studies and extracted interrater variability measurements; diagnostic agreement was assessed using Cohen's kappa. Nineteen original studies comprising 19,421 patients were included. Expert clinicians demonstrated good agreement for TIA diagnosis (κ = 0.71, 95% confidence interval [CI] = 0.62–0.81). Agreement between clinicians' TIA diagnoses and administrative data was also good (κ = 0.68, 95% CI = 0.62–0.74). There was moderate agreement (κ = 0.41, 95% CI = 0.22–0.61) between referring clinicians and the clinicians at TIA clinics receiving the referrals. Sixty percent of 748 patient referrals to TIA clinics were TIA mimics. Overall agreement between expert clinicians was good for TIA diagnosis, although variation still existed for a sizeable proportion of cases. Diagnostic agreement for TIA decreased among non-specialists. The number of patients referred to TIA clinics with other (often neurologic) diagnoses was substantial, suggesting that TIA clinics should be run by clinicians who are proficient in managing TIAs and their mimics.
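For reference, the κ values reported above follow Cohen's standard definition, which corrects the observed agreement between raters for the agreement expected by chance:

\[
\kappa = \frac{p_o - p_e}{1 - p_e}
\]

where \(p_o\) is the observed proportion of agreement and \(p_e\) is the proportion of agreement expected by chance given each rater's marginal diagnostic rates; κ = 1 indicates perfect agreement and κ = 0 indicates agreement no better than chance.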