Tuesday, June 23, 2015
New publication by Kairn Kelley
Congratulations to PhD candidate Kairn Kelley, MS, whose research just appeared in the International Journal of Audiology.
Int J Audiol. 2015 Jun 22:1-7. [Epub ahead of print]
Objective: To determine whether rater agreement is randomly distributed or varies importantly with test-taker characteristics, test words, or rater experience with the dichotic words test (DWT).
Design: The DWT was administered to 34 children in grades 1-4, and their responses were scored by two raters. The proportion of rater agreement was calculated for each child and for each test word, and correlates of inter-rater agreement were explored.
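For readers curious how such agreement proportions are tallied, here is a minimal sketch. It is not the authors' code; the data layout, column names, and values are assumptions for illustration only.

```python
import pandas as pd

# One row per scored response: which child, which test word, and each
# rater's judgment of the response (True = credited as correct).
responses = pd.DataFrame({
    "child_id": [1, 1, 2, 2],
    "word":     ["dog", "cat", "dog", "cat"],
    "rater_a":  [True, False, True, True],
    "rater_b":  [True, True,  True, True],
})

# A response counts as agreement when both raters made the same judgment.
responses["agree"] = responses["rater_a"] == responses["rater_b"]

# Proportion of inter-rater agreement for each child and for each test word.
agreement_by_child = responses.groupby("child_id")["agree"].mean()
agreement_by_word = responses.groupby("word")["agree"].mean()

print(agreement_by_child)
print(agreement_by_word)
```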
Study sample: Two raters judged 6686 total responses from 34 children.
Results: Overall agreement between the two raters was 0.97. Test-taker scores ranged from 35% to 91% (mean = 81%). Agreement was associated with score but not with test-taker age or sex. Test words spanned the full range of difficulty (pass proportion 0.06-1.00). Rater agreement was not randomly distributed among the words. Inter-rater agreement for test words ranged from 0.82 to 1.00 and was associated with pass proportion (Spearman's ρ = 0.28; p < 0.0001). However, there were words at all pass proportions with perfect or near-perfect agreement. Rater agreement improved from 0.94 on the first day of data collection to 0.98 on the fifth day (p = 0.026).
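The word-level association reported above is a rank correlation between each word's pass proportion and its inter-rater agreement. A minimal sketch of that kind of computation, using made-up per-word values rather than the study data, might look like this:

```python
import pandas as pd
from scipy.stats import spearmanr

# Hypothetical per-word summary: one row per test word, with its pass
# proportion and its inter-rater agreement. Values are illustrative only,
# not taken from the study.
per_word = pd.DataFrame({
    "word":            ["dog", "cat", "ice", "pig"],
    "pass_proportion": [0.95, 0.40, 0.10, 0.75],
    "agreement":       [1.00, 0.88, 0.85, 0.97],
})

# Spearman's rho tests for a monotonic association between word difficulty
# (pass proportion) and how consistently the two raters scored the word.
rho, p_value = spearmanr(per_word["pass_proportion"], per_word["agreement"])
print(f"Spearman's rho = {rho:.2f}, p = {p_value:.4f}")
```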
Conclusions: Inter-rater reliability should be considered along with test item difficulty when developing speech audiometry materials, scoring protocols, and rater training.