Performance Measurement Network

Question: Inter-rater Reliability and Chart Re-abstraction

Standard DSPM.3 EP4 states that the program monitors data reliability and validity. We currently do random checks of charts to verify that they are being abstracted correctly. Where would we find specific expectations for this internal validation process? How many charts should be reviewed and over what time frame? What indices are to be checked and who may do the validation?


All DSC-certified programs should perform some data re-abstraction to check data quality and validity, as required in standard DSPM.3 (EP4 – The program monitors data reliability and validity). Some programs have asked how best to meet this requirement. Several methodologies can be used to perform record re-abstraction (e.g., Data Element Agreement Rate (DEAR), Category Assignment Agreement Rate (CAAR)); however, *no single methodology is required by The Joint Commission.*
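As an illustration only (this is not an official Joint Commission algorithm, and the function and element names are hypothetical), an agreement rate in the spirit of DEAR can be sketched as the percentage of abstracted data elements on which the original abstractor and the re-abstractor agree:

```python
def data_element_agreement_rate(original, reabstracted):
    """Percent of data elements on which two abstractors agree.

    `original` and `reabstracted` map data element names to the values
    each abstractor recorded for one medical record. Only elements
    present in both abstractions are compared.
    """
    shared = set(original) & set(reabstracted)
    if not shared:
        return 0.0
    matches = sum(1 for name in shared if original[name] == reabstracted[name])
    return 100.0 * matches / len(shared)

# Example: the two abstractors agree on 3 of 4 shared elements.
first = {"arrival_time": "10:05", "nihss": 4, "tpa_given": "Y", "dc_status": "home"}
second = {"arrival_time": "10:05", "nihss": 4, "tpa_given": "N", "dc_status": "home"}
print(data_element_agreement_rate(first, second))  # 75.0
```

A CAAR-style check is analogous but compares the final measure category assignment (e.g., numerator/denominator/exclusion) per record rather than individual data elements.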

A simple method is to have "5 medical records per quarter reviewed by a second abstractor (other than the individual who performed the initial review, if at all possible)." The second reviewer should obtain the same numerator and denominator values reported by the first abstractor. For the STK measure set, chart re-abstraction should be performed according to the version of the specifications manual in effect during the reporting month (e.g., Specifications Manual Version 3.1a is applicable to discharges April 1 through September 30, 2010). If you have a relationship with an ORYX core measure vendor, they may be able to offer further guidance.

Question Details
Focus area(s): Chart Abstracted Measure Specifications – Clinical, Related Manual - Data Quality Manual
Related documents: STK
