Excerpted with Permission from For The Record
January 2018, Vol. 30 No. 1 P. 14
by Lisa Eramo, MA
According to HIM industry experts, health care organizations must be on the lookout for the following pitfalls when evaluating their clinical documentation improvement (CDI) programs.
1. Failure to Look Beyond the Data
Data tell a story, but do they tell the entire story? Not necessarily, which is why organizations need to examine anecdotal data as well, says Tiffany McCarthy, RHIT, manager of HIM solutions at GeBBS Healthcare Solutions. For example, ask physicians, CDI specialists, and coders to evaluate whether everyone works collaboratively. If not, what are the challenges and hurdles?
Fran Jurcak, MSN, RN, CCDS, vice president of clinical innovation at Iodine Software, agrees that interaction is key. “There’s still that level of engagement at the physician level that needs to occur. There still needs to be communication between CDI specialists and coders,” she says. “Helping each other to understand the clinical and coding issues related to appropriate documentation in a medical record is key to success for the organization.”
2. Assuming Increased Revenue Equates to Success
Although increased revenue may be the goal of a CDI program, organizations should consider how they compare with other facilities, says Amber Sterling, RN, BSN, CCDS, director of CDI services at TrustHCS. What do PEPPER and MedPAR data reveal? How does the organization compare with state, regional, and national averages? Does the analysis indicate any data anomalies that could raise a red flag with auditors?
3. Failure to Formalize Metrics in a Policy
Jurcak says a program policy should include the following:
- specific metrics;
- the definition of each metric;
- clarification of the relationship between these metrics and the CDI program’s mission; and
- benchmarking sources (e.g., professional associations, MedPAR data, and internal averages).
Creating a policy ensures all CDI staff report data consistently. It also helps organizations compare themselves with other facilities, Sterling says. For example, if the organization calculates query volume as the number of queries per case, it doesn't make sense to compare that figure with one from a facility that reports the number of cases queried.
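A minimal, illustrative calculation (the case counts and query counts are invented for this sketch, not drawn from the article) shows why those two query-volume definitions aren't interchangeable: the same month of data yields very different numbers depending on which definition a facility uses.

```python
# Hypothetical monthly CDI data: number of queries issued on each reviewed case.
queries_by_case = [0, 2, 1, 0, 3, 0, 1, 0]  # 8 cases reviewed this month

total_cases = len(queries_by_case)
total_queries = sum(queries_by_case)
cases_with_query = sum(1 for q in queries_by_case if q > 0)

# Definition A: queries per case reviewed
queries_per_case = total_queries / total_cases        # 7 / 8 = 0.875

# Definition B: share of cases that received at least one query
pct_cases_queried = cases_with_query / total_cases    # 4 / 8 = 0.5

print(f"Queries per case:   {queries_per_case:.2f}")   # 0.88
print(f"Cases queried rate: {pct_cases_queried:.0%}")  # 50%
```

With multiple queries allowed per case, definition A can exceed 1.0 while definition B never can, so benchmarking one facility's A against another's B would mislead both.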
4. Misaligning Metrics and Mission
The CDI mission refers to the program’s overarching goal. For example, is it to increase reimbursement? Improve quality? Both?
Once defined, the mission can help organizations choose appropriate metrics to measure performance. Financial impact, for example, becomes less important when the mission is to improve quality, Jurcak says. “If you truly understand the mission, you can better hold your staff accountable,” she notes.
5. Reviewing Metrics Too Frequently — or Infrequently
While daily reviews don’t provide insight into larger trends, Jurcak says analyzing metrics monthly allows for quick adjustments.
However, don’t expect immediate improvements after an intervention. “Instant change isn’t going to necessarily result in instant benefits or improvements in the metrics next month. You have to watch it over time,” Jurcak says.