Updated on: June 22, 2012

MS-DRG Validation and Data Analytics: The Perfect Fit: Part II

By Carol Spencer, RHIA, CCS, CHDA

Original story posted on: March 23, 2010
ED. NOTE: This is the conclusion to Carol Spencer’s article on MS-DRG Validation and Data Analytics. Part I appeared in RACMonitor.Enews on March 18, 2010 and is currently posted on the RACMonitor Web site.

There are both internal and external triggers for identifying high-risk MS-DRGs, coding issues and medical necessity concerns.

Internal triggers may be trends identified during internal or corporate MS-DRG, coding or medical necessity audits; new-hire education and training; or performance review audits. External triggers may be identified by examining Recovery Audit Contractor (RAC)-approved issues for MS-DRG and coding audits, RAC contractor demonstration results, Comprehensive Error Rate Testing (CERT) reports, Program for Evaluating Payment Patterns Electronic Report (PEPPER) Reports, Department of Health & Human Services’ Office of Inspector General (OIG) work plans and reports, private payer (such as Blue Cross and Blue Shield) audit findings and insurance denial reports. 

Understandably, hospital clinical staff can identify visual evidence of a risk of falling and promptly provide excellent service to avoid it. The clinical, Health Information Management (HIM), financial and information technology teams need to be just as acutely aware of the “visible” clues of improper payments through the analysis of claims data. As in step two above, this requires increased communication among the healthcare teams and an understanding of risk areas during “handoffs” to ensure that improper claims are not submitted.


Consider another case study: a patient presents to the emergency department with abdominal pain, is registered as an observation patient and then is switched (handed off) to inpatient status the next day, even though the severity of illness or intensity of service did not justify an inpatient level of care. This two-day length-of-stay (LOS) abdominal pain case (at-risk MS-DRG 392) is then handed off to an HIM staff member, who codes the case and drops the inpatient claim for full MS-DRG reimbursement. The billing department then processes the claim to the payer, with the potential for an improper payment.

 

In the above example, how many handoffs occurred? How many times per day do handoffs go unchecked, thus potentially contributing to large sums in improper payments? Data analytics assist in controlling this process and preventing improper claims, which, in turn, result in increased data integrity, revenue integrity and quality data reporting.


After identifying a list of at-risk MS-DRGs, data mining analyzes the code assignments on the claims and assists in selecting the highest-risk claims within the highest-risk MS-DRGs. Each MS-DRG has its own programmed trigger that flags potentially erroneous claims, and it is at this level that cases are flagged for review. A case may hit a sequencing edit, such as sepsis reported as a secondary diagnosis that was present on admission, paired with a respiratory principal diagnosis and a ventilator procedure code. If so, flag the case for review by a pre-bill auditor (preferably working concurrently) or a retrospective auditor to evaluate the documentation, medical necessity and coding and to ensure accuracy and the correct assignment of patient type.

Another example may be a symptom code as the principal diagnosis with a two-day LOS, a discharge home and low charges; this may signal a medical necessity or coding issue. Edits can flag cases with only one major complication or co-morbidity (MCC) or only one complication or co-morbidity (CC) for validation of the documentation against the code assignment. Flags also can be established for at-risk procedure codes such as excisional debridement or mechanical ventilation codes, and even for cases handled by newly hired coders or case managers “in training,” to ensure all at-risk cases are reviewed carefully until acceptable levels of accuracy are achieved. Flags for transfer MS-DRGs and high-risk discharge disposition codes, such as discharge disposition code 03 (skilled nursing facility), also would be useful.
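To make these programmed triggers concrete, the following is a minimal sketch of how a few of the flags described above might be expressed as rules over claim fields. The field names, diagnosis and procedure codes, dollar threshold and the set of transfer MS-DRGs are illustrative assumptions, not any vendor’s actual edit logic.

```python
# Minimal sketch of rule-based pre-bill flagging. Field names, codes and
# thresholds below are illustrative assumptions, not a vendor's edit library.
from dataclasses import dataclass, field


@dataclass
class Claim:
    account: str
    ms_drg: str
    principal_dx: str
    secondary_dx: list = field(default_factory=list)   # (code, poa_indicator) pairs
    procedures: list = field(default_factory=list)
    los: int = 0
    charges: float = 0.0
    discharge_disposition: str = "01"


def flag_claim(claim: Claim) -> list:
    """Return human-readable reasons to route this claim to an auditor."""
    reasons = []

    # Sequencing edit: sepsis coded as a secondary diagnosis, present on
    # admission, with a mechanical-ventilation procedure on the claim.
    sepsis_secondary_poa = any(code == "038.9" and poa == "Y"
                               for code, poa in claim.secondary_dx)
    ventilator = any(proc.startswith("96.7") for proc in claim.procedures)
    if sepsis_secondary_poa and ventilator:
        reasons.append("Possible sepsis sequencing issue (secondary, POA, ventilator)")

    # Symptom principal diagnosis with a short stay, discharge home, low charges.
    symptom_codes = {"789.00", "786.50"}          # abdominal pain, chest pain (examples)
    if (claim.principal_dx in symptom_codes and claim.los <= 2
            and claim.discharge_disposition == "01" and claim.charges < 10000):
        reasons.append("Symptom principal dx with short LOS and low charges")

    # Discharge disposition 03 (SNF) on a transfer MS-DRG.
    transfer_drgs = {"177", "190", "193", "194", "280"}   # illustrative subset
    if claim.discharge_disposition == "03" and claim.ms_drg in transfer_drgs:
        reasons.append("Discharge disposition 03 on a transfer MS-DRG")

    return reasons
```

In practice, each MS-DRG would carry its own trigger, and flagged accounts would be routed to a pre-bill work queue before the claim is released.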

The advantages of performing data analytics before an audit include fewer denials, fewer external audits and less duplication of work. Performing analytics also produces reliable results for decision-making and quality data, in addition to upholding revenue integrity, increasing patient satisfaction, safeguarding your hospital’s reputation, building knowledgeable workers and erasing departmental boundaries. It also helps build a commitment to ethics and compliance, and it assists clinical, HIM and financial operations through the adoption of enabling technology.

Disadvantages may include the need for more advance preparation time and the resources required to purchase and maintain a product. Still, most people carry a cell phone (or two) and don’t mind paying for services and technology that are meaningful to them. Soon there will come a time when you wonder how you ever lived without data analytics to help identify erroneous claims for correction prior to billing.



 

What steps should I follow to perform data mining?

To perform a non-probability sampling technique (essentially an audit with the purpose of selecting records with RAC-approved MS-DRGs) as part of an MS-DRG validation audit, take the following steps.

First, refer to Sampling: A Practical Guide for Quality Management in Home & Community-Based Waiver Programs (http://www.hsri.org/docs/QF_sampleguide.pdf). This guide, developed for CMS, provides a user-friendly, step-by-step approach to explaining sampling, identifying alternatives among sampling techniques and understanding how to use these techniques for specific purposes in a quality management strategy. The sampling method you select may differ from the example below.

1. Select your highest-risk MS-DRG.

2. Apply MS-DRG-specific coding, medical necessity and claims-analyzer edits to the claims-data file.

3. Perform a probe audit for each MS-DRG or other audit type (e.g., one-day LOS or transfer MS-DRGs representing multiple MS-DRGs), including payment data (UB-04 and remittance advice).

· If 25 or fewer claims are reported, review 100 percent.

· If 26 or more claims are reported, apply a statistically valid random selection technique. For example, you could use RAT-STATS, a package of statistical software tools developed by the OIG to assist in selecting random samples and evaluating audit results, available at http://oig.hhs.gov/organization/OAS/ratstats.asp; or Research Randomizer, available at www.randomizer.org. A simple programmatic alternative is sketched after this list.

4. Validate key components of the medical record documentation (wearing your “audit hat”) against official coding guidelines and inpatient admission screening guidelines (see section below titled “What key components of claims data am I data mining?”)

5. Track changes to MS-DRGs.

6. Track changes to patient type (inpatient versus outpatient observation).

7. Track changes to patient status (discharge disposition) on transfer MS-DRGs.

8. Track payment impact (overpayments and underpayments) for steps 5, 6 and 7.

9. Report improper payment by account and track all rebills.

10. Calculate the paid claims error rate (PCER).

a. Compare your PCER to the CERT report’s PCERs (see http://www.cms.hhs.gov/CERT/CR/list.asp#TopOfPage).


b. Expect over-weighted results (significantly higher rates of improper payment, or higher PCERs) due to the use of the non-probability sampling technique, which is indicative of a “purposive” audit. A purposive audit is one performed for a specific purpose, such as auditing records similar to those audited by RACs; it can provide a worst-case scenario in anticipation of the types of findings the RACs may identify.


11. If the results confirm patterns for overpayment, perform an expanded probe audit (double the original probe number).


a. If 50 or fewer claims are reported, review 100 percent.

b. If 51 or more claims are reported, apply a randomizer technique to select up to 50 records.


12. Report improper payment by account and track all rebills.

13. Calculate the PCER.

14. If the expanded probe audit confirms overpayment patterns, perform a full audit of 100 percent of the records in the universe for that MS-DRG.

15. Report improper payment by account and track all rebills.

16. Calculate the PCER.

17. Determine the root cause.

18. Implement corrective action.

19. Re-audit and measure for improvement.
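For readers who prefer a programmatic alternative to RAT-STATS or Research Randomizer (as referenced in step 3), here is a minimal sketch of the probe-selection rule from steps 3 and 11: review everything when the universe is small, otherwise draw a reproducible random sample. The use of Python and the fixed seed values are assumptions for illustration only.

```python
# Sketch of the probe-audit selection rule in steps 3 and 11: review all claims
# when the universe is small, otherwise draw a reproducible random sample.
import random


def select_probe_sample(claim_ids, threshold=25, sample_size=25, seed=2010):
    """Return the claim IDs to review for one MS-DRG (or other audit type)."""
    if len(claim_ids) <= threshold:
        return list(claim_ids)              # threshold or fewer: review 100 percent
    rng = random.Random(seed)               # documented seed keeps the draw repeatable
    return rng.sample(list(claim_ids), sample_size)


def select_expanded_probe(claim_ids, seed=2011):
    """Expanded probe (step 11): double the original probe numbers."""
    return select_probe_sample(claim_ids, threshold=50, sample_size=50, seed=seed)
```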


 

How do I perform data mining with the probability-sample technique?

You should take the following steps to perform this technique, which is a stratified (or systematic) random sampling. As mentioned above, refer to CMS’s Sampling guide; again, the sampling method you select may differ from the example below.

1. Select your highest-risk MS-DRG.

2. Apply a randomizer tool to the claims-data file.

3. Pull 30 records per MS-DRG or per type (for example, one-day LOS representing multiple MS-DRGs or transfer MS-DRGs representing multiple MS-DRGs).

4. Pull the randomly selected records and payment data (UB-04 and remittance advice).

5. Put on your audit hat to validate key components of the medical record documentation against official coding guidelines and inpatient admission screening guidelines (see section below titled “What key components of claims data am I data mining?”)

6. Track changes to MS-DRG.

7. Track changes to patient type (inpatient versus outpatient observation).

8. Track changes to patient status (discharge disposition) on transfer MS-DRGs.

9. Track payment impact (overpayments and underpayments) for steps 6, 7 and 8.

10. Report improper payment by account and track all rebills.

11. Calculate the PCER. Compare this to the CERT report’s PCERs by hospital provider and by MS-DRGs.

12. Extrapolate the payment impact to the entire population (a simple extrapolation approach is sketched after this list). If the improper payment amount triggers self-reporting, discuss the situation with your compliance officer or legal department.

13. Implement corrective action.

14. Re-audit and measure for improvement.
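The following is a minimal sketch of steps 2, 3 and 12 above: drawing 30 random records per MS-DRG stratum and then extrapolating the observed improper payment to the full universe. The data layout and the simple mean-per-claim projection are assumptions for illustration; confirm the statistical method (for example, with RAT-STATS or a statistician) before acting on any extrapolated figure.

```python
# Sketch of a stratified random sample (30 records per MS-DRG) and a simple
# mean-per-claim extrapolation of improper payment to the universe (step 12).
import random


def stratified_sample(claims_by_drg, per_stratum=30, seed=2010):
    """claims_by_drg maps each MS-DRG to a list of claim IDs (illustrative layout)."""
    rng = random.Random(seed)
    sample = {}
    for drg, claim_ids in claims_by_drg.items():
        if len(claim_ids) <= per_stratum:
            sample[drg] = list(claim_ids)
        else:
            sample[drg] = rng.sample(list(claim_ids), per_stratum)
    return sample


def extrapolate_improper_payment(sample_improper_amounts, universe_size):
    """Project the mean improper payment per sampled claim onto the whole
    universe for that MS-DRG. Point estimate only; no confidence interval."""
    if not sample_improper_amounts:
        return 0.0
    mean_error = sum(sample_improper_amounts) / len(sample_improper_amounts)
    return mean_error * universe_size
```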

What key components of claims data am I data-mining?

  • At-risk MS-DRGs
  • Payer (Medicare)
  • Sex
  • Age
  • Discharge date
  • Length of stay (for one-day LOS, exclude cases with prior observation status)
  • Relative weights
  • Principal diagnosis and POA indicator
  • Secondary diagnoses: MCCs and POA indicator
  • Secondary diagnoses: CCs and POA indicator
  • Secondary diagnoses: Non MCCs/CCs and POA indicator
  • Procedures
  • Attending physician
  • Discharge disposition
  • Charges
  • Payment
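To make the list above concrete, one way to represent a single row of the claims-data extract is sketched below. The field names are hypothetical; your billing or decision-support system will have its own layout.

```python
# Hypothetical layout for one row of a claims-data extract containing the
# key data-mining elements listed above. Field names are illustrative only.
from dataclasses import dataclass, field
from datetime import date


@dataclass
class ClaimExtract:
    ms_drg: str                          # at-risk MS-DRG
    payer: str                           # e.g., "Medicare"
    sex: str
    age: int
    discharge_date: date
    los: int                             # exclude prior observation days for one-day LOS reviews
    relative_weight: float
    principal_dx: str
    principal_dx_poa: str                # POA indicator for the principal diagnosis
    mcc_dx: list = field(default_factory=list)     # (code, POA) pairs
    cc_dx: list = field(default_factory=list)
    other_dx: list = field(default_factory=list)   # non-MCC/CC secondary diagnoses
    procedures: list = field(default_factory=list)
    attending_physician: str = ""
    discharge_disposition: str = ""
    charges: float = 0.0
    payment: float = 0.0
```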

 

Example 1: MCC/CC Outlier

PDX, SDX, SDX, SDX, SDX / Proc, Proc / LOS / Dis Disp / Charges / DRG / RW

• 491.21Y, 486Y**, 285.9Y / 5 / 01 / $37,887 / 190 / 1.2076
• 491.21Y, 486Y**, 250.00Y, 401.9Y / 3 / 01 / $14,785 / 190 / 1.2076
• 493.22Y, 486Y**, 276.1Y* / 2 / 04 / $10,865 / 190 / 1.2076
• 486Y, 491.21Y*, 584.9** / 5 / 01 / $29,6665 / 193 / 1.4378
• 486Y, 493.22Y*, 428.21Y**, 428.0 / 38.93 / 3 / 01 / $28,773 / 193 / 1.4378

In the above example, we see a potential principal-diagnosis sequencing outlier pattern emerging. The first three rows reflect MS-DRG 190 (chronic obstructive pulmonary disease with MCC), with a relative weight of 1.2076. The last two rows reflect MS-DRG 193 (simple pneumonia with MCC), with a higher relative weight of 1.4378.

The pattern emerging is the sequencing of pneumonia as the principal diagnosis whenever another MCC is available as a secondary diagnosis. When there is no other MCC, the COPD sequences first and the pneumonia, itself an MCC, is reported as a secondary diagnosis, which still captures the higher payment when pneumonia is the only MCC. This requires a record review to validate the case against the Uniform Hospital Discharge Data Set (UHDDS) definition of principal diagnosis (the condition established after study to be chiefly responsible for occasioning the admission of the patient to the hospital) and to ensure that each case is correctly coded and sequenced and that the correct MS-DRG is billed and paid.
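A data-analytics edit for this pattern might compare the principal diagnosis against the MCCs available on each claim, as in the sketch below. The diagnosis codes and the claim layout are illustrative assumptions, not an official grouper rule.

```python
# Sketch of a flag for the COPD/pneumonia sequencing pattern described above.
# Diagnosis codes and the claim layout are illustrative assumptions.
PNEUMONIA_CODES = {"486"}
COPD_CODES = {"491.21", "491.22", "493.21", "493.22"}


def flag_copd_pneumonia_sequencing(principal_dx, secondary_dx, secondary_mccs):
    """principal_dx: principal diagnosis code; secondary_dx: all secondary codes;
    secondary_mccs: the subset of secondary codes that qualify as MCCs.
    Returns a review reason when the sequencing choice warrants validation."""
    other_mccs = [code for code in secondary_mccs if code not in PNEUMONIA_CODES]
    pna_secondary = any(code in PNEUMONIA_CODES for code in secondary_dx)
    copd_secondary = any(code in COPD_CODES for code in secondary_dx)

    if principal_dx in PNEUMONIA_CODES and copd_secondary and other_mccs:
        return "Pneumonia principal with COPD secondary and another MCC: validate sequencing"
    if principal_dx in COPD_CODES and pna_secondary and not other_mccs:
        return "COPD principal with pneumonia as the only MCC: validate sequencing"
    return None
```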

MedLearn’s data-analytics tool will print a list of all of your MS-DRGs with MCC/CC outliers, which you can use to begin forming your audit strategy. The company’s expert auditors also can partner with your organization to develop and maintain an effective, continuing audit strategy to identify and mitigate future risk.


 

Example 2: Weak inpatient admission


Base MS-DRG / Base MS-DRG Description / IPPS Cases / ALOS / Average Charges / Average Payment / Average Cost / Case Mix Index

310-309-308 / Cardiac arrhythmia and conduction disorders / 45 / 2.44 / $11,696 / $7,085 / $9,431 / 0.741
392-391 / Esophagitis, gastroent and misc digest disorders / 29 / 2.9 / $6,703 / $8,113 / $5,816 / 0.7205
641-640 / Nutritional and misc metabolic disorders / 19 / 3.05 / $7,845 / $8,744 / $6,772 / 0.7382
313 / Chest pain / 18 / 1.39 / $4,532 / $4,241 / $3,825 / 0.5489
311 / Angina pectoris / 18 / 1.5 / $4,646 / $4,250 / $4,465 / 0.5118
312 / Syncope and collapse / 14 / 1.43 / $3,561 / $4,100 / $3,349 / 0.7197
690-689 / Kidney and urinary tract infections / 13 / 2.85 / $4,693 / $7,990 / $4,928 / 0.8199
419-418-417 / Laparoscopic cholecystectomy w/o c.d.e. / 12 / 4.75 / $24,012 / $12,018 / $16,819 / 1.6208
948-947 / Signs & symptoms / 12 / 2.25 / $4,436 / $5,924 / $4,477 / 0.6542

 

 

Let’s take a look at the fictional Admit Me Hospital, a small, rural 80-bed hospital. This public data, drawn from MedPAR (Medicare Provider Analysis and Review) files, shows a portion of the hospital’s top 20 inpatient MS-DRGs. We see several MS-DRGs with an average LOS of roughly one day, the presence of which could trigger an external audit of the medical necessity of inpatient admissions. It is obvious that this hospital has not implemented controls or safeguards to keep these unsupported claims from going through to the payer.
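One way to surface this pattern programmatically is to scan a MedPAR-style summary for base MS-DRGs whose average LOS hovers near one day, as in the sketch below. The row layout, the 1.5-day threshold and the minimum case count are assumptions chosen for illustration.

```python
# Sketch of a screen over a MedPAR-style summary: flag base MS-DRGs whose
# average LOS is near one day, a common trigger for medical-necessity review.
# The row layout, threshold and minimum case count are assumptions.
def flag_short_stay_drgs(summary_rows, alos_threshold=1.5, min_cases=10):
    """summary_rows: iterable of dicts with 'base_drg', 'description',
    'cases' and 'alos' keys (illustrative field names)."""
    return [row for row in summary_rows
            if row["cases"] >= min_cases and row["alos"] <= alos_threshold]


example_rows = [
    {"base_drg": "313", "description": "Chest pain", "cases": 18, "alos": 1.39},
    {"base_drg": "311", "description": "Angina pectoris", "cases": 18, "alos": 1.5},
    {"base_drg": "312", "description": "Syncope and collapse", "cases": 14, "alos": 1.43},
    {"base_drg": "392-391", "description": "Esophagitis, gastroent and misc digest disorders",
     "cases": 29, "alos": 2.9},
]
print(flag_short_stay_drgs(example_rows))   # chest pain, angina and syncope are flagged
```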

 

 

Example 3: Discharge Disposition 03 (skilled nursing facility) on Transfer MS-DRGs


DRG / LOS / PDX / SDX / RW / Charges / Payment

177 / 5 / 507.0 / 415.19 / 2.0393 / $39,628 / $9,079
177 / 3 / 507.0 / 262 / 2.0393 / $20,699 / $6,765
177 / 4 / 507.0 / 345.90 / 2.0393 / $26,825 / $7,943
177 / 5 / 507.0 / 038.9 / 2.0393 / $41,365 / $9,079
177 / 5 / 482.1 / 518.84 / 2.0393 / $24,669 / $9,220
190 / 4 / 491.22 / 260 / 1.3030 / $30,561 / $7,780
190 / 3 / 491.21 / 584.9 / 1.3030 / $22,525 / $6,224
193 / 3 / 486 / 584.9 / 1.4327 / $14,014 / $6,337
193 / 3 / 486 / 518.84 / 1.4327 / $29,418 / $6,337
194 / 3 / 486 / 867.0 / 1.0056 / $17,880 / $5,595
194 / 3 / 486 / 427.31 / 1.0056 / $34,965 / $5,459
280 / 4 / 410.71 / 531.40 / 1.9404 / $51,725 / $9,988
280 / 3 / 410.71 / 584.9 / 1.9404 / $20,250 / $7,990
280 / 4 / 410.71 / 486 / 1.9404 / $39,482 / $9,988

 

The above report reveals claims for hospital inpatients who were discharged and transferred to a skilled nursing facility (skilled care, discharge disposition 03) at least 24 hours earlier than the geometric mean length of stay (GMLOS). When a hospital discharges a patient assigned to a transfer MS-DRG and the discharge occurs within that window, the hospital receives a reduced payment, since the total payment is shared with the SNF.

 


Regarding correct discharge-disposition assignment, two questions should be considered for lengths of stay like these. Specifically:

  • Does the documentation support medical necessity for a continued stay (and full payment)?
  • Does documentation clearly delineate that the patient is receiving “skilled” care at the nursing facility?


Documentation such as “discharged to nursing home” or “nursing facility for rehab” is ambiguous and does not clarify whether the discharged patient will need “skilled” services in the receiving facility.
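A pre-bill screen for this scenario might compare each transfer-MS-DRG claim’s LOS and discharge disposition against the MS-DRG’s GMLOS, as in the sketch below. The GMLOS values and field names are hypothetical placeholders, not published table values.

```python
# Sketch of a screen for transfer-MS-DRG claims discharged to skilled nursing
# (disposition 03) at least one day short of the MS-DRG's geometric mean LOS.
# GMLOS values and field names are hypothetical placeholders.
HYPOTHETICAL_TRANSFER_GMLOS = {"177": 6.1, "190": 4.5, "193": 4.4, "194": 3.7, "280": 4.8}


def flag_early_snf_transfer(ms_drg, los, discharge_disposition):
    """Flag claims likely subject to a reduced transfer payment so documentation
    of skilled care and medical necessity can be reviewed before billing."""
    gmlos = HYPOTHETICAL_TRANSFER_GMLOS.get(ms_drg)
    if gmlos is None:
        return False                        # not a transfer MS-DRG in this sketch
    return discharge_disposition == "03" and los <= gmlos - 1


print(flag_early_snf_transfer("177", 5, "03"))   # True with these placeholder values
```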


The Bottom Line

Performing data mining (or data analytics) as a preliminary step before an audit sets the table for a more fulfilling feast. The purpose of performing audits is to identify issues so they can be corrected. Too often, however, audits are performed in the hope that everything is in good shape; the data they produce is unreliable, the results go unnoticed and little to no change occurs.

Therefore, digging deep for problems will bring to the surface many interesting tidbits that need to be evaluated and/or corrected. When sampling results can be presented with confidence as an accurate reflection of the entire population and include a payment-impact report (such as a PCER), administrative leaders jump on board for swift action.

For probability-based audits, consider benchmarking against the November 2009 CERT-reported PCER for hospitals, which is at or below 1.6 percent; for non-probability-based audits, the hospital benchmark PCER is at or below 6.6 percent. The formula for calculating a PCER is the total value of improper payments (underpayments plus overpayments) divided by total net Medicare payments (net of deductibles and co-insurance). If audits are performed by MS-DRG with resultant PCERs, compare them to the MS-DRG PCERs published in the May 2008 CERT report at http://www.cms.hhs.gov/CERT/CR/list.asp#TopOfPage.
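The PCER formula above translates directly into a small calculation, sketched below. The dollar amounts in the example call are invented for illustration; only the formula itself comes from the text above.

```python
# Sketch of the PCER calculation described above: total improper payments
# (overpayments plus underpayments) divided by total net Medicare payments.
# The dollar amounts in the example call are invented for illustration.
def paid_claims_error_rate(overpayments, underpayments, net_medicare_payments):
    """Return the PCER as a percentage of net Medicare payments
    (net of deductibles and co-insurance)."""
    improper_total = overpayments + underpayments
    return 100.0 * improper_total / net_medicare_payments


rate = paid_claims_error_rate(overpayments=42_000, underpayments=6_500,
                              net_medicare_payments=1_850_000)
print(f"PCER: {rate:.2f}%")   # compare against the CERT benchmarks cited above
```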

About the Author

Carol Spencer, RHIA, CCS, CHDA is a senior healthcare consultant with Medical Learning, Inc. (MedLearn®) in St. Paul, Minn. MedLearn is a nationally recognized expert in healthcare compliance and reimbursement. Founded in 1991, MedLearn delivers actionable answers that equip healthcare organizations with coding, chargemaster, reimbursement management and RAC solutions.


Contact the Author

cspencer@medlearn.com

 
