Updated on: November 20, 2018

MS-DRG Validation and Data Analytics: The Perfect Fit: Part II

Original story posted on: March 22, 2010

ED. NOTE: This is the conclusion to Carol Spencer’s article on MS-DRG Validation and Data Analytics. Part I appeared in RACMonitor.Enews on March 18, 2010 and is currently posted on the RACMonitor Web site.



There are both internal and external triggers that can be used to identify high-risk MS-DRGs, coding issues and medical necessity concerns.

Internal triggers may be trends identified during internal or corporate MS-DRG, coding or medical necessity audits; new-hire education and training; or performance review audits. External triggers may be identified by examining Recovery Audit Contractor (RAC)-approved issues for MS-DRG and coding audits; RAC demonstration results; Comprehensive Error Rate Testing (CERT) reports; Program for Evaluating Payment Patterns Electronic Report (PEPPER) reports; Department of Health & Human Services’ Office of Inspector General (OIG) work plans and reports; private payer (such as Blue Cross and Blue Shield) audit findings; and insurance denial reports.


Understandably, hospital clinical staff can identify visual evidence of a risk of falling and promptly provide excellent service to avoid it. The clinical, Health Information Management (HIM), financial and information technology teams need to be just as acutely aware of the “visible” clues of improper payments through the analysis of claims data. As in step two above, this requires increased communication among the healthcare teams and an understanding of risk areas during “handoffs” to ensure that improper claims are not submitted.

Consider another case study: a patient who presents to the emergency department with abdominal pain is registered as an observation patient and then switched (handed off) to inpatient status the next day, even though the increased severity of illness or intensity of service needed to justify an inpatient level of care was not present. This two-day length-of-stay (LOS) abdominal pain case (at-risk MS-DRG 392) is then handed off to a HIM staff member, who codes the case and drops the inpatient claim for full MS-DRG reimbursement. The billing department then processes the claim to the payer, with the potential for an improper payment.



In the above example, how many handoffs occurred? How many times per day do handoffs go unchecked, thus potentially contributing to large sums in improper payments? Data analytics assist in controlling this process and preventing improper claims, which, in turn, result in increased data integrity, revenue integrity and quality data reporting.


After identifying a list of at-risk MS-DRGs, data mining analyzes the code assignments on the claims and assists in selecting the highest-risk claims within the highest-risk MS-DRGs. Each MS-DRG has its own programmed trigger that flags potentially erroneous claims, and it is at this level that cases are flagged for review. A case may hit a sequencing edit, such as sepsis reported as a secondary diagnosis that was present on admission alongside a principal respiratory diagnosis and a ventilator procedure code. If so, flag the case for review, preferably by a concurrent “pre-bill” auditor or otherwise by a retrospective auditor, to evaluate the documentation, medical necessity and coding and to ensure accuracy and the correct assignment of patient type.


Another example may be a symptom code as the principal diagnosis with a two-day LOS, a discharge home and low charges; this may signal a medical necessity or coding issue. Edits also flag cases with only one major complication or comorbidity (MCC) or one complication or comorbidity (CC) so documentation can be validated against the code assignment. Flags can be established for at-risk procedure codes, such as excisional debridement or mechanical ventilation, and even for newly hired coders or case managers “in training,” to ensure all at-risk cases are reviewed carefully until acceptable levels of accuracy are achieved. Flags for transfer MS-DRGs and high-risk discharge disposition codes, such as 03 (skilled nursing facility), also would be useful.
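Edits like these can be expressed as simple rules over claims data. A minimal sketch in Python follows; the field names, charge threshold and flag labels are illustrative assumptions, not taken from any particular analytics product:

```python
# Minimal sketch of pre-bill "edits" that flag at-risk claims for review.
# Field names and thresholds are illustrative assumptions only.

def flag_claim(claim):
    """Return a list of reasons a claim should be routed for pre-bill review."""
    flags = []
    # Symptom principal diagnosis, short stay, discharged home, low charges:
    # possible medical-necessity or coding issue.
    if (claim.get("pdx_is_symptom_code") and claim["los"] <= 2
            and claim["discharge_disposition"] == "01"
            and claim["charges"] < 10000):
        flags.append("symptom PDX / short stay / low charges")
    # Exactly one MCC or one CC: the MS-DRG hinges on a single code,
    # so documentation should be validated against the code assignment.
    if claim.get("mcc_count") == 1 or claim.get("cc_count") == 1:
        flags.append("single MCC/CC drives DRG assignment")
    # High-risk discharge disposition on a transfer MS-DRG.
    if claim.get("is_transfer_drg") and claim["discharge_disposition"] == "03":
        flags.append("transfer DRG with SNF disposition")
    return flags
```

A claim that trips more than one rule simply accumulates multiple flags, which can help prioritize the review queue.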


The advantages of performing data analytics before an audit include fewer denials, fewer external audits and less duplication of work. Performing analytics also produces reliable results for decision-making and quality data, in addition to upholding revenue integrity, increasing patient satisfaction, safeguarding your hospital’s reputation, building knowledgeable workers and erasing departmental boundaries. It also helps build a commitment to ethics and compliance, and supports clinical, HIM and financial operations through the adoption of enabling technology.


Disadvantages may include more advance preparation time and the resources needed to purchase a product. But most people carry a cell phone (or two) and don’t mind paying for services and technology that are meaningful to them; soon you will wonder how you ever lived without data analytics to identify erroneous claims for correction prior to billing.




What steps should I follow to perform data mining?


To perform an MS-DRG validation audit using a non-probability sampling technique (essentially a purposive audit that selects records with RAC-approved MS-DRGs), take the following steps.


First, refer to Sampling: A Practical Guide for Quality Management in Home & Community-Based Waiver Programs (http://www.hsri.org/docs/QF_sampleguide.pdf). This guide, developed for CMS, provides a user-friendly, step-by-step approach to explaining sampling, identifying alternatives among sampling techniques and understanding how to use these techniques for specific purposes in a quality management strategy. The sampling method you select may differ from the example below.


1. Select your highest-risk MS-DRG.

2. Apply MS-DRG-specific coding, medical necessity and the claims analyzer to the claims-data file.

3. Perform a probe audit for each MS-DRG or other audit type (e.g., one-day LOS or transfer MS-DRGs representing multiple MS-DRGs), including payment data (UB-04 and remittance advice).

· If 25 or fewer claims are reported, review 100 percent.

· If 26 or more claims are reported, apply a statistically significant random selection technique. For example, you could use RAT-STATS, a package of statistical software tools developed by the OIG to assist in selecting random samples and evaluating audit results, available at http://oig.hhs.gov/organization/OAS/ratstats.asp; or Research Randomizer, available at www.randomizer.org.

4. Validate key components of the medical record documentation (wearing your “audit hat”) against official coding guidelines and inpatient admission screening guidelines (see the section below titled “What key components of claims data am I data mining?”).

5. Track changes to MS-DRGs.

6. Track changes to patient type (inpatient versus outpatient observation).

7. Track changes to patient status (discharge disposition) on transfer MS-DRGs.

8. Track payment impact (overpayments and underpayments) for steps 5, 6 and 7.

9. Report improper payment by account and track all rebills.

10. Calculate the paid claims error rate (PCER).

a. Compare your PCER to the CERT report’s PCERs (see http://www.cms.hhs.gov/CERT/CR/list.asp#TopOfPage).

b. Expect over-weighted results (significantly higher rates of improper payment, or higher PCERs) due to the use of the non-probability sampling technique, which is indicative of a “purposive” audit. A “purposive” audit is one performed for a specific purpose, such as auditing records similar to those audited by RACs. It can provide you with a worst-case scenario in anticipation of the type of findings the RACs may identify.

11. If the results confirm patterns for overpayment, perform an expanded probe audit (double the original probe number).

a. If 50 or fewer claims are reported, review 100 percent.

b. If 51 or more claims are reported, apply a randomizer technique to select up to 50 records.

12. Report improper payment by account and track all rebills.

13. Calculate the PCER.

14. If the expanded probe audit confirms overpayment patterns, perform a full audit of 100 percent of the records in the universe for that MS-DRG.

15. Report improper payment by account and track all rebills.

16. Calculate the PCER.

17. Determine the root cause.

18. Implement corrective action.

19. Re-audit and measure for improvement.
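The probe-audit selection rule in step 3 (review 100 percent of a small universe, otherwise draw a random sample) can be sketched as follows. Python’s `random.sample` stands in here for tools such as RAT-STATS or Research Randomizer, and the default sample size of 25 is an assumption for illustration:

```python
import random

def select_probe_sample(claims, full_review_threshold=25, sample_size=25,
                        seed=None):
    """Select claims for a probe audit.

    Review everything when the universe has full_review_threshold claims or
    fewer; otherwise draw a random sample (a stand-in for RAT-STATS-style
    statistically significant selection). A seed makes the draw repeatable,
    which matters for audit documentation.
    """
    claims = list(claims)
    if len(claims) <= full_review_threshold:
        return claims
    return random.Random(seed).sample(claims, sample_size)
```

For the expanded probe in step 11, the same function can be called with `full_review_threshold=50` and `sample_size=50`.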



How do I perform data mining with the probability-sample technique?


To perform this technique, a stratified or systematic random sampling, take the following steps. As mentioned above, refer to the CMS Sampling guide; again, the sampling method you select may differ from the example below.


1. Select your highest-risk MS-DRG.

2. Apply a randomizer tool to the claims-data file.

3. Pull 30 records per MS-DRG or per type (for example, one-day LOS representing multiple MS-DRGs or transfer MS-DRGs representing multiple MS-DRGs).

4. Pull the randomly selected records and payment data (UB-04 and remittance advice).

5. Put on your audit hat to validate key components of the medical record documentation against official coding guidelines and inpatient admission screening guidelines (see the section below titled “What key components of claims data am I data mining?”).

6. Track changes to MS-DRG.

7. Track changes to patient type (inpatient versus outpatient observation).

8. Track changes to patient status (discharge disposition) on transfer MS-DRGs.

9. Track payment impact (overpayments and underpayments) for steps 6, 7 and 8.

10. Report improper payment by account and track all rebills.

11. Calculate the PCER. Compare this to the CERT report’s PCERs by hospital provider and by MS-DRGs.

12. Extrapolate the payment impact to the entire population. If the improper payment amount triggers self-reporting, discuss the situation with your compliance officer or legal department.

13. Implement corrective action.

14. Re-audit and measure for improvement.
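Steps 2 through 4 and step 12 above can be sketched as a simple random draw per MS-DRG plus a straightforward extrapolation of the sampled improper-payment rate to the full population. Function and field names below are illustrative assumptions:

```python
import random

def draw_probability_sample(claims_by_drg, per_drg=30, seed=None):
    """Draw a simple random sample of up to per_drg records for each MS-DRG.

    A sketch of the probability-sample technique; a real audit would use
    RAT-STATS or a comparable tool and document the seed for repeatability.
    """
    rng = random.Random(seed)
    sample = {}
    for drg, claims in claims_by_drg.items():
        pool = list(claims)
        sample[drg] = pool if len(pool) <= per_drg else rng.sample(pool, per_drg)
    return sample

def extrapolate_improper_payment(sample_improper, sample_payments,
                                 population_payments):
    """Project the sampled improper-payment rate onto total payments."""
    return (sample_improper / sample_payments) * population_payments
```

If the extrapolated amount is large enough to trigger self-reporting, that is the point (per step 12) to involve your compliance officer or legal department.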


What key components of claims data am I data mining?

  • At-risk MS-DRGs
  • Payer (Medicare)
  • Sex
  • Age
  • Discharge date
  • Length of stay (for one-day LOS, exclude cases with prior observation status)
  • Relative weights
  • Principal diagnosis and POA indicator
  • Secondary diagnoses: MCCs and POA indicator
  • Secondary diagnoses: CCs and POA indicator
  • Secondary diagnoses: Non MCCs/CCs and POA indicator
  • Procedures
  • Attending physician
  • Discharge disposition
  • Charges
  • Payment
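For illustration, the data elements above can be modeled as one record per claim. This Python dataclass is a hypothetical layout, not a mandated extract format:

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class ClaimRecord:
    """One row of the claims-data extract used for MS-DRG data mining.

    Field names are illustrative; they mirror the data elements listed
    above. Secondary diagnoses carry their present-on-admission (POA)
    indicator as (code, poa) pairs, split by MCC/CC status.
    """
    ms_drg: str
    payer: str
    sex: str
    age: int
    discharge_date: str
    length_of_stay: int
    relative_weight: float
    principal_dx: str                # diagnosis code
    principal_dx_poa: str            # POA indicator (Y/N/U/W)
    secondary_mccs: List[Tuple[str, str]] = field(default_factory=list)
    secondary_ccs: List[Tuple[str, str]] = field(default_factory=list)
    other_secondary_dx: List[Tuple[str, str]] = field(default_factory=list)
    procedures: List[str] = field(default_factory=list)
    attending_physician: Optional[str] = None
    discharge_disposition: str = "01"
    charges: float = 0.0
    payment: float = 0.0
```

Keeping the MCC/CC split explicit makes the single-MCC/CC edit described earlier a one-line check against the record.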




Example 1: MCC/CC Outlier

PDX, SDX, SDX, SDX, SDX / Proc, Proc / LOS / Dis Disp / Charges / DRG / RW

  • 491.21Y, 486Y**, 285.9Y / - / 5 / 01 / $37,887 / 190 / 1.2076
  • 491.21Y, 486Y**, 250.00Y, 401.9Y / - / 3 / 01 / $14,785 / 190 / 1.2076
  • 493.22Y, 486Y**, 276.1Y* / - / 2 / 04 / $10,865 / 190 / 1.2076
  • 486Y, 491.21Y*, 584.9** / - / 5 / 01 / $29,665 / 193 / 1.4378
  • 486Y, 493.22Y*, 428.21Y**, 428.0 / 38.93 / 3 / 01 / $28,773 / 193 / 1.4378


In the above example, we see a potential principal diagnosis sequencing outlier pattern emerging. The first three rows reflect MS-DRG 190 (chronic obstructive pulmonary disease with MCC) with a relative weight of 1.2076. The last two rows reflect simple pneumonia with MCC and a higher relative weight of 1.4378.


The pattern emerging is the sequencing of pneumonia as the principal diagnosis whenever another secondary diagnosis qualifies as an MCC. When there is no other MCC, the COPD is sequenced first and the pneumonia, itself an MCC, is reported as a secondary diagnosis, which optimizes the higher payment when the only MCC is the pneumonia. This requires a record review to validate against the Uniform Hospital Discharge Data Set (UHDDS) definition of principal diagnosis (the condition established after study to be chiefly responsible for occasioning the admission of the patient) and to ensure that each case is correctly coded and sequenced and that the correct MS-DRG is billed and paid.
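The sequencing pattern described above can be screened for automatically. This sketch uses the ICD-9-CM codes from the example (486 for pneumonia, 491.x/493.x for COPD/asthma); a production edit would cover the full code ranges, and the field names are illustrative:

```python
def flag_sequencing_outliers(claims):
    """Flag claims matching the COPD/pneumonia sequencing pattern above.

    Catches both directions: pneumonia (486) as principal diagnosis with a
    COPD/asthma code among the secondaries, and COPD/asthma as principal
    with pneumonia as a secondary. Both variants need record review to
    confirm the UHDDS principal-diagnosis definition was applied.
    """
    flagged = []
    for claim in claims:
        pdx = claim["pdx"]
        sdx = claim["sdx"]
        pneumonia_pdx = pdx.startswith("486") and any(
            code.startswith(("491", "493")) for code in sdx)
        copd_pdx = pdx.startswith(("491", "493")) and any(
            code.startswith("486") for code in sdx)
        if pneumonia_pdx or copd_pdx:
            flagged.append(claim)
    return flagged
```

Flagging both directions matters: either sequencing may be correct for a given record, and only documentation review can tell which.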


MedLearn’s data-analytics tool will print a list of all of your MS-DRGs with MCC/CC outliers, with which you can begin forming your audit strategy. The company’s expert auditors can partner with your organization by developing and maintaining an effective and continuing audit strategy to identify and mitigate future risk.


Example 2: Weak inpatient admission



Base MS-DRG descriptions (reported with IPPS cases, average charges, average payment, average cost and case-mix index for each):

  • Cardiac arrhythmia and conduction disorders
  • Esophagitis, gastroent and misc digest disorders
  • Nutritional and misc metabolic disorders
  • Chest pain
  • Angina pectoris
  • Syncope and collapse
  • Kidney and urinary tract infections
  • Laparoscopic cholecystectomy w/o c.d.e.
  • Signs & symptoms


Let’s take a look at the fictional Admit Me Hospital, a small, rural 80-bed hospital. This public data, as reported in MedPAR, shows the hospital’s top 20 inpatient MS-DRGs. We see several MS-DRGs with a one-day LOS, the presence of which could trigger an external audit of the medical necessity of the inpatient admissions. It is obvious that this hospital has not implemented controls or safeguards to keep these unsupported claims from going through to the payer.




Example 3: Discharge Disposition 03 (skilled nursing facility) on Transfer MS-DRGs

(Report: inpatient claims on transfer MS-DRGs with discharge disposition 03, discharged 24 hours or more before the GMLOS.)
The above report reveals claims for hospital inpatients who were discharged and transferred to a skilled nursing facility (skilled care, discharge disposition 03) 24 hours or more earlier than the geometric mean length of stay (GMLOS). When a hospital discharges a patient assigned to a transfer MS-DRG within that window, the hospital receives a reduced, per diem-based payment rather than the full MS-DRG payment, since the care (and the payment) is effectively shared with the SNF.



Regarding the correct discharge-disposition assignment, two questions should be considered for lengths of stay like these:

  • Does the documentation support medical necessity for a continued stay (and full payment)?
  • Does documentation clearly delineate that the patient is receiving “skilled” care at the nursing facility?

Documentation such as “discharged to nursing home” or “nursing facility for rehab” is ambiguous and does not clarify whether the discharged patient will need “skilled” services in the receiving facility.



The Bottom Line


Performing data mining (or data analytics) as a preliminary step before an audit sets the table for a more fulfilling feast. The purpose of performing audits is to identify issues so they can be corrected. Too often, however, audits are performed in the hope that everything is in good shape; the data they produce is unreliable, the results go unnoticed and little to no change occurs.


Therefore, digging deep for problems will bring to the surface many interesting tidbits that need to be evaluated and corrected. When sampling results can be presented confidently as an accurate reflection of the entire population, and include a payment impact report (such as a PCER), administrative leaders jump on board for swift action.


For probability-based audits, consider benchmarking against the November 2009 CERT-reported PCER for hospitals, which is at or less than 1.6 percent; for non-probability-based audits, the benchmark PCER for hospitals is at or less than 6.6 percent. The formula for the PCER is the total value of improper payment (underpayments plus overpayments) divided by total net Medicare payments (after subtracting deductibles and co-insurance). If audits are performed by MS-DRG with resultant PCERs, compare them to the MS-DRG PCERs published in the May 2008 CERT report at http://www.cms.hhs.gov/CERT/CR/list.asp#TopOfPage.
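The PCER formula can be computed directly; this sketch simply encodes the definition given above:

```python
def paid_claims_error_rate(overpayments, underpayments, gross_payments,
                           deductibles=0.0, coinsurance=0.0):
    """PCER = total improper payment (over- plus underpayments) divided by
    total net Medicare payments, where net payments subtract beneficiary
    deductibles and co-insurance from gross payments."""
    net_payments = gross_payments - deductibles - coinsurance
    return (overpayments + underpayments) / net_payments
```

For example, $12,000 in overpayments and $3,000 in underpayments against $1,000,000 in net Medicare payments yields a PCER of 1.5 percent, just under the 1.6 percent hospital benchmark.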


About the Author

Carol Spencer, RHIA, CCS, CHDA is a senior healthcare consultant with Medical Learning, Inc. (MedLearn®) in St. Paul, Minn. MedLearn is a nationally recognized expert in healthcare compliance and reimbursement. Founded in 1991, MedLearn delivers actionable answers that equip healthcare organizations with coding, chargemaster, reimbursement management and RAC solutions.
