Tools to reduce system and submission errors
Part three of a four-part series on CMS Best Practices Implementation
Centauri Health Solutions’ Director of Product Strategy – Dawn Carter
We continue to add to your Risk Adjustment renovation toolbox to ensure data submission program success.
This is the third in a four-part series where I continue to detail a dozen CMS best practices and provide actionable strategies. So far, I have shared a variety of tools for aligning internal edits with CMS edits and building stronger provider partnerships.
Hopefully, you caught my recent webinar, which provided an overview of those first six tools.
This article focuses on tools to help you reduce errors by testing and monitoring your systems and submissions.
CMS Best Practice Tool #7: Paint Swatches
One of the most important home renovation decisions is paint color. Most do-it-yourselfers wouldn’t think of grabbing a can of paint off the shelf to paint an entire room without first comparing colors and determining the best one using paint swatches, or even samples.
When it comes to CMS submission, consider the Encounter Data Front End System (EDFES) test environment your paint swatches. Take advantage of testing your data in that environment to ensure correct submission practices. Don’t get stuck with a color you don’t like.
It’s a good idea, particularly if you are a new submitter, to iteratively use the CMS Tier II testing as you perform your QA and assess your readiness for production status. Just as there is no limit to the paint swatches you can bring home, there is no limit to using this testing system.
You should not begin submitting to CMS until you have achieved at least a 95% acceptance rate on the MAO-002 report in Tier II. Remember – once you start submitting to production, your acceptance rates “count against you,” and you are likely to see yellow on your report card and/or letters from CMS if you are an outlier from the start.
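The 95% readiness check above can be sketched in a few lines. This is an illustrative sketch only – the record structure below is hypothetical and is not the actual MAO-002 report layout:

```python
# Illustrative sketch: summarize Tier II test results into an acceptance rate
# before deciding whether you are ready for production submission.
# The record structure below is hypothetical, not the real MAO-002 layout.

def acceptance_rate(records):
    """Return the fraction of encounter records with status 'accepted'."""
    if not records:
        return 0.0
    accepted = sum(1 for r in records if r["status"] == "accepted")
    return accepted / len(records)

def ready_for_production(records, threshold=0.95):
    """Apply the 95% acceptance-rate readiness check described above."""
    return acceptance_rate(records) >= threshold

# Hypothetical Tier II results for three test encounters
tier2_results = [
    {"encounter_id": "E001", "status": "accepted"},
    {"encounter_id": "E002", "status": "accepted"},
    {"encounter_id": "E003", "status": "rejected"},
]
print(f"Acceptance rate: {acceptance_rate(tier2_results):.0%}")
print("Ready for production:", ready_for_production(tier2_results))
```

Because Tier II testing is unlimited, a check like this can be rerun after every fix until the threshold is cleared.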
Just as having a good eye for color helps in home design, you want to make sure to have a talented analyst on your team with a good understanding of Electronic Data Interchange (EDI). Think of this as having a trusted second pair of eyes to ensure that you are choosing the best color.
Here are some important tips for your analyst:
- Write and execute test cases – Your analyst should be able to write and execute test cases that – at the very least – cover any encounter scenarios that are unique to your organization and/or typically cause problems when they arrive as electronic claims from providers.
- Interpret and communicate results – Your analyst should also be able to properly interpret responses to ensure that results are effectively communicated to leadership so that necessary system changes are made.
- Close gaps based on the analyst’s findings – That often entails aligning the edits applied to inbound provider claims with what CMS requires for encounters. Or, if you are using a clearinghouse, it means determining where the gaps are so they can be closed at the clearinghouse. A good business analyst with EDI experience can do this by examining the Tier II rejections against the rules that the claims system or clearinghouse applies to claims submitted from providers.
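The edit-gap analysis described above amounts to a set difference: which edits appear in CMS rejections but are not enforced upstream? A minimal sketch, with made-up edit codes (these are not real CMS edit codes):

```python
# Illustrative sketch of the edit-gap analysis: compare the edit codes seen
# in Tier II rejections against the edits your claims system (or
# clearinghouse) already enforces on inbound provider claims.
# All edit codes here are hypothetical.

internal_edits = {"E101", "E205", "E310"}          # enforced on inbound claims
tier2_rejection_edits = {"E205", "E417", "E520"}   # seen in Tier II rejections

# Edits CMS applies that nothing upstream catches -- these are the gaps
# to close in the claims system or at the clearinghouse.
gaps = tier2_rejection_edits - internal_edits
print("Edits to add upstream:", sorted(gaps))
```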
CMS Best Practice Tool #8: Flashlight
One of the simplest, yet most effective tools in any toolbox is a flashlight. Illuminating a work area can be the key to success; it doesn’t cost much, yet nets great results.
In your data submission program review, that flashlight is transactional monitoring across the different internal systems involved in creating and submitting your encounter data. This “monitoring” spotlight confirms the integrity of your data throughout the process.
In addition to the flashlight, supplemental tools help this illumination process:
- Map – All points of accountability must be clearly identified; flow diagrams work well for this. The people responsible for those systems are the ones who need a seat at the table for internal data governance.
- Scorecard – It’s also important to know how to measure and provide meaningful feedback if those systems are not performing correctly. That is why setting appropriate internal metrics is key. For these processes, a good start is reconciliation – confirm that the output of one system makes it into the next and that it matches the expected output. If it doesn’t, you need to know what happened and how to fix it immediately. Don’t wait until you are hemorrhaging money due to revenue leakage at the various points of accountability.
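The reconciliation step above can be as simple as comparing record identifiers between adjacent systems in the flow. A minimal sketch, assuming each system can export the IDs it processed (the IDs and system names here are hypothetical):

```python
# Illustrative reconciliation sketch: confirm that every encounter emitted
# by one system arrived in the next system in the pipeline.
# All IDs and system names are hypothetical.

claims_system_ids = {"C100", "C101", "C102", "C103"}   # output of system A
encounter_system_ids = {"C100", "C102", "C103"}        # received by system B

missing = claims_system_ids - encounter_system_ids      # dropped in transit
unexpected = encounter_system_ids - claims_system_ids   # appeared from nowhere

print("Dropped between systems:", sorted(missing))
print("Appeared unexpectedly:", sorted(unexpected))
```

Running a comparison like this at every hand-off point turns the flow diagram into a measurable scorecard: any nonempty difference is a leak to investigate.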
CMS Best Practice Tool #9: Nail Gun
A nail gun revolutionized many home improvement projects, providing a quicker and more efficient way to nail. If you have a nail gun, why use a hammer?
Similarly, with EDI systems, why submit paper claims? Old habits die hard. However, getting providers to reduce their paper claim submissions is essential in reducing errors and administrative burden.
Most providers submit upwards of 90% of their data via EDI, so this is not as much of an issue for health plans as it has been in the past. However, if eliminating paper claims is not possible, you must do the following to ensure that data issues unique to paper claims do not affect the success of your operations:
- Flag errors – Ensure that when you perform root cause analysis of the error reports coming back from CMS, you clearly ascertain which errors originated from paper claims. If there is no way to distinguish them, then implement one, such as a flag or other means of identifying them. You’ll need to interact with the claims manager and/or the person in charge of the claims scanning and optical character recognition (OCR) workflow to address these issues.
- Identify root causes – For these encounters, there are a few typical root causes, assuming that the data elements required for the 837, but not for the paper HCFA-1500 or UB-04, have been addressed and are present in the system. If this is not the case, you have a whole other set of problems.
One common cause:
- OCR errors – For a sample of claims, compare the image of the claim that came in from the provider with what was actually scanned into the system and, if applicable, what was then verified by a human. It may be that the sensitivity of the OCR needs to be adjusted (with more human verification), or it’s possible that inadequate QA is being done on these claims.
- Perform a cost analysis – It is important to note that processing paper claims carries many costs that far exceed those of EDI. It has been noted that full EDI lowers transactional costs by 35% (when you consider mailroom, scanning, storage, etc.). Add to this the lost risk adjustment revenue from rejected or delayed claims (especially those containing diagnoses that map to an HCC), and you can more than justify the need to address paper claims more aggressively.
- Do a provider assessment – When was the last time you did an assessment of the providers who send claims on paper? Did you know that CMS forces paper submitters to justify why they are using paper and not EDI for fee-for-service submissions? Starting in October 2003, CMS required all fee-for-service claims to be submitted electronically, with very limited exceptions (Medicare Claims Processing Manual (Pub. 100-04), Chapter 24, Section 90).
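The cost analysis above is simple arithmetic once you plug in your own numbers. A back-of-the-envelope sketch using the ~35% savings figure cited earlier (all dollar amounts and volumes below are hypothetical, not benchmarks):

```python
# Illustrative cost comparison using the ~35% EDI savings figure cited in
# the text. All dollar amounts and claim volumes are hypothetical.

paper_cost_per_claim = 4.00        # assumed: mailroom, scanning, storage, etc.
edi_cost_per_claim = paper_cost_per_claim * (1 - 0.35)
paper_claims_per_year = 10_000     # assumed annual paper volume

annual_savings = (paper_cost_per_claim - edi_cost_per_claim) * paper_claims_per_year
print(f"EDI cost per claim: ${edi_cost_per_claim:.2f}")
print(f"Annual savings from full EDI: ${annual_savings:,.2f}")
```

Note that this captures only transactional costs; any recovered risk adjustment revenue from claims that would otherwise be rejected or delayed comes on top of it.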
In summary, the blueprint for reducing data errors is to use the right tools to thoroughly investigate your data and system. Use paint swatches and test your data in the test environment, get out that flashlight for transactional monitoring across your internal systems and grab that nail gun to ensure the use of EDI. When it’s not being used, nail down the reasons why!
In my final article, I will explore how to put your best CMS practices on repeat.
Contact us today to learn how our consultative advisory approach can add to your CMS data toolbox.