Part one of a four-part series of CMS Best Practices Implementation
Centauri Health Solutions’ Director of Product Strategy – Dawn Carter
Have you ever watched an HGTV DIY show? If so, you know that watching teams renovate a lackluster house with the proper tools inspires viewers to get out their own tools and create their own renovation success story. Risk adjustment is no different: as in any HGTV transformation, the right tools must be used to construct and implement a successful data submission program.
Relating renovations to data submission, the foundation of any risk adjustment program, whether for commercial or government lines of business, is the effective submission of complete and accurate data.
Anyone who has ever read my articles or heard me speak knows that I strongly believe that without this foundation, health plans risk both noncompliance with data submission requirements and inaccurate risk-adjusted payments.
This is the first of four articles in which I will reference the 12 CMS best practices and provide actionable strategies for implementing them successfully. I do this using a “toolbox” theme to identify each tool, or practice, you will need for implementation.
If you recall, the CMS best practices were originally published in June 2017. For ease of discussion, however, I have presented them in a way that is perhaps more intuitive, rather than one by one as CMS did in its publication.
Where to start?
CMS Best Practice Tool #1: The Magic Bullet
You want to know what the magic bullet is for preventing errors in the first place?
Of course, you do.
Review the 999, 277CA and the MAO-002 transactional reports to track, prevent, and resolve errors. Ensure that your edits, or the edits of your clearinghouse or whoever gets the data first and is closest to the provider, are in complete alignment with CMS’ edits. If I had to pick one and only one process for plans to focus on, it would be this coupled with the right person or tool to do it. Simple, right? Not really. But here’s where you start.
Begin with Analysis
Carefully reviewing and reconciling the 999, 277CA and MAO-002 transactional reports for every submission to track, prevent, and resolve errors is certainly no easy task without the right knowledge and skills. Ideally, reviewing and reconciling happens with a tool or other business intelligence that automates these processes, though you still need a person to interpret the output.
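As a rough illustration of what that automation does, the sketch below reconciles the internal control numbers (ICNs) a plan submitted against the dispositions parsed from an MAO-002 report. The record and field names (`icn`, `status`) here are hypothetical stand-ins; the real 999, 277CA and MAO-002 files are X12 and flat-file formats that must be parsed first.

```python
def reconcile(submitted_icns, mao002_records):
    """Reconcile submitted encounter ICNs against parsed MAO-002 dispositions.

    Returns three sets: accepted, rejected, and unacknowledged (submitted
    but absent from the response report, which also warrants follow-up).
    Field names are illustrative placeholders, not a real file layout.
    """
    accepted = {r["icn"] for r in mao002_records if r["status"] == "accepted"}
    rejected = {r["icn"] for r in mao002_records if r["status"] == "rejected"}
    unacknowledged = set(submitted_icns) - accepted - rejected
    return accepted, rejected, unacknowledged
```

The point of the third bucket is that silence is also a finding: an encounter that never shows up on a response report is just as much a completeness risk as one that rejects.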
Then, based on this analysis, ensure that clearinghouse edits or health plan edits are in alignment with what CMS requires so that errors do not occur again. Problem solved, right?
Not so fast.
What if you can’t make changes to these edits to bring them into alignment with what CMS requires? Or worse, what if you have no automated tool or business intelligence for reconciling the 999, 277CA and MAO-002 reports with the original submissions? Then your job is much harder, and you will likely need extensive technical assistance or a vendor’s help.
Root Cause Analysis Decision Tree
Do your providers send the data to a clearinghouse? If so, why is the clearinghouse not applying the same edits as CMS for encounter data?
Do not allow the excuse that the clearinghouse is not programmed to apply these edits. You are paying them to help manage data quality; hold them accountable for that. If they can’t, won’t, or take a ridiculously long time to implement the edits, find another vendor that will, by asking: “Does your edit engine accommodate the CMS Medicare Advantage EDPS edits, or is it configurable to implement them?”
Neither should you allow provider abrasion to be an excuse for not implementing these edits. We will cover that in another part of this series.
Is the source data being sent electronically by the providers/clearinghouse?
Most health plans now require the submission of electronic data and, like CMS, require attestation by providers who are unable to comply with electronic data interchange (EDI) requirements. At most health plans, EDI accounts for 95% of total submissions. Consider tighter controls over who can submit on paper, and when.
If no – why?
Does any of the volume originate on paper?
Paper origination is often the culprit behind missing data elements that need to be captured in the claim processing system. Again, paper claims should be a small percentage of overall submission volume. The more paper volume there is, the more likely there will be further data quality issues. Even with high-quality paper-to-EDI conversion, there is still a high risk of errors. The more manual intervention in a process, the higher the margin for error.
If yes – where is it being stored, and where is the source-to-target mapping to the outbound encounter submissions?
It is possible that encounter data extractions are not sourcing data correctly.
If that is not the issue, consider these additional scenarios that commonly plague EDI submissions and affect what gets sent outbound to CMS:
- Missing required data errors from paper claims? For the volume of encounters of paper origin, review the paper claims processing procedures, including scanning and verification. These processes are often flawed because they are manual, and they require more stringent QA. Consider increasing the sensitivity of the OCR or adding claims QA staff if this is an issue.
- Are your internal codesets up to date? Do you have the dedicated resource needed to maintain the latest ICD-10, HCPCS/CPT, USPS, NUBC, WPC and other codesets? Are they loaded for correct validation? Such maintenance is easily a part-time job for a data operations resource, so if this amount of time is not being dedicated to it, you likely have a problem. If you are using a clearinghouse for claims processing or vendors for encounter submissions, be sure to closely monitor and report back to them any invalid codes identified on CMS response reporting. Know the codeset update schedule and request proof that each update has been completed and QAed successfully. Yes, it’s THAT important, and it is low-hanging fruit for error prevention.
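To make the validation point concrete, here is a minimal sketch of date-aware code validation. The in-memory codeset table and its contents are hypothetical; a production system would load and refresh these entries from the published codeset files on their update schedule.

```python
from datetime import date

# Hypothetical, illustrative codeset table: {codeset: {code: (effective, end)}}.
# A real system would load these from the published codeset files, not hard-code them.
CODESETS = {
    "ICD10": {"E11.9": (date(2015, 10, 1), None)},
}

def code_is_valid(codeset: str, code: str, dos: date) -> bool:
    """True if the code exists and is effective on the date of service."""
    entry = CODESETS.get(codeset, {}).get(code)
    if entry is None:
        return False
    effective, end = entry
    return effective <= dos and (end is None or dos <= end)
```

Validating against the date of service, not just code existence, matters because a code that is valid today may not have been valid when the service was rendered.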
- Enrollment errors got you down? Consider internal processes for checking enrollment against the MMR and resolving discrepancies in the HICN, date of birth, gender, general Part C eligibility, etc., for the member reported on each encounter prior to submission. If you or your encounter data submission vendor do not do this, you should. This type of process exists for RAPS submissions and can easily be scaled for EDPS.
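A pre-submission demographic check of this kind can be sketched as follows; the field names are hypothetical placeholders for however your system represents the encounter and the MMR extract.

```python
def enrollment_discrepancies(encounter: dict, mmr_row: dict) -> list:
    """Compare demographic fields on an encounter against the plan's MMR
    extract and return the names of mismatched fields, so the encounter
    can be held and corrected before it is submitted and rejected.
    Field names are illustrative, not an actual MMR layout.
    """
    fields = ("member_id", "dob", "gender")
    return [f for f in fields if encounter.get(f) != mmr_row.get(f)]
```

Any non-empty result routes the encounter to a work queue instead of the outbound file, which is cheaper than working the same error off a CMS rejection weeks later.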
- Provider NPI issues a problem? Are you validating NPIs and your provider registry against the NPPES file that CMS produces monthly, and updating your systems when discrepancies are found and addressed? If not, you or your clearinghouse should be. This should be a responsibility of the provider contracting and credentialing department.
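One structural check that can run before any NPPES lookup is the NPI’s check digit: under the NPI standard, the check digit is computed by applying the Luhn algorithm to the 9-digit base prefixed with the constant 80840. A minimal sketch:

```python
def npi_check_digit_ok(npi: str) -> bool:
    """Validate the Luhn check digit of a 10-digit NPI.

    Per the NPI standard, the Luhn algorithm is applied to the 9-digit
    base number prefixed with the constant "80840"; the 10th digit of
    the NPI is the resulting check digit.
    """
    if len(npi) != 10 or not npi.isdigit():
        return False
    digits = [int(c) for c in "80840" + npi[:9]]
    total = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 0:  # double every second digit, starting from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return (total + int(npi[9])) % 10 == 0
```

This catches transposed or mistyped NPIs instantly; it does not replace the NPPES cross-check, which confirms the NPI is actually registered and active.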
- Are internal staff working EDI rejections on inbound submissions? If not, why not? What is the turnaround time for working these rejections? High numbers of rejections also indicate that provider education might be in order, or that the clearinghouse’s requirements and/or edits need adjustment.
CMS Best Practice Tool #2: Highlighter
Did you know that, even if you are not identified as out of compliance or an outlier on reports, you might still have issues with complete and accurate data?
- Review report cards with a highlighter to improve understanding of submissions at the contract-level. Unlike in the pre-EDPS world of RAPS prior to 2012, CMS has clearly defined metrics for the completeness and quality of data. How well a plan is meeting these metrics is detailed on quarterly report cards issued to each plan.
- My best two pieces of advice for reviewing these report cards are as follows:
- Define your “encounter universe”. Hint: It isn’t simply what is in-house and eligible for submission to CMS because it is paid or denied. Before you determine the universe, you need a good understanding of what you consider to be the denominator that represents the claims universe, as well as what accuracy percentage is acceptable to your organization (total accepted/total submitted). For example, claims volume that is rejected by the clearinghouse, held in Pend status, sent out for external review, etc. may not show up in the denominator, giving a plan a false sense of submitting complete data when it really isn’t. Look at every input into the universe from the start of the revenue cycle to the end, not just what resulted in adjudication as paid or denied and will get picked up for submission to CMS. CMS might not flag this as an outlier compared to other cohort plans, but your payments will certainly reflect the incomplete data set. Incorrectly accounting for duplicates and adjustments can also affect the accuracy of your universe, as can incorrectly accounting for supplemental data submissions.
- Have an “early warning system” in place. Identify issues before they manifest in yellow on your report card; being in the yellow should NEVER be a surprise. Even if you have not been flagged in yellow as an outlier, that does NOT mean you are doing a good job with submissions. Internal audits will help establish and monitor your status.
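The universe/denominator point above reduces to a simple calculation; the status values below are hypothetical stand-ins for whatever your adjudication system records. The key is that the denominator counts the full universe, including claims that never reached CMS.

```python
def acceptance_rate(claims: list) -> float:
    """Acceptance rate over the FULL encounter universe.

    The denominator deliberately includes claims that never reached CMS
    (clearinghouse-rejected, pended, out for external review), not just
    what was adjudicated as paid or denied and picked up for submission.
    Status values are illustrative placeholders.
    """
    universe = len(claims)
    accepted = sum(1 for c in claims if c["status"] == "accepted_by_cms")
    return accepted / universe if universe else 0.0
```

A plan that quietly drops pended and clearinghouse-rejected volume from the denominator will report a flattering rate while its risk-adjusted payments tell the real story.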
CMS Best Practice Tool #3: Blueprint
You wouldn’t dream of beginning to build a structure without a design blueprint, and this is no different.
Establish regularly scheduled meetings that include all internal and external parties involved in the creation and submission of encounter data to achieve an integrated and high-quality process.
The key to doing this is accountable partnerships. There must be clearly defined and documented deployment of all people, processes and technology, and clear definition of accountabilities. Your blueprint includes the following:
- Inclusivity. Include all internal and external parties involved in the creation and submission of encounter data. You will also need to include vendors and clearinghouses to achieve an integrated and high-quality process, as well as those who represent providers (such as provider relations).
- Visibility. Some vendors do not give health plans clear visibility into the status of their own operations. External resources often make it difficult for health plans to hold them accountable. To that I say: they are being employed and paid to provide a product and service – hold them accountable. It’s your data and ultimately you pay the price for its incompleteness or poor quality.
- Properly staffed. Another complaint I hear is that vendors and clearinghouses have long change control processes. If this is something that you encounter, you must address this issue and ensure that they are resourced appropriately and can properly support their software development lifecycle.
- More is not better. A whole army of the wrong people can do more harm than good. This is particularly true for the monthly CMS calls, which should be attended only by your core team members. The following people should make up your core team:
- Risk adjustment lead
- Data submission lead, ideally reporting to the overall risk adjustment delivery owner
- Lead subject matter expert for data submission and risk adjustment
- Business analyst to assemble a summary and propose impacts to people, processes and technology, allowing attendees to focus on listening and understanding.

Regular team meetings should include core team members, plus other identified stakeholders, such as an IT representative and representatives of the claims/enrollment/provider departments, depending upon the operational structure. In another part of this series, I will address choosing the right team for your unique operational needs and structure.
- Empower participants. It does no good to hold meetings with participants who are not empowered to make necessary changes. Usually the person with overall responsibility for risk adjustment operations needs to be able to keep the group’s focus on what matters most. This person best understands how much incomplete and inaccurate data costs the organization in risk-adjusted dollars.

The risk adjustment lead is often tasked with providing the justification leadership needs to allocate resources. The ability to make changes, if necessary, to achieve an integrated, high-quality process should be a given. Otherwise, what is the point of being held accountable for knowing, at ALL times, what encounter rejections could cost in risk-adjusted dollars?

Ultimately, a health plan’s internal audit team needs to ensure that encounter data submission is in its purview. Periodic objective evaluation, both on a set schedule and at random, is beneficial and necessary to ensure an integrated, high-quality process, and the internal audit department can serve as this objective evaluator using the blueprint as its source of truth.
In conclusion, if you find yourself without a magic bullet, a yellow highlighter or a blueprint as you attempt your own risk adjustment “renovation”, know that there are tools to show you where you stand in comparison to best practices and to help you address any gaps. Contact us today to learn how our consultative advisory approach can assist your organization.
Need assistance with your Risk Adjustment Program? Contact us to experience a Product Demo 888.447.8908 • firstname.lastname@example.org
About Centauri Health Solutions
Centauri Health Solutions focuses on revealing care opportunities through its suite of products and services. Delivering data-driven services, through private cloud-based software solutions, the firm provides comprehensive data management designed specifically for risk adjustment and quality-based revenue programs, in addition to enrollment and eligibility solutions. Centauri improves member outcomes and financial performance for health plans, hospitals and at-risk providers by supporting initiatives in health care enrollment, risk adjustment, RADV risk mitigation, HEDIS and Star Ratings. Headquartered in Scottsdale, Ariz., Centauri Health Solutions also has offices in Cleveland, Ohio, Nashville, Tenn., and Orlando, Fla. For more information, visit www.centaurihs.com.