Medicare and Medicaid Programs; CY 2016 Home Health Prospective Payment System Rate Update; Home Health Value-Based Purchasing Model; and Home Health Quality Reporting Requirements

80 Fed. Reg. 39839 (Jul. 10, 2015)

AGENCY:

Centers for Medicare & Medicaid Services (CMS), HHS.

ACTION:

Proposed rule.

SUMMARY:

This proposed rule would update Home Health Prospective Payment System (HH PPS) rates, including the national, standardized 60-day episode payment rates, the national per-visit rates, and the non-routine medical supply (NRS) conversion factor under the Medicare prospective payment system for home health agencies (HHAs), effective for episodes ending on or after January 1, 2016. As required by the Affordable Care Act, this proposed rule implements the third year of the four-year phase-in of the rebasing adjustments to the HH PPS payment rates. This proposed rule provides information on our efforts to monitor the potential impacts of the rebasing adjustments. This proposed rule also proposes: reductions to the national, standardized 60-day episode payment rate in CY 2016 and CY 2017 of 1.72 percent in each year to account for estimated case-mix growth unrelated to increases in patient acuity (nominal case-mix growth) between CY 2012 and CY 2014; a HH value-based purchasing (HHVBP) model to be implemented beginning January 1, 2016 in which all Medicare-certified HHAs in selected states will be required to participate; changes to the home health quality reporting program requirements; and minor technical regulations text changes. Finally, this proposed rule would update the HH PPS case-mix weights using the most current, complete data available at the time of rulemaking and provide an update on the Report to Congress regarding the home health (HH) study.

DATES:

To be assured consideration, comments must be received at one of the addresses provided below, no later than 5 p.m. on September 4, 2015.

ADDRESSES:

In commenting, please refer to file code CMS-1625-P. Because of staff and resource limitations, we cannot accept comments by facsimile (FAX) transmission.

You may submit comments in one of four ways (please choose only one of the ways listed):

1. Electronically. You may submit electronic comments on this regulation to http://www.regulations.gov. Follow the instructions under the “More Search Options” tab.

2. By regular mail. You may mail written comments to the following address ONLY: Centers for Medicare & Medicaid Services, Department of Health and Human Services, Attention: CMS-1625-P, P.O. Box 8016, Baltimore, MD 21244-8016.

Please allow sufficient time for mailed comments to be received before the close of the comment period.

3. By express or overnight mail. You may send written comments to the following address ONLY: Centers for Medicare & Medicaid Services, Department of Health and Human Services, Attention: CMS-1625-P, Mail Stop C4-26-05, 7500 Security Boulevard, Baltimore, MD 21244-1850.

4. By hand or courier. If you prefer, you may deliver (by hand or courier) your written comments before the close of the comment period to either of the following addresses:

a. For delivery in Washington, DC—Centers for Medicare & Medicaid Services, Department of Health and Human Services, Room 445-G, Hubert H. Humphrey Building, 200 Independence Avenue SW., Washington, DC 20201.

(Because access to the interior of the Hubert H. Humphrey Building is not readily available to persons without federal government identification, commenters are encouraged to leave their comments in the CMS drop slots located in the main lobby of the building. A stamp-in clock is available for persons wishing to retain a proof of filing by stamping in and retaining an extra copy of the comments being filed.)

b. For delivery in Baltimore, MD—Centers for Medicare & Medicaid Services, Department of Health and Human Services, 7500 Security Boulevard, Baltimore, MD 21244-1850.

If you intend to deliver your comments to the Baltimore address, please call (410) 786-7195 in advance to schedule your arrival with one of our staff members.

Comments mailed to the addresses indicated as appropriate for hand or courier delivery may be delayed and received after the comment period.

For information on viewing public comments, see the beginning of the SUPPLEMENTARY INFORMATION section.

FOR FURTHER INFORMATION CONTACT:

Hillary Loeffler, (410) 786-0456, for general information about the HH PPS.

Michelle Brazil, (410) 786-1648, for information about the HH quality reporting program.

Lori Teichman, (410) 786-6684, for information about HHCAHPS.

Robert Flemming, (844) 280-5628, for information about the HHVBP model.

SUPPLEMENTARY INFORMATION:

Inspection of Public Comments: All comments received before the close of the comment period are available for viewing by the public, including any personally identifiable or confidential business information that is included in a comment. We post all comments received before the close of the comment period on the following Web site as soon as possible after they have been received: http://www.regulations.gov. Follow the search instructions on that Web site to view public comments.

Comments received timely will also be available for public inspection as they are received, generally beginning approximately 3 weeks after publication of a document, at the headquarters of the Centers for Medicare & Medicaid Services, 7500 Security Boulevard, Baltimore, Maryland 21244, Monday through Friday of each week from 8:30 a.m. to 4 p.m. EST.

To schedule an appointment to view public comments, phone 1-800-743-3951.

Table of Contents

I. Executive Summary

A. Purpose

B. Summary of the Major Provisions

C. Summary of Costs and Benefits

II. Background

A. Statutory Background

B. System for Payment of Home Health Services

C. Updates to the Home Health Prospective Payment System

D. Advancing Health Information Exchange

III. Proposed Provisions of the Home Health Prospective Payment System

A. Monitoring for Potential Impacts—Affordable Care Act Rebasing Adjustments

B. CY 2016 HH PPS Case-Mix Weights and Proposed Reduction to the National, Standardized 60-Day Episode Payment Rate To Account for Nominal Case-Mix Growth

1. CY 2016 HH PPS Case-Mix Weights

2. Reduction to the National, Standardized 60-Day Episode Payment Rate to Account for Nominal Case-Mix Growth

C. CY 2016 Home Health Rate Update

1. CY 2016 Home Health Market Basket Update

2. CY 2016 Home Health Wage Index

3. CY 2016 Annual Payment Update

a. Background

b. CY 2016 National, Standardized 60-Day Episode Payment Rate

c. CY 2016 National Per-Visit Rates

d. Low-Utilization Payment Adjustment (LUPA) Add-On Factors

e. CY 2016 Nonroutine Medical Supply Payment Rates

f. Rural Add-On

D. Payments for High-Cost Outliers Under the HH PPS

E. Report to Congress on the Home Health Study Required by Section 3131(d) of the Affordable Care Act and an Update on Subsequent Research and Analysis

F. Technical Regulations Text Changes

IV. Proposed Home Health Value-Based Purchasing (HHVBP) Model

V. Proposed Provisions of the Home Health Care Quality Reporting Program (HH QRP)

A. Background and Statutory Authority

B. General Considerations Used for the Selection of Quality Measures for the HH QRP

C. HH QRP Quality Measures and Measures Under Consideration for Future Years

D. Form, Manner, and Timing of OASIS Data Submission and OASIS Data for Annual Payment Update

1. Statutory Authority

2. Home Health Quality Reporting Program Requirements for CY 2016 Payment and Subsequent Years

3. Previously Established Pay-for-Reporting Performance Requirement for Submission of OASIS Quality Data

E. Home Health Care CAHPS Survey (HHCAHPS)

1. Background and Description of HHCAHPS

2. HHCAHPS Oversight Activities

3. HHCAHPS Requirements for the CY 2016 APU

4. HHCAHPS Requirements for the CY 2017 APU

5. HHCAHPS Requirements for the CY 2018 APU

6. HHCAHPS Reconsideration and Appeals Process

7. Summary

F. Public Display of Home Health Quality Data for the HH QRP

VI. Collection of Information Requirements

VII. Response to Comments

VIII. Regulatory Impact Analysis

Regulations Text

Acronyms

In addition, because of the many terms to which we refer by abbreviation in this proposed rule, we are listing these abbreviations and their corresponding terms in alphabetical order below:

ACH LOS Acute Care Hospital Length of Stay

ADL Activities of Daily Living

APU Annual Payment Update

BBA Balanced Budget Act of 1997, Pub. L. 105-33

BBRA Medicare, Medicaid, and SCHIP Balanced Budget Refinement Act of 1999, Pub. L. 106-113

CAD Coronary Artery Disease

CAH Critical Access Hospital

CBSA Core-Based Statistical Area

CASPER Certification and Survey Provider Enhanced Reports

CHF Congestive Heart Failure

CMI Case-Mix Index

CMP Civil Money Penalty

CMS Centers for Medicare & Medicaid Services

CoPs Conditions of Participation

COPD Chronic Obstructive Pulmonary Disease

CVD Cardiovascular Disease

CY Calendar Year

DM Diabetes Mellitus

DRA Deficit Reduction Act of 2005, Pub. L. 109-171, enacted February 8, 2006

FDL Fixed Dollar Loss

FI Fiscal Intermediary

FR Federal Register

FY Fiscal Year

HAVEN Home Assessment Validation and Entry System

HCC Hierarchical Condition Categories

HCIS Health Care Information System

HH Home Health

HHA Home Health Agency

HHCAHPS Home Health Care Consumer Assessment of Healthcare Providers and Systems Survey

HH PPS Home Health Prospective Payment System

HHRG Home Health Resource Group

HHVBP Home Health Value-Based Purchasing

HIPPS Health Insurance Prospective Payment System

HVBP Hospital Value-Based Purchasing

ICD-9-CM International Classification of Diseases, Ninth Revision, Clinical Modification

ICD-10-CM International Classification of Diseases, Tenth Revision, Clinical Modification

IH Inpatient Hospitalization

IMPACT Act Improving Medicare Post-Acute Care Transformation Act of 2014, Pub. L. 113-185

IRF Inpatient Rehabilitation Facility

LEF Linear Exchange Function

LTCH Long-Term Care Hospital

LUPA Low-Utilization Payment Adjustment

MEPS Medical Expenditures Panel Survey

MMA Medicare Prescription Drug, Improvement, and Modernization Act of 2003, Pub. L. 108-173, enacted December 8, 2003

MSA Metropolitan Statistical Area

MSS Medical Social Services

NQF National Quality Forum

NQS National Quality Strategy

NRS Non-Routine Supplies

OASIS Outcome and Assessment Information Set

OBRA Omnibus Budget Reconciliation Act of 1987, Pub. L. 100-203, enacted December 22, 1987

OCESAA Omnibus Consolidated and Emergency Supplemental Appropriations Act, Pub. L. 105-277, enacted October 21, 1998

OES Occupational Employment Statistics

OIG Office of Inspector General

OT Occupational Therapy

OMB Office of Management and Budget

MFP Multifactor Productivity

PAMA Protecting Access to Medicare Act of 2014

PAC-PRD Post-Acute Care Payment Reform Demonstration

PEP Partial Episode Payment Adjustment

PT Physical Therapy

PY Performance Year

PRRB Provider Reimbursement Review Board

QAP Quality Assurance Plan

RAP Request for Anticipated Payment

RF Renal Failure

RFA Regulatory Flexibility Act, Pub. L. 96-354

RHHIs Regional Home Health Intermediaries

RIA Regulatory Impact Analysis

SAF Standard Analytic File

SLP Speech-Language Pathology

SN Skilled Nursing

SNF Skilled Nursing Facility

TPS Total Performance Score

UMRA Unfunded Mandates Reform Act of 1995

VBP Value-Based Purchasing

I. Executive Summary

A. Purpose

This proposed rule would update the payment rates for HHAs for calendar year (CY) 2016, as required under section 1895(b) of the Social Security Act (the Act). This would reflect the third year of the four-year phase-in of the rebasing adjustments to the national, standardized 60-day episode payment rate, the national per-visit rates, and the NRS conversion factor finalized in the CY 2014 HH PPS final rule (78 FR 72256), as required under section 3131(a) of the Patient Protection and Affordable Care Act of 2010 (Pub. L. 111-148), as amended by the Health Care and Education Reconciliation Act of 2010 (Pub. L. 111-152) (collectively referred to as the “Affordable Care Act”).

This proposed rule also discusses our efforts to monitor the potential impacts of the rebasing adjustments mandated by section 3131(a) of the Affordable Care Act. This rule proposes: Reductions to the national, standardized 60-day episode payment rate in CY 2016 and CY 2017 of 1.72 percent in each year to account for case-mix growth unrelated to increases in patient acuity (nominal case-mix growth) between CY 2012 and CY 2014 under the authority of section 1895(b)(3)(B)(iv) of the Act; a HH Value-Based Purchasing (VBP) model, in which certain Medicare-certified HHAs would be required to participate beginning January 1, 2016, under the authority of section 1115A of the Act; changes to the home health quality reporting program requirements under section 1895(b)(3)(B)(v)(II) of the Act; and minor technical regulations text changes in 42 CFR parts 409, 424, and 484 to better align the payment requirements with recent statutory and regulatory changes for home health services. Finally, this proposed rule would update the case-mix weights under section 1895(b)(4)(A)(i) and (b)(4)(B) of the Act and provide an update on the Report to Congress regarding the HH study required by section 3131(d) of the Affordable Care Act.

B. Summary of the Major Provisions

As required by section 3131(a) of the Affordable Care Act, and finalized in the CY 2014 HH PPS final rule, “Medicare and Medicaid Programs; Home Health Prospective Payment System Rate Update for 2014, Home Health Quality Reporting Requirements, and Cost Allocation of Home Health Survey Expenses” (78 FR 72256, December 2, 2013), we are implementing the third year of the four-year phase-in of the rebasing adjustments to the national, standardized 60-day episode payment amount, the national per-visit rates, and the NRS conversion factor, as described in section III.C.3. The rebasing adjustments for CY 2016 would reduce the national, standardized 60-day episode payment amount by $80.95, increase the national per-visit payment amounts by 3.5 percent of the CY 2010 national per-visit payment amounts, with the increases ranging from $1.79 for home health aide services to $6.34 for medical social services, and reduce the NRS conversion factor by 2.82 percent.

This proposed rule also discusses our efforts to monitor the potential impacts of the rebasing adjustments in section III.A. In the CY 2015 HH PPS final rule (79 FR 66072), we finalized our proposal to recalibrate the case-mix weights every year with more current data. In section III.B.1 of this rule, we are recalibrating the HH PPS case-mix weights, using the most current cost and utilization data available, in a budget neutral manner. In addition, in section III.B.2 of this rule, we propose to reduce the national, standardized 60-day episode payment rate in CY 2016 and CY 2017 by 1.72 percent in each year to account for estimated case-mix growth unrelated to increases in patient acuity (nominal case-mix growth) between CY 2012 and CY 2014. In section III.C.1 of this rule, we propose to update the payment rates under the HH PPS by the home health payment update percentage of 2.3 percent (using the 2010-based Home Health Agency (HHA) market basket update of 2.9 percent, minus 0.6 percentage point for productivity, as required by section 1895(b)(3)(B)(vi)(I) of the Act). In the CY 2015 final rule (79 FR 66083 through 66087), we incorporated new geographic area designations, set out in a February 28, 2013 Office of Management and Budget (OMB) bulletin, into the home health wage index. For CY 2015, we implemented a wage index transition policy consisting of a 50/50 blend of the old geographic area delineations and the new geographic area delineations. In section III.C.2 of this proposed rule, we propose to update the CY 2016 home health wage index using solely the new geographic area designations. In section III.D of this proposed rule, we discuss payments for high-cost outliers. In section III.E, we propose to make several technical corrections in 42 CFR parts 409, 424, and 484 to better align the payment requirements with recent statutory and regulatory changes for home health services. The affected sections include § 409.43(e), § 424.22(a), § 484.205(d), § 484.205(e), § 484.220, § 484.225, § 484.230, § 484.240(b), § 484.240(e), § 484.240(f), and § 484.245. In section III.F, we discuss the Report to Congress on the home health study required by section 3131(d) of the Affordable Care Act and provide an update on subsequent research and analysis.

In section IV of this proposed rule, we propose a HHVBP model to be implemented beginning January 1, 2016. Medicare-certified HHAs selected for inclusion in the HHVBP model would be required to compete for payment adjustments to their current PPS reimbursements based on quality performance. A competing Medicare-certified HHA is defined as an agency that has a current Medicare certification and is reimbursed by CMS for home health care delivered within any of the nine states randomly selected under CMS' proposed selection methodology.

This proposed rule also includes changes to the home health quality reporting program in section V, including the proposal of one new quality measure, the establishment of a minimum threshold for submission of Outcome and Assessment Information Set (OASIS) assessments for purposes of quality reporting compliance, and submission dates for the Home Health Care Consumer Assessment of Healthcare Providers and Systems (HHCAHPS) Survey through CY 2018.

C. Summary of Costs and Transfers

Table 1—Summary of Costs and Transfers

Provision description Costs Transfers
CY 2016 HH PPS Payment Rate Update The overall economic impact of the HH PPS payment rate update is an estimated −$350 million (−1.8 percent) in payments to HHAs.
CY 2016 HHVBP Model The overall economic impact of the HHVBP model provision for CY 2018 through 2022 is an estimated $380 million in total savings from a reduction in unnecessary hospitalizations and SNF usage as a result of greater quality improvements in the HH industry. As for payments to HHAs, there are no aggregate increases or decreases to the HHAs competing in the model.

II. Background

A. Statutory Background

The Balanced Budget Act of 1997 (BBA) (Pub. L. 105-33, enacted August 5, 1997), significantly changed the way Medicare pays for Medicare HH services. Section 4603 of the BBA mandated the development of the HH PPS. Until the implementation of the HH PPS on October 1, 2000, HHAs received payment under a retrospective reimbursement system.

Section 4603(a) of the BBA mandated the development of a HH PPS for all Medicare-covered HH services provided under a plan of care (POC) that were paid on a reasonable cost basis by adding section 1895 of the Social Security Act (the Act), entitled “Prospective Payment For Home Health Services.” Section 1895(b)(1) of the Act requires the Secretary to establish a HH PPS for all costs of HH services paid under Medicare.

Section 1895(b)(3)(A) of the Act requires the following: (1) The computation of a standard prospective payment amount include all costs for HH services covered and paid for on a reasonable cost basis and that such amounts be initially based on the most recent audited cost report data available to the Secretary; and (2) the standardized prospective payment amount be adjusted to account for the effects of case-mix and wage levels among HHAs.

Section 1895(b)(3)(B) of the Act addresses the annual update to the standard prospective payment amounts by the HH applicable percentage increase. Section 1895(b)(4) of the Act governs the payment computation. Sections 1895(b)(4)(A)(i) and (b)(4)(A)(ii) of the Act require the standard prospective payment amount to be adjusted for case-mix and geographic differences in wage levels. Section 1895(b)(4)(B) of the Act requires the establishment of an appropriate case-mix change adjustment factor for significant variation in costs among different units of services.

Similarly, section 1895(b)(4)(C) of the Act requires the establishment of wage adjustment factors that reflect the relative level of wages, and wage-related costs applicable to HH services furnished in a geographic area compared to the applicable national average level. Under section 1895(b)(4)(C) of the Act, the wage-adjustment factors used by the Secretary may be the factors used under section 1886(d)(3)(E) of the Act.

Section 1895(b)(5) of the Act gives the Secretary the option to make additions or adjustments to the payment amount otherwise paid in the case of outliers due to unusual variations in the type or amount of medically necessary care. Section 3131(b)(2) of the Patient Protection and Affordable Care Act of 2010 (the Affordable Care Act) (Pub. L. 111-148, enacted March 23, 2010) revised section 1895(b)(5) of the Act so that total outlier payments in a given year would not exceed 2.5 percent of total payments projected or estimated. The provision also made permanent a 10 percent agency-level outlier payment cap.

In accordance with the statute, as amended by the BBA, we published a final rule in the July 3, 2000 Federal Register (65 FR 41128) to implement the HH PPS legislation. The July 2000 final rule established requirements for the new HH PPS for HH services as required by section 4603 of the BBA, as subsequently amended by section 5101 of the Omnibus Consolidated and Emergency Supplemental Appropriations Act (OCESAA) for Fiscal Year 1999, (Pub. L. 105-277, enacted October 21, 1998); and by sections 302, 305, and 306 of the Medicare, Medicaid, and SCHIP Balanced Budget Refinement Act (BBRA) of 1999, (Pub. L. 106-113, enacted November 29, 1999). The requirements include the implementation of a HH PPS for HH services, consolidated billing requirements, and a number of other related changes. The HH PPS described in that rule replaced the retrospective reasonable cost-based system that was used by Medicare for the payment of HH services under Part A and Part B. For a complete and full description of the HH PPS as required by the BBA, see the July 2000 HH PPS final rule (65 FR 41128 through 41214).

Section 5201(c) of the Deficit Reduction Act of 2005 (DRA) (Pub. L. 109-171, enacted February 8, 2006) added new section 1895(b)(3)(B)(v) to the Act, requiring HHAs to submit data for purposes of measuring health care quality and linking the quality data submission to the annual applicable percentage increase. This data submission requirement is applicable for CY 2007 and each subsequent year. If an HHA does not submit quality data, the HH market basket percentage increase is reduced by 2 percentage points. In the November 9, 2006 Federal Register (71 FR 65884, 65935), we published a final rule to implement the pay-for-reporting requirement of the DRA, which was codified at § 484.225(h) and (i) in accordance with the statute. The pay-for-reporting requirement was implemented on January 1, 2007.

The Affordable Care Act made additional changes to the HH PPS. One of the changes in section 3131 of the Affordable Care Act is the amendment to section 421(a) of the Medicare Prescription Drug, Improvement, and Modernization Act of 2003 (MMA) (Pub. L. 108-173, enacted on December 8, 2003) as amended by section 5201(b) of the DRA. Section 421(a) of the MMA, as amended by section 3131 of the Affordable Care Act, requires that the Secretary increase, by 3 percent, the payment amount otherwise made under section 1895 of the Act, for HH services furnished in a rural area (as defined in section 1886(d)(2)(D) of the Act) with respect to episodes and visits ending on or after April 1, 2010, and before January 1, 2016. Section 210 of the Medicare Access and CHIP Reauthorization Act of 2015 (MACRA) (Pub. L. 114-10) amended section 421(a) of the MMA to extend the rural add-on for two more years. Section 421(a) of the MMA, as amended by section 210 of the MACRA, requires that the Secretary increase, by 3 percent, the payment amount otherwise made under section 1895 of the Act, for HH services provided in a rural area (as defined in section 1886(d)(2)(D) of the Act) with respect to episodes and visits ending on or after April 1, 2010, and before January 1, 2018.

B. System for Payment of Home Health Services

Generally, Medicare makes payment under the HH PPS on the basis of a national standardized 60-day episode payment rate that is adjusted for the applicable case-mix and wage index. The national standardized 60-day episode rate includes the six HH disciplines (skilled nursing, HH aide, physical therapy, speech-language pathology, occupational therapy, and medical social services). Payment for non-routine supplies (NRS) is no longer part of the national standardized 60-day episode rate and is computed by multiplying the relative weight for a particular NRS severity level by the NRS conversion factor (see section III.C.3.e of this proposed rule). Payment for durable medical equipment covered under the HH benefit is made outside the HH PPS payment system. To adjust for case-mix, the HH PPS uses a 153-category case-mix classification system to assign patients to a home health resource group (HHRG). The clinical severity level, functional severity level, and service utilization are computed from responses to selected data elements in the OASIS assessment instrument and are used to place the patient in a particular HHRG. Each HHRG has an associated case-mix weight which is used in calculating the payment for an episode.
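
To make the payment structure described above concrete, the following is a minimal, illustrative sketch of how a case-mix weight, a wage index, and the NRS conversion factor combine into an episode payment. All numeric inputs are hypothetical placeholders, not CY 2016 amounts, and the function is not the agency's actual pricer logic.

```python
# Illustrative sketch of the general HH PPS episode payment structure described
# above. All numeric inputs below are hypothetical placeholders.

def episode_payment(base_rate, case_mix_weight, wage_index,
                    labor_share, nrs_conversion_factor, nrs_relative_weight):
    """Return (wage-adjusted episode payment, separate NRS payment)."""
    # Case-mix adjustment: scale the national, standardized 60-day episode rate
    # by the HHRG case-mix weight.
    case_mix_adjusted = base_rate * case_mix_weight

    # Wage adjustment: only the labor portion is multiplied by the wage index.
    wage_adjusted = (case_mix_adjusted * labor_share * wage_index
                     + case_mix_adjusted * (1 - labor_share))

    # NRS is paid separately: severity-level relative weight times the NRS
    # conversion factor.
    nrs_payment = nrs_conversion_factor * nrs_relative_weight
    return wage_adjusted, nrs_payment

# Hypothetical inputs for illustration only.
print(episode_payment(base_rate=2900.00, case_mix_weight=1.0000,
                      wage_index=0.95, labor_share=0.78535,
                      nrs_conversion_factor=52.00, nrs_relative_weight=1.00))
```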

For episodes with four or fewer visits, Medicare pays national per-visit rates based on the discipline(s) providing the services. An episode consisting of four or fewer visits within a 60-day period receives what is referred to as a low-utilization payment adjustment (LUPA). Medicare also adjusts the national standardized 60-day episode payment rate for certain intervening events that are subject to a partial episode payment adjustment (PEP adjustment). For certain cases that exceed a specific cost threshold, an outlier adjustment may also be available.

C. Updates to the Home Health Prospective Payment System

As required by section 1895(b)(3)(B) of the Act, we have historically updated the HH PPS rates annually in the Federal Register. The August 29, 2007 final rule with comment period set forth an update to the 60-day national episode rates and the national per-visit rates under the HH PPS for CY 2008. The CY 2008 HH PPS final rule included an analysis performed on CY 2005 HH claims data, which indicated a 12.78 percent increase in the observed case-mix since 2000. Case-mix represents the variations in conditions of the patient population served by the HHAs. Subsequently, a more detailed analysis was performed on the 2005 case-mix data to evaluate if any portion of the 12.78 percent increase was associated with a change in the actual clinical condition of HH patients. We examined data on demographics, family severity, and non-HH Part A Medicare expenditures to predict the average case-mix weight for 2005. We identified 8.03 percent of the total case-mix change as real, and therefore, decreased the 12.78 percent of total case-mix change by 8.03 percent to get a final nominal case-mix increase measure of 11.75 percent (0.1278 * (1−0.0803) = 0.1175).
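
The nominal case-mix measure above can be reproduced with a few lines of arithmetic; the figures below are simply those cited in the paragraph, and the same formula underlies the 2000-2010 measure discussed later in this section.

```python
# Reproducing the nominal case-mix measure described above:
# nominal growth = total observed case-mix growth x (1 - share judged to be real).
total_case_mix_growth = 0.1278   # 12.78 percent observed increase since 2000
real_share = 0.0803              # 8.03 percent of that change identified as real

nominal_growth = total_case_mix_growth * (1 - real_share)
print(round(nominal_growth, 4))  # 0.1175, i.e., the 11.75 percent figure above
```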

To account for the changes in case-mix that were not related to an underlying change in patient health status, we implemented a reduction, over 4 years, to the national, standardized 60-day episode payment rates. That reduction was to be 2.75 percent per year for 3 years beginning in CY 2008 and 2.71 percent for the fourth year in CY 2011. In the CY 2011 HH PPS final rule (75 FR 70372), we updated our analyses of case-mix change and finalized a reduction of 3.79 percent, instead of 2.71 percent, for CY 2011 and deferred finalizing a payment reduction for CY 2012 until further study of the case-mix change data and methodology was completed.

In the CY 2012 HH PPS final rule (76 FR 68526), we updated the 60-day national episode rates and the national per-visit rates. In addition, as discussed in the CY 2012 HH PPS final rule (76 FR 68528), our analysis indicated that there was a 22.59 percent increase in overall case-mix from 2000 to 2009 and that only 15.76 percent of that overall observed case-mix percentage increase was due to real case-mix change. As a result of our analysis, we identified a 19.03 percent nominal increase in case-mix. At that time, to fully account for the 19.03 percent nominal case-mix growth identified from 2000 to 2009, we finalized a 3.79 percent payment reduction in CY 2012 and a 1.32 percent payment reduction for CY 2013.

In the CY 2013 HH PPS final rule (77 FR 67078), we implemented a 1.32 percent reduction to the payment rates for CY 2013 to account for nominal case-mix growth from 2000 through 2010. When taking into account the total measure of case-mix change (23.90 percent) and the 15.97 percent of total case-mix change estimated as real from 2000 to 2010, we obtained a final nominal case-mix change measure of 20.08 percent from 2000 to 2010 (0.2390 * (1−0.1597) = 0.2008). To fully account for the remainder of the 20.08 percent increase in nominal case-mix beyond that which was accounted for in previous payment reductions, we estimated that the percentage reduction to the national, standardized 60-day episode rates for nominal case-mix change would be 2.18 percent. Although we considered proposing a 2.18 percent reduction to account for the remaining increase in measured nominal case-mix, we instead implemented for CY 2013 the 1.32 percent payment reduction to the national, standardized 60-day episode rates that was finalized in the CY 2012 HH PPS final rule (76 FR 68532).

Section 3131(a) of the Affordable Care Act requires that, beginning in CY 2014, we apply an adjustment to the national, standardized 60-day episode rate and other amounts that reflect factors such as changes in the number of visits in an episode, the mix of services in an episode, the level of intensity of services in an episode, the average cost of providing care per episode, and other relevant factors. Additionally, we must phase in any adjustment over a four-year period in equal increments, not to exceed 3.5 percent of the amount (or amounts) as of the date of enactment of the Affordable Care Act, and fully implement the rebasing adjustments by CY 2017. The statute specifies that the maximum rebasing adjustment is to be no more than 3.5 percent per year of the CY 2010 rates. Therefore, in the CY 2014 HH PPS final rule (78 FR 72256) for each year, CY 2014 through CY 2017, we finalized a fixed-dollar reduction to the national, standardized 60-day episode payment rate of $80.95 per year, increases to the national per-visit payment rates per year as reflected in Table 2, and a decrease to the NRS conversion factor of 2.82 percent per year. We also finalized three separate LUPA add-on factors for skilled nursing, physical therapy, and speech-language pathology and removed 170 diagnosis codes from assignment to diagnosis groups in the HH PPS Grouper. In the CY 2015 HH PPS final rule (79 FR 66032), we implemented the second year of the four-year phase-in of the rebasing adjustments to the HH PPS payment rates and made changes to the HH PPS case-mix weights. In addition, we simplified the face-to-face encounter regulatory requirements and the therapy reassessment timeframes.

Table 2—Maximum Adjustments to the National Per-Visit Payment Rates

[Not to Exceed 3.5 Percent of the Amount(s) in CY 2010]

Discipline 2010 National per-visit payment rates Maximum adjustments per year (CY 2014 through CY 2017)
Skilled Nursing $113.01 $3.96
Home Health Aide 51.18 1.79
Physical Therapy 123.57 4.32
Occupational Therapy 124.40 4.35
Speech-Language Pathology 134.27 4.70
Medical Social Services 181.16 6.34
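
As an illustrative check, the maximum adjustments in Table 2 follow directly from applying the 3.5 percent statutory cap to the CY 2010 per-visit rates; the short sketch below reproduces them.

```python
# Reproducing Table 2: each maximum annual adjustment equals 3.5 percent of the
# corresponding CY 2010 national per-visit payment rate.
rates_2010 = {
    "Skilled Nursing": 113.01,
    "Home Health Aide": 51.18,
    "Physical Therapy": 123.57,
    "Occupational Therapy": 124.40,
    "Speech-Language Pathology": 134.27,
    "Medical Social Services": 181.16,
}
for discipline, rate in rates_2010.items():
    print(f"{discipline}: ${rate * 0.035:.2f}")
# Prints $3.96, $1.79, $4.32, $4.35, $4.70, and $6.34, matching Table 2.
```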

D. Advancing Health Information Exchange

HHS has a number of initiatives designed to encourage and support the adoption of health information technology and to promote nationwide health information exchange to improve health care. As discussed in the August 2013 Statement “Principles and Strategies for Accelerating Health Information Exchange” (available at http://www.healthit.gov/sites/default/files/acceleratinghieprinciples_strategy.pdf ), HHS believes that all individuals, their families, their healthcare and social service providers, and payers should have consistent and timely access to health information in a standardized format that can be securely exchanged between the patient, providers, and others involved in the individual's care. Health IT that facilitates the secure, efficient and effective sharing and use of health-related information when and where it is needed is an important tool for settings across the continuum of care, including home health. While home health providers are not eligible for the Medicare and Medicaid EHR Incentive Programs, effective adoption and use of health information exchange and health IT tools will be essential as these settings seek to improve quality and lower costs through initiatives such as value-based purchasing.

The Office of the National Coordinator for Health Information Technology (ONC) has released a document entitled “Connecting Health and Care for the Nation: A Shared Nationwide Interoperability Roadmap Draft Version 1.0” (draft Roadmap) (available at http://www.healthit.gov/sites/default/files/nationwide-interoperability-roadmap-draft-version-1.0.pdf ), which describes barriers to interoperability across the current health IT landscape, the desired future state that the industry believes will be necessary to enable a learning health system, and a suggested path for moving from the current state to the desired future state. In the near term, the draft Roadmap focuses on actions that will enable a majority of individuals and providers across the care continuum to send, receive, find and use a common set of electronic clinical information at the nationwide level by the end of 2017. The Roadmap's goals also align with the IMPACT Act of 2014, which requires assessment data to be standardized and interoperable to allow for exchange of the data. Moreover, the vision described in the draft Roadmap significantly expands the types of electronic health information, information sources and information users well beyond clinical information derived from electronic health records (EHRs). This shared strategy is intended to reflect important actions that both public and private sector stakeholders can take to enable nationwide interoperability of electronic health information such as: (1) Establishing a coordinated governance framework and process for nationwide health IT interoperability; (2) improving technical standards and implementation guidance for sharing and using a common clinical data set; (3) enhancing incentives for sharing electronic health information according to common technical standards, starting with a common clinical data set; and (4) clarifying privacy and security requirements that enable interoperability.

In addition, ONC has released the draft version of the 2015 Interoperability Standards Advisory (available at http://www.healthit.gov/standards-advisory ), which provides a list of the best available standards and implementation specifications to enable priority health information exchange functions. Providers, payers, and vendors are encouraged to take these “best available standards” into account as they implement interoperable health information exchange across the continuum of care, including care settings such as behavioral health, long-term and post-acute care, and home and community-based service providers.

We encourage stakeholders to utilize health information exchange and certified health IT to effectively and efficiently help providers improve internal care delivery practices, engage patients in their care, support management of care across the continuum, enable the reporting of electronically specified clinical quality measures (eCQMs), and improve efficiencies and reduce unnecessary costs. As adoption of certified health IT increases and interoperability standards continue to mature, HHS will seek to reinforce standards through relevant policies and programs.

III. Proposed Provisions of the Home Health Prospective Payment System

A. Monitoring for Potential Impacts—Affordable Care Act Rebasing Adjustments

1. Analysis of FY 2013 HHA Cost Report Data

As part of our efforts in monitoring the potential impacts of the rebasing adjustments finalized in the CY 2014 HH PPS final rule (78 FR 72293), we continue to update our analysis of home health cost report and claims data. In the CY 2014 HH PPS final rule, using 2011 cost report and 2012 claims data, we estimated the 2013 60-day episode cost to be $2,565.51 (78 FR 72277). In that final rule, we stated that our analysis of 2011 cost report data and 2012 claims data indicated a need for a −3.45 percent rebasing adjustment to the national, standardized 60-day episode payment rate each year for four years. However, as specified by statute, the rebasing adjustment is limited to 3.5 percent of the CY 2010 national, standardized 60-day episode payment rate of $2,312.94 (74 FR 58106), or $80.95. We stated that, because a −3.45 percent adjustment for CY 2014 through CY 2017 would result in larger dollar reductions than the $80.95 maximum allowed under section 3131(a) of the Affordable Care Act, we were limited to implementing a reduction of $80.95 (approximately 2.8 percent for CY 2014) to the national, standardized 60-day episode payment amount each year for CY 2014 through CY 2017.
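
The $80.95 figure cited above is simply the statutory 3.5 percent cap applied to the CY 2010 national, standardized 60-day episode payment rate; a one-line check (illustrative only) is shown below.

```python
# Reproducing the statutory cap described above: the annual rebasing reduction
# may not exceed 3.5 percent of the CY 2010 national, standardized 60-day
# episode payment rate of $2,312.94.
max_annual_reduction = 2312.94 * 0.035
print(f"${max_annual_reduction:.2f}")  # $80.95, the fixed-dollar reduction for CY 2014-CY 2017
```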

In the CY 2015 HH PPS final rule (79 FR 66032 through 66118), using 2012 cost report and 2013 claims data, we estimated the 2013 60-day episode cost to be $2,485.24 (79 FR 66037). Similar to our discussion in the CY 2014 HH PPS final rule, we stated that, absent the Affordable Care Act's limit on rebasing, a −4.21 percent adjustment would have been applied to the national, standardized 60-day episode payment amount each year for CY 2014 through CY 2017 in order to align payments with costs.

For this proposed rule, we analyzed 2013 HHA cost report data and 2013 HHA claims data to determine whether the average cost per episode was higher using 2013 cost report data compared to the 2011 cost report and 2012 claims data used in calculating the rebasing adjustments. To determine the 2013 average cost per visit per discipline, we applied the same trimming methodology outlined in the CY 2014 HH PPS proposed rule (78 FR 40284) and weighted the costs per visit from the 2013 cost reports by size, facility type, and urban/rural location so the costs per visit were nationally representative according to 2013 claims data. The 2013 average number of visits was taken from 2013 claims data. We estimate the cost of a 60-day episode in CY 2013 to be $2,402.11 using 2013 cost report data (Table 3). Our latest analysis of 2013 cost report and 2013 claims data suggests that an even larger reduction (−5.02 percent) than the reduction described in the CY 2014 HH PPS final rule (−3.45 percent) or the reduction described in the CY 2015 HH PPS final rule (−4.21 percent) would have been needed in order to align payments with costs.

Table 3—2013 Estimated Cost per Episode

Discipline 2013 average costs per visit 2013 average number of visits 2013 60-day episode costs
Skilled Nursing $131.43 9.28 $1,219.67
Home Health Aide 59.87 2.41 144.29
Physical Therapy 154.96 5.03 779.45
Occupational Therapy 154.11 1.22 188.01
Speech-Language Pathology 164.59 0.25 41.15
Medical Social Services 211.02 0.14 29.54
Total 18.33 2,402.11
Source: FY 2013 Medicare cost report data and 2013 Medicare claims data from the standard analytic file (as of June 30, 2014) for episodes (excluding low-utilization payment adjusted episodes and partial-episode-payment adjusted episodes) ending on or before December 31, 2013 for which we could link an OASIS assessment.
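
The 60-day episode cost estimate in Table 3 is the sum, across the six disciplines, of the average cost per visit multiplied by the average number of visits; the sketch below (illustrative only) reproduces the table's totals.

```python
# Reproducing Table 3: the estimated CY 2013 60-day episode cost is the sum,
# across the six disciplines, of average cost per visit times average visits.
discipline_data = {
    # discipline: (2013 average cost per visit, 2013 average number of visits)
    "Skilled Nursing": (131.43, 9.28),
    "Home Health Aide": (59.87, 2.41),
    "Physical Therapy": (154.96, 5.03),
    "Occupational Therapy": (154.11, 1.22),
    "Speech-Language Pathology": (164.59, 0.25),
    "Medical Social Services": (211.02, 0.14),
}
total_visits = sum(visits for _, visits in discipline_data.values())
total_cost = sum(cost * visits for cost, visits in discipline_data.values())
print(f"{total_visits:.2f} visits, ${total_cost:,.2f} per 60-day episode")
# 18.33 visits and $2,402.11, matching the totals in Table 3.
```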

2. MedPAC Report to the Congress: Home Health Payment Rebasing

Section 3131(a) of the Affordable Care Act required the Medicare Payment Advisory Commission (MedPAC) to assess, by January 1, 2015, the impact of the mandated rebasing adjustments on quality of and beneficiary access to home health care. As part of this assessment, the statute required MedPAC to consider the impact on care delivered by rural, urban, nonprofit, and for-profit home health agencies. MedPAC's Report to Congress noted that the rebasing adjustments are partially offset by the payment update each year and that, across all four years of the phase-in of the rebasing adjustments, the cumulative net reduction would equal about 2 percent. MedPAC concluded that, as a result of the payment update offsets to the rebasing adjustments, HHA margins are likely to remain high under the current rebasing policy and that quality of care and beneficiary access to care are unlikely to be negatively affected.

Medicare Payment Advisory Commission (MedPAC), “Report to the Congress: Impact of Home Health Payment Rebasing on Beneficiary Access to and Quality of Care”. December 2014. Washington, DC. Accessed on 5/05/15 at: http://www.medpac.gov/documents/reports/dec14_homehealth_rebasing_report.pdf?sfvrsn=0 .

As we noted in the CY 2014 HH PPS final rule (78 FR 72291), MedPAC's past reviews of access to home health care found that access generally remained adequate during periods of substantial decline in the number of agencies. MedPAC stated that this is due in part to the low capital requirements for home health care services, which allow the industry to react rapidly when the supply of agencies expands or contracts. As described in section III.A.3, the number of HHAs billing Medicare for home health services in CY 2013 is 80 percent higher than the number of HHAs billing Medicare for home health services in 2001. Even if some HHAs were to exit the program due to possible reimbursement concerns, the home health market would be expected to remain robust.

3. Analysis of CY 2014 HHA Claims Data

In the CY 2014 HH PPS final rule (78 FR 72256), some commenters expressed concern that the rebasing of the HH PPS payment rates would result in HHA closures and would therefore diminish access to home health services. In addition to examining more recent cost report data, for this proposed rule we examined home health claims data from the first year of the four-year phase-in of the rebasing adjustments (CY 2014), the first full calendar year of the HH PPS (CY 2001), and claims data for the three years before implementation of the rebasing adjustments (CY 2011-2013). Preliminary analysis of CY 2014 home health claims data indicates that the number of episodes decreased by 3.8 percent between 2013 and 2014. In addition, the number of home health users decreased by approximately 3 percent between 2013 and 2014, while the number of FFS beneficiaries remained essentially the same. Between 2013 and 2014 there appears to be a net decrease in the number of HHAs billing Medicare for home health services of 1.6 percent, driven mostly by decreases in TX and FL, two of the six states with the highest utilization of Medicare home health (see Table 4 and Table 5). The HHAs that no longer billed Medicare for home health services in CY 2014 typically served beneficiaries that were nearly twice as likely to be dually eligible for both Medicare and Medicaid in CY 2013 compared to the national average for all HHAs in CY 2013. We note that in CY 2014 there were 3.0 HHAs per 10,000 FFS beneficiaries, the same number of HHAs per 10,000 FFS beneficiaries as there was in 2011, but markedly higher than the 1.9 HHAs per 10,000 FFS beneficiaries in 2001. If we were to exclude the six states with the highest home health utilization (see Table 5), the number of episodes among the remaining states (including Guam, Puerto Rico, and the Virgin Islands) decreased by 2.6 percent between 2013 and 2014, the number of home health users decreased by approximately 2.4 percent between 2013 and 2014, and the number of HHAs billing Medicare for home health services remained virtually the same (a net decrease of only 1 HHA).

We would note that preliminary data on hospital and skilled nursing facility discharges and days indicates that there was a decrease in hospital discharges of approximately 3 percent and a decrease in SNF days of approximately 2 percent in CY 2014. Any decreases in hospital discharges and skilled nursing facility days could, in turn, impact home health utilization as those settings serve as important sources of home health referrals.

Table 4—Home Health Statistics, CY 2001 and CY 2011 Through CY 2014

2001 2011 2012 2013 2014
Number of episodes 3,896,502 6,821,459 6,727,875 6,708,923 6,451,283
Beneficiaries receiving at least 1 episode (Home Health Users) 2,412,318 3,449,231 3,446,122 3,484,579 3,381,635
Part A and/or B FFS beneficiaries 34,899,167 37,686,526 38,224,640 38,505,609 38,506,534
Episodes per Part A and/or B FFS beneficiaries 0.11 0.18 0.18 0.17 0.17
Home health users as a percentage of Part A and/or B FFS beneficiaries 6.9% 9.2% 9.0% 9.0% 8.8%
HHAs providing at least 1 episode 6,511 11,446 11,746 11,889 11,693
HHAs per 10,000 Part A and/or B FFS beneficiaries 1.9 3.0 3.1 3.1 3.0
Source: National claims history (NCH) data obtained from Chronic Condition Warehouse (CCW)—Accessed on May 14, 2014 and August 19, 2014 for CY 2011, CY 2012, and CY 2013 data; and accessed on May 7, 2015 for CY 2001 and CY 2014 data. Medicare enrollment information obtained from the CCW Master Beneficiary Summary File. Beneficiaries are the total number of beneficiaries in a given year with at least 1 month of Part A and/or Part B Fee-for-Service coverage without having any months of Medicare Advantage coverage.
Note(s): These results include all episode types (Normal, PEP, Outlier, LUPA) and also include episodes from outlying areas (outside of 50 States and District of Columbia). Only episodes with a through date in the year specified are included. Episodes with a claim frequency code equal to “0” (“Non-payment/zero claims”) and “2” (“Interim—first claim”) are excluded. If a beneficiary is treated by providers from multiple states within a year the beneficiary is counted within each state's unique number of beneficiaries served.
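
The derived rows in Table 4 (episodes per FFS beneficiary, home health users as a percentage of FFS beneficiaries, and HHAs per 10,000 FFS beneficiaries) follow directly from the counts in the table; the sketch below reproduces the CY 2014 column as an illustration.

```python
# Reproducing the CY 2014 derived rows of Table 4 from the raw counts.
episodes = 6_451_283
hh_users = 3_381_635
ffs_beneficiaries = 38_506_534
hhas = 11_693

print(round(episodes / ffs_beneficiaries, 2))         # 0.17 episodes per FFS beneficiary
print(f"{hh_users / ffs_beneficiaries:.1%}")          # 8.8% of FFS beneficiaries used home health
print(round(hhas / ffs_beneficiaries * 10_000, 1))    # 3.0 HHAs per 10,000 FFS beneficiaries
```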

For the six states (TX, LA, OK, MS, FL, and IL) with the highest utilization of Medicare home health (as measured by the number of episodes per Part A and/or Part B FFS beneficiaries), the number of episodes decreased by 5.7 percent, the number of home health users decreased by 4.3 percent, and the number of HHAs billing Medicare decreased by 3.7 percent (from 5,280 to 5,085) between 2013 and 2014 (see Table 5). A possible contributing factor to these decreases may be the temporary moratorium on the enrollment of new HHAs, effective July 31, 2013, for Miami, FL and Chicago, IL and the temporary moratorium on enrollment of new HHAs, effective February 4, 2014, for Fort Lauderdale, FL; Detroit, MI; Dallas, TX; and Houston, TX. The temporary moratoria on enrollment of new HHAs in Miami, FL; Chicago, IL; Fort Lauderdale, FL; Detroit, MI; Dallas, TX; and Houston, TX were extended for 6 months on August 1, 2014 and again for 6 months effective January 29, 2015 (80 FR 5551).

Table 5—Home Health Statistics for the States with the Highest Number of Home Health Episodes per Part A and/or Part B FFS Beneficiaries, CY 2001 and CY 2011 Through CY 2014

Year TX FL OK MS LA IL
Number of Episodes 2001 285,710 284,579 77,149 73,353 124,789 162,686
2011 1,107,605 701,426 203,112 153,983 249,479 433,117
2012 1,054,244 691,255 196,887 148,516 230,115 423,462
2013 995,555 689,269 196,713 143,428 215,590 421,309
2014 941,815 651,940 189,421 141,293 196,495 389,850
Beneficiaries Receiving at Least 1 Episode (Home Health Users) 2001 155,802 195,678 36,919 35,769 50,760 105,115
2011 363,474 355,900 67,218 55,818 77,677 192,921
2012 350,803 354,838 65,948 55,438 74,755 191,936
2013 333,396 357,099 66,502 55,453 73,888 191,961
2014 319,492 343,231 65,392 54,890 69,328 179,835
Part A and/or Part B FFS Beneficiaries 2001 2,132,310 2,246,313 480,556 436,751 528,287 1,543,158
2011 2,597,406 2,454,124 549,687 476,497 561,531 1,785,278
2012 2,604,458 2,451,790 558,500 480,218 568,483 1,812,241
2013 2,535,611 2,454,216 568,815 483,439 574,654 1,836,862
2014 2,564,292 2,464,748 580,267 491,482 575,832 1,674,935
Episodes per Part A and/or Part B FFS beneficiaries 2001 0.13 0.13 0.16 0.17 0.24 0.11
2011 0.43 0.29 0.37 0.32 0.44 0.24
2012 0.40 0.28 0.35 0.31 0.40 0.23
2013 0.39 0.28 0.35 0.30 0.38 0.23
2014 0.37 0.26 0.33 0.29 0.34 0.23
Home Health Users as a Percentage of Part A and/or Part B FFS Beneficiaries 2001 7.3% 8.7% 7.7% 8.2% 9.6% 6.8%
2011 14.0% 14.5% 12.2% 11.7% 13.8% 10.8%
2012 13.5% 14.5% 11.8% 11.5% 13.2% 10.6%
2013 13.2% 14.6% 11.7% 11.5% 12.9% 10.5%
2014 12.5% 13.9% 11.3% 11.2% 12.0% 10.7%
HHAs Providing at Least 1 Episode 2001 799 330 180 61 242 273
2011 2,472 1,426 252 51 216 743
2012 2,549 1,430 254 48 213 783
2013 2,600 1,357 262 48 210 803
2014 2,558 1,230 262 46 205 784
HHAs per 10,000 Part A and/or B FFS beneficiaries 2001 3.7 1.5 3.7 1.4 4.6 1.8
2011 9.5 5.8 4.6 1.1 3.8 4.2
2012 9.8 5.8 4.5 1.0 3.7 4.3
2013 10.3 5.5 4.6 1.0 3.7 4.4
2014 10.0 5.0 4.5 0.9 3.6 4.7
Source: National claims history (NCH) data obtained from Chronic Condition Warehouse (CCW)—Accessed on May 14, 2014 and August 19, 2014 for CY 2011, CY 2012, and CY 2013 data; and accessed on May 7, 2015 for CY 2001 and CY 2014 data. Medicare enrollment information obtained from the CCW Master Beneficiary Summary File. Beneficiaries are the total number of beneficiaries in a given year with at least 1 month of Part A and/or Part B Fee-for-Service coverage without having any months of Medicare Advantage coverage.
Note(s): These results include all episode types (Normal, PEP, Outlier, LUPA) and also include episodes from outlying areas (outside of 50 States and District of Columbia). Only episodes with a through date in the year specified are included. Episodes with a claim frequency code equal to “0” (“Non-payment/zero claims”) and “2” (“Interim—first claim”) are excluded. If a beneficiary is treated by providers from multiple states within a year the beneficiary is counted within each state's unique number of beneficiaries served.
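
The six-state percentage declines cited before Table 5 can be verified from the CY 2013 and CY 2014 counts in the table; the sketch below (illustrative only) reproduces the 5.7 percent, 4.3 percent, and 3.7 percent figures.

```python
# Checking the six-state declines cited above against the CY 2013 and CY 2014
# counts in Table 5 (columns ordered TX, FL, OK, MS, LA, IL).
episodes_2013 = [995_555, 689_269, 196_713, 143_428, 215_590, 421_309]
episodes_2014 = [941_815, 651_940, 189_421, 141_293, 196_495, 389_850]
users_2013 = [333_396, 357_099, 66_502, 55_453, 73_888, 191_961]
users_2014 = [319_492, 343_231, 65_392, 54_890, 69_328, 179_835]
hhas_2013 = [2_600, 1_357, 262, 48, 210, 803]
hhas_2014 = [2_558, 1_230, 262, 46, 205, 784]

def pct_decline(before, after):
    return (sum(before) - sum(after)) / sum(before)

print(f"{pct_decline(episodes_2013, episodes_2014):.1%}")  # 5.7% fewer episodes
print(f"{pct_decline(users_2013, users_2014):.1%}")        # 4.3% fewer home health users
print(sum(hhas_2013), sum(hhas_2014),                      # 5,280 and 5,085 HHAs,
      f"{pct_decline(hhas_2013, hhas_2014):.1%}")          # a 3.7% decline
```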

In addition to examining home health claims data from the first year of the implementation of rebasing adjustments required by the Affordable Care Act and comparing utilization in that year (CY 2014) to the three years prior and to the first calendar year following the implementation of the HH PPS (CY 2001), we subsequently examined trends in home health utilization for all years starting in CY 2001 and up through CY 2014. Figure 1 displays the average number of visits per 60-day episode of care and the average payment per visit. While the average payment per visit has steadily increased from approximately $116 in CY 2001 to $162 for CY 2014, the average total number of visits per 60-day episode of care has declined, most notably between CY 2009 (21.7 visits per episode) and CY 2014 (18.0 visits per episode). As noted in section II.C, we implemented a series of reductions to the national, standardized 60-day episode payment rate to account for increases in nominal case-mix, starting in CY 2008. The reductions to the 60-day episode rate were: 2.75 percent each year for CY 2008, CY 2009, and CY 2010; 3.79 percent for CY 2011 and CY 2012; and a 1.32 percent payment reduction for CY 2013. Figure 2 displays the average number of visits by discipline type for a 60-day episode of care and shows that, while the number of therapy visits per 60-day episode of care has increased slightly, the numbers of skilled nursing and home health aide visits have decreased between CY 2009 and CY 2014. Section III.F describes the results of the home health study required by section 3131(d) of the Affordable Care Act, which suggests that the current home health payment system may discourage HHAs from serving patients with clinically complex and/or poorly controlled chronic conditions who do not qualify for therapy but require a large number of skilled nursing visits. The home health study results seem to be consistent with the recent trend of fewer visits per episode of care, driven by decreases in skilled nursing and home health aide services, evident in Figures 1 and 2.

We will continue to monitor for potential impacts due to the rebasing adjustments required by section 3131(a) of the Affordable Care Act and other policy changes in the future. The independent effects of any one policy may be difficult to discern in years when multiple policy changes occur.

B. CY 2016 HH PPS Case-Mix Weights and Proposed Reduction to the National, Standardized 60-Day Episode Payment Rate To Account for Nominal Case-Mix Growth

1. CY 2016 HH PPS Case-Mix Weights

For CY 2014, as part of the rebasing effort mandated by the Affordable Care Act, we reset the HH PPS case-mix weights, lowering the average case-mix weight to 1.0000. To lower the HH PPS case-mix weights to 1.0000, each HH PPS case-mix weight was decreased by the same factor (1.3464), thereby maintaining the same relative values between the weights. This “resetting” of the HH PPS case-mix weights was done in a budget neutral manner by inflating the national, standardized 60-day episode rate by the same factor (1.3464) that was used to decrease the weights. For CY 2015, we finalized a policy to annually recalibrate the HH PPS case-mix weights—adjusting the weights relative to one another—using the most current, complete data available. To recalibrate the HH PPS case-mix weights for CY 2016, we propose to use the same methodology finalized in the CY 2008 HH PPS final rule (72 FR 49762), the CY 2012 HH PPS final rule (76 FR 68526), and the CY 2015 HH PPS final rule (79 FR 66032). Annual recalibration of the HH PPS case-mix weights ensures that the case-mix weights reflect, as accurately as possible, current home health resource use and changes in utilization patterns.
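
The budget-neutral nature of the CY 2014 weight resetting described above can be illustrated with a short sketch: dividing each case-mix weight by 1.3464 while multiplying the base episode rate by the same factor leaves every episode's case-mix-adjusted payment unchanged. The weights and base rate below are hypothetical placeholders, not the actual HHRG weights.

```python
# Minimal sketch of the budget-neutral weight "resetting" described above.
# Dividing every case-mix weight by the resetting factor while multiplying the
# base episode rate by the same factor leaves each episode's case-mix-adjusted
# payment unchanged. Weights and rate below are illustrative only.
reset_factor = 1.3464
old_weights = [0.8, 1.1, 1.3464, 1.9, 2.4]       # hypothetical HHRG weights
old_base_rate = 2137.73                          # hypothetical base episode rate

new_weights = [w / reset_factor for w in old_weights]  # average weight becomes 1.0000
new_base_rate = old_base_rate * reset_factor

for old_w, new_w in zip(old_weights, new_weights):
    assert abs(old_base_rate * old_w - new_base_rate * new_w) < 1e-6
print("Per-episode payments are identical before and after the reset.")
```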

To generate the proposed CY 2016 HH PPS case-mix weights, we used CY 2014 home health claims data (as of December 31, 2014) with linked OASIS data. These data are the most current and complete data available at this time. We will use CY 2014 home health claims data (as of June 30, 2015) with linked OASIS data to generate the CY 2016 HH PPS case-mix weights in the CY 2016 HH PPS final rule. The process we used to calculate the HH PPS case-mix weights is outlined below.

Step 1: Re-estimate the four-equation model to determine the clinical and functional points for an episode using wage-weighted minutes of care as our dependent variable for resource use. The wage-weighted minutes of care are determined using the CY 2013 Bureau of Labor Statistics national hourly wage plus fringe rates for the six home health disciplines and the minutes per visit from the claim. The points for each of the variables for each leg of the model, updated with CY 2014 data, are shown in Table 6. The points for the clinical variables are added together to determine an episode's clinical score. The points for the functional variables are added together to determine an episode's functional score.

TABLE 6—Case-Mix Adjustment Variables and Scores

Episode number within sequence of adjacent episodes 1 or 2 1 or 2 3+ 3+
Therapy visits 0-13 14+ 0-13 14+
EQUATION: 1 2 3 4
CLINICAL DIMENSION
1 Primary or Other Diagnosis = Blindness/Low Vision
2 Primary or Other Diagnosis = Blood disorders 6 2
3 Primary or Other Diagnosis = Cancer, selected benign neoplasms 7 7
4 Primary Diagnosis = Diabetes 7 4
5 Other Diagnosis = Diabetes 1
6 Primary or Other Diagnosis = Dysphagia 3 15 1 8
AND
Primary or Other Diagnosis = Neuro 3—Stroke
7 Primary or Other Diagnosis = Dysphagia 1 9 1 9
AND
M1030 (Therapy at home) = 3 (Enteral)
8 Primary or Other Diagnosis = Gastrointestinal disorders
9 Primary or Other Diagnosis = Gastrointestinal disorders 6 6
AND
M1630 (ostomy) = 1 or 2
10 Primary or Other Diagnosis = Gastrointestinal disorders
AND Primary or Other Diagnosis = Neuro 1—Brain disorders and paralysis, OR Neuro 2—Peripheral neurological disorders, OR Neuro 3—Stroke, OR Neuro 4—Multiple Sclerosis
11 Primary or Other Diagnosis = Heart Disease OR Hypertension 1
12 Primary Diagnosis = Neuro 1—Brain disorders and paralysis 3 11 7 11
13 Primary or Other Diagnosis = Neuro 1—Brain disorders and paralysis 2 2
AND
M1840 (Toilet transfer) = 2 or more
14 Primary or Other Diagnosis = Neuro 1—Brain disorders and paralysis OR Neuro 2—Peripheral neurological disorders 2 7 1 5
AND
M1810 or M1820 (Dressing upper or lower body) = 1, 2, or 3
15 Primary or Other Diagnosis = Neuro 3—Stroke 3 9 2 2
16 Primary or Other Diagnosis = Neuro 3—Stroke 4 4
AND
M1810 or M1820 (Dressing upper or lower body) = 1, 2, or 3
17 Primary or Other Diagnosis = Neuro 3—Stroke
AND
M1860 (Ambulation) = 4 or more
18 Primary or Other Diagnosis = Neuro 4—Multiple Sclerosis AND AT LEAST ONE OF THE FOLLOWING: 3 10 7 10
M1830 (Bathing) = 2 or more
OR
M1840 (Toilet transfer) = 2 or more
OR
M1850 (Transferring) = 2 or more
OR
M1860 (Ambulation) = 4 or more
19 Primary or Other Diagnosis = Ortho 1—Leg Disorders or Gait Disorders 8 1 8 1
AND
M1324 (most problematic pressure ulcer stage) = 1, 2, 3 or 4
20 Primary or Other Diagnosis = Ortho 1—Leg OR Ortho 2—Other orthopedic disorders 3 3 6
AND
M1030 (Therapy at home) = 1 (IV/Infusion) or 2 (Parenteral)
21 Primary or Other Diagnosis = Psych 1—Affective and other psychoses, depression
22 Primary or Other Diagnosis = Psych 2—Degenerative and other organic psychiatric disorders
23 Primary or Other Diagnosis = Pulmonary disorders
24 Primary or Other Diagnosis = Pulmonary disorders AND M1860 (Ambulation) = 1 or more
25 Primary Diagnosis = Skin 1—Traumatic wounds, burns, and post-operative complications 4 19 8 19
26 Other Diagnosis = Skin 1—Traumatic wounds, burns, post-operative complications 6 15 8 13
27 Primary or Other Diagnosis = Skin 1—Traumatic wounds, burns, and post-operative complications OR Skin 2—Ulcers and other skin conditions 3
AND
M1030 (Therapy at home) = 1 (IV/Infusion) or 2 (Parenteral)
28 Primary or Other Diagnosis = Skin 2—Ulcers and other skin conditions 2 17 8 17
29 Primary or Other Diagnosis = Tracheostomy 2 16 2 16
30 Primary or Other Diagnosis = Urostomy/Cystostomy 19 11
31 M1030 (Therapy at home) = 1 (IV/Infusion) or 2 (Parenteral) 1 18 6 14
32 M1030 (Therapy at home) = 3 (Enteral) 14 5
33 M1200 (Vision) = 1 or more
34 M1242 (Pain) = 3 or 4 2 1
35 M1308 = Two or more pressure ulcers at stage 3 or 4 5 5 5 14
36 M1324 (Most problematic pressure ulcer stage) = 1 or 2 4 19 7 16
37 M1324 (Most problematic pressure ulcer stage) = 3 or 4 8 32 11 26
38 M1334 (Stasis ulcer status) = 2 4 12 8 12
39 M1334 (Stasis ulcer status) = 3 7 17 10 17
40 M1342 (Surgical wound status) = 2 2 7 5 13
41 M1342 (Surgical wound status) = 3 1 7 5 7
42 M1400 (Dyspnea) = 2, 3, or 4 1 1
43 M1620 (Bowel Incontinence) = 2 to 5 4 4
44 M1630 (Ostomy) = 1 or 2 4 12 2 7
45 M2030 (Injectable Drug Use) = 0, 1, 2, or 3
FUNCTIONAL DIMENSION
46 M1810 or M1820 (Dressing upper or lower body) = 1, 2, or 3 2 1
47 M1830 (Bathing) = 2 or more 6 2 5
48 M1840 (Toilet transferring) = 2 or more 1 4 1 1
49 M1850 (Transferring) = 2 or more 3 2 1
50 M1860 (Ambulation) = 1, 2 or 3 7 4
51 M1860 (Ambulation) = 4 or more 7 9 6 7
Source: CY 2014 Medicare claims data for episodes ending on or before December 31, 2014 (as of December 31, 2014) for which we had a linked OASIS assessment. LUPA episodes, outlier episodes, and episodes with SCIC or PEP adjustments were excluded.
Note(s): Points are additive; however, points may not be given for the same line item in the table more than once. Please see Medicare Home Health Diagnosis Coding guidance at: http://www.cms.hhs.gov/HomeHealthPPS/03_coding&billing.asp for definitions of primary and secondary diagnoses.

In updating the four-equation model for CY 2016 using 2014 data (the last update to the four-equation model, for CY 2015, used 2013 data), there were few changes to the point values for the variables in the four-equation model. These relatively minor changes reflect the change in the relationship between the grouper variables and resource use between 2013 and 2014. The CY 2016 four-equation model resulted in 130 point-giving variables being used in the model (as compared to the 124 variables for the 2015 recalibration). Nine variables were added to the model and three variables were dropped from the model due to the absence of additional resources associated with those variables. The points for 18 variables increased in the CY 2016 four-equation model and the points for 43 variables decreased in the CY 2016 four-equation model. There were 58 variables with the same point values.

Step 2: Re-define the clinical and functional thresholds so they reflect the new points associated with the CY 2016 four-equation model. After estimating the points for each of the variables and summing the clinical and functional points for each episode, we look at the distribution of the clinical score and functional score, breaking the episodes into different steps. The categorizations for the steps are as follows:

  • Step 1: First and second episodes, 0-13 therapy visits.
  • Step 2.1: First and second episodes, 14-19 therapy visits.
  • Step 2.2: Third episodes and beyond, 14-19 therapy visits.
  • Step 3: Third episodes and beyond, 0-13 therapy visits.
  • Step 4: Episodes with 20+ therapy visits.

We then divide the distribution of the clinical score for episodes within a step such that a third of episodes are classified as low clinical score, a third of episodes are classified as medium clinical score, and a third of episodes are classified as high clinical score. We then apply the same approach to the functional score. It was not always possible to evenly divide the episodes within each step into thirds because many episodes were clustered around one particular score. We also looked at the average resource use associated with each clinical and functional score and used that to guide where we placed our thresholds. We tried to group scores with similar average resource use within the same level (even if it meant that more or less than a third of episodes were placed within a level). The new thresholds, based on the CY 2016 four-equation model points, are shown in Table 7.

  • For Step 1, 54 percent of episodes were in the medium functional level (all with a score of 15).
  • For Step 2.1, 77.2 percent of episodes were in the low functional level (most with scores of 2 and 4).
  • For Step 2.2, 67.1 percent of episodes were in the low functional level (all with a score of 0).
  • For Step 3, 60.9 percent of episodes were in the medium functional level (most with a score of 10).
  • For Step 4, 49.8 percent of episodes were in the low functional level (most with a score of 2).
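As a rough illustration of the tercile approach described above (the actual cut points were also guided by average resource use and by the clustering of scores), a short Python sketch with hypothetical scores:

    import numpy as np

    def tercile_cut_points(scores):
        # Candidate low/medium/high cut points: the 33rd and 67th percentiles of the
        # clinical (or functional) score distribution for episodes within one step.
        return np.percentile(scores, [100 / 3, 200 / 3])

    # Hypothetical clinical scores for episodes within a single step.
    scores = np.array([0, 0, 1, 1, 2, 2, 3, 4, 5, 7, 8, 12])
    low_cut, high_cut = tercile_cut_points(scores)
    severity_level = np.digitize(scores, [low_cut, high_cut])   # 0 = low, 1 = medium, 2 = high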

Table 7—CY 2016 Clinical and Functional Thresholds

1st and 2nd episodes 3rd+ episodes All Episodes
0 to 13 therapy visits 14 to 19 therapy visits 0 to 13 therapy visits 14 to 19 therapy visits 20+ therapy visits
Grouping Step: 1 2.1 3 2.2 4
Equation(s) used to calculate points: (see Table 6) 1 2 3 4 (2&4)
Dimension: Severity Level:
Clinical C1 0 to 1 0 0 0 to 3 0 to 3
C2 2 to 3 1 to 7 1 4 to 12 4 to 16
C3 4+ 8+ 2+ 13+ 17+
Functional F1 0 to 14 0 to 6 0 to 6 0 0 to 2
F2 15 7 to 13 7 to 10 1 to 7 3 to 6
F3 16+ 14+ 11+ 8+ 7+

Step 3: Once the clinical and functional thresholds are determined and each episode is assigned a clinical and functional level, the payment regression is estimated with an episode's wage-weighted minutes of care as the dependent variable. Independent variables in the model are indicators for the step of the episode as well as the clinical and functional levels within each step of the episode. Like the four-equation model, the payment regression model is also estimated with robust standard errors that are clustered at the beneficiary level. Table 8 shows the regression coefficients for the variables in the payment regression model updated with CY 2014 data. The R-squared value for the payment regression model is 0.4790 (an increase from 0.4680 for the CY 2015 recalibration).
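The following is a minimal, illustrative Python sketch of the type of regression described above, using synthetic data and hypothetical column names (wwm, step, clin_level, func_level, bene_id); it is not the model or data used for this proposed rule.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Synthetic stand-in for the episode file (all column names and values are illustrative).
    rng = np.random.default_rng(0)
    n = 500
    df = pd.DataFrame({
        "wwm": rng.gamma(4.0, 100.0, size=n),                    # wage-weighted minutes of care
        "step": rng.choice(["1", "2.1", "2.2", "3", "4"], size=n),
        "clin_level": rng.choice(["low", "medium", "high"], size=n),
        "func_level": rng.choice(["low", "medium", "high"], size=n),
        "bene_id": rng.integers(0, 200, size=n),                 # beneficiary identifier for clustering
    })

    # Resource use regressed on step indicators and on clinical/functional levels within each
    # step, with robust standard errors clustered at the beneficiary level.
    model = smf.ols("wwm ~ C(step) + C(step):C(clin_level) + C(step):C(func_level)",
                    data=df).fit(cov_type="cluster", cov_kwds={"groups": df["bene_id"]})
    print(model.params)      # analogous to the coefficients reported in Table 8
    print(model.rsquared)    # analogous to the reported R-squared of 0.4790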

TABLE 8—Payment Regression Model

Variable Description New payment regression coefficients
Step 1, Clinical Score Medium $23.43
Step 1, Clinical Score High 57.50
Step 1, Functional Score Medium 73.18
Step 1, Functional Score High 110.39
Step 2.1, Clinical Score Medium 42.51
Step 2.1, Clinical Score High 163.27
Step 2.1, Functional Score Medium 34.24
Step 2.1, Functional Score High 88.01
Step 2.2, Clinical Score Medium 58.37
Step 2.2, Clinical Score High 210.67
Step 2.2, Functional Score Medium 10.64
Step 2.2, Functional Score High 65.24
Step 3, Clinical Score Medium 9.87
Step 3, Clinical Score High 89.22
Step 3, Functional Score Medium 53.47
Step 3, Functional Score High 83.07
Step 4, Clinical Score Medium 70.04
Step 4, Clinical Score High 231.22
Step 4, Functional Score Medium 14.07
Step 4, Functional Score High 63.20
Step 2.1, 1st and 2nd Episodes, 14 to 19 Therapy Visits 444.92
Step 2.2, 3rd+ Episodes, 14 to 19 Therapy Visits 485.03
Step 3, 3rd+ Episodes, 0-13 Therapy Visits −73.86
Step 4, All Episodes, 20+ Therapy Visits 889.81
Intercept 378.68
Source: CY 2014 Medicare claims data for episodes ending on or before December 31, 2014 (as of December 31, 2014) for which we had a linked OASIS assessment.

Step 4: We use the coefficients from the payment regression model to predict each episode's wage-weighted minutes of care (resource use). We then divide these predicted values by the mean of the dependent variable (that is, the average wage-weighted minutes of care across all episodes used in the payment regression). This division constructs the weight for each episode, which is simply the ratio of the episode's predicted wage-weighted minutes of care to the average wage-weighted minutes of care in the sample. Each episode is then aggregated into one of the 153 home health resource groups (HHRGs) and the “raw” weight for each HHRG is calculated as the average of the episode weights within the HHRG.
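A minimal sketch of this step, assuming a hypothetical episode-level table with columns wwm (actual wage-weighted minutes), predicted_wwm (the prediction from the Step 3 regression), and hhrg (the episode's resource group); the HHRG labels and values are illustrative only.

    import pandas as pd

    # Tiny illustrative episode table (hypothetical values and HHRG labels).
    df = pd.DataFrame({
        "hhrg":          ["1CF1S1", "1CF1S1", "1CF2S1"],
        "wwm":           [350.0, 410.0, 520.0],      # actual wage-weighted minutes of care
        "predicted_wwm": [360.0, 395.0, 505.0],      # predicted by the Step 3 payment regression
    })

    # Episode weight: predicted resource use divided by the sample mean of the dependent variable.
    df["episode_weight"] = df["predicted_wwm"] / df["wwm"].mean()

    # "Raw" HHRG weight: the average episode weight within each HHRG.
    raw_weights = df.groupby("hhrg")["episode_weight"].mean()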

Step 5: The weights associated with 0 to 5 therapy visits are then increased by 3.75 percent, the weights associated with 14-15 therapy visits are decreased by 2.5 percent, and the weights associated with 20+ therapy visits are decreased by 5 percent. These adjustments to the case-mix weights were finalized in the CY 2012 HH PPS final rule (76 FR 68557) and were done to address MedPAC's concerns that the HH PPS overvalues therapy episodes and undervalues non-therapy episodes, and to better align the case-mix weights with episode costs estimated from cost report data.

Medicare Payment Advisory Commission (MedPAC), Report to the Congress: Medicare Payment Policy, March 2011, p. 176.

Step 6: After the adjustments in step 5 are applied to the raw weights, the weights are further adjusted to create an increase in the payment weights for the therapy visit steps between the therapy thresholds. Weights with the same clinical severity level, functional severity level, and early/later episode status are grouped together. Then, within those groups, the weights for each therapy step between thresholds are gradually increased. We do this by interpolating between the main thresholds in the model (from 0-5 to 14-15 therapy visits, and from 14-15 to 20+ therapy visits). We use a linear model to implement the interpolation so that the payment weight increase for each step between the thresholds (such as the increase between 0-5 therapy visits and 6 therapy visits, and the increase between 6 therapy visits and 7-9 therapy visits) is constant. This interpolation is identical to the process finalized in the CY 2012 HH PPS final rule (76 FR 68555).
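A minimal sketch of the linear interpolation described in this step, assuming one group's adjusted weights at the 0-5, 14-15, and 20+ therapy visit thresholds are already known; the function name and example inputs are illustrative, and the handling of endpoints across grouping steps is simplified.

    import numpy as np

    def interpolate_therapy_steps(weight_0_5, weight_14_15, weight_20_plus):
        # Intermediate steps between 0-5 and 14-15 visits: 6, 7-9, 10, and 11-13 visits.
        lower = np.linspace(weight_0_5, weight_14_15, num=6)[1:-1]
        # Intermediate steps between 14-15 and 20+ visits: 16-17 and 18-19 visits.
        upper = np.linspace(weight_14_15, weight_20_plus, num=4)[1:-1]
        return lower, upper

    # Example group whose adjusted weights at the three thresholds are 0.60, 1.41, and 2.25.
    print(interpolate_therapy_steps(0.60, 1.41, 2.25))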

Step 7: The interpolated weights are then adjusted so that the average case-mix for the weights is equal to 1.0000. This last step creates the CY 2016 case-mix weights shown in Table 9.

When computing the average, we compute a weighted average, assigning a value of one to each normal episode and a value equal to the episode length divided by 60 for PEPs.

Table 9—CY 2016 Case-Mix Payment Weights

Payment group Step (episode and/or therapy visit ranges) Clinical and functional levels (1 = low; 2 = medium; 3= high) CY 2016 case-mix weights
10111 1st and 2nd Episodes, 0 to 5 Therapy Visits C1F1S1 0.5969
10112 1st and 2nd Episodes, 6 Therapy Visits C1F1S2 0.7216
10113 1st and 2nd Episodes, 7 to 9 Therapy Visits C1F1S3 0.8462
10114 1st and 2nd Episodes, 10 Therapy Visits C1F1S4 0.9708
10115 1st and 2nd Episodes, 11 to 13 Therapy Visits C1F1S5 1.0954
10121 1st and 2nd Episodes, 0 to 5 Therapy Visits C1F2S1 1.2201
10122 1st and 2nd Episodes, 6 Therapy Visits C1F2S2 1.4237
10123 1st and 2nd Episodes, 7 to 9 Therapy Visits C1F2S3 1.6273
10124 1st and 2nd Episodes, 10 Therapy Visits C1F2S4 0.7123
10125 1st and 2nd Episodes, 11 to 13 Therapy Visits C1F2S5 0.8240
10131 1st and 2nd Episodes, 0 to 5 Therapy Visits C1F3S1 0.9357
10132 1st and 2nd Episodes, 6 Therapy Visits C1F3S2 1.0474
10133 1st and 2nd Episodes, 7 to 9 Therapy Visits C1F3S3 1.1591
10134 1st and 2nd Episodes, 10 Therapy Visits C1F3S4 1.2708
10135 1st and 2nd Episodes, 11 to 13 Therapy Visits C1F3S5 1.4643
10211 1st and 2nd Episodes, 0 to 5 Therapy Visits C2F1S1 1.6578
10212 1st and 2nd Episodes, 6 Therapy Visits C2F1S2 0.7709
10213 1st and 2nd Episodes, 7 to 9 Therapy Visits C2F1S3 0.8868
10214 1st and 2nd Episodes, 10 Therapy Visits C2F1S4 1.0027
10215 1st and 2nd Episodes, 11 to 13 Therapy Visits C2F1S5 1.1186
10221 1st and 2nd Episodes, 0 to 5 Therapy Visits C2F2S1 1.2345
10222 1st and 2nd Episodes, 6 Therapy Visits C2F2S2 1.3504
10223 1st and 2nd Episodes, 7 to 9 Therapy Visits C2F2S3 1.5410
10224 1st and 2nd Episodes, 10 Therapy Visits C2F2S4 1.7316
10225 1st and 2nd Episodes, 11 to 13 Therapy Visits C2F2S5 0.6339
10231 1st and 2nd Episodes, 0 to 5 Therapy Visits C2F3S1 0.7637
10232 1st and 2nd Episodes, 6 Therapy Visits C2F3S2 0.8935
10233 1st and 2nd Episodes, 7 to 9 Therapy Visits C2F3S3 1.0234
10234 1st and 2nd Episodes, 10 Therapy Visits C2F3S4 1.1532
10235 1st and 2nd Episodes, 11 to 13 Therapy Visits C2F3S5 1.2830
10311 1st and 2nd Episodes, 0 to 5 Therapy Visits C3F1S1 1.4994
10312 1st and 2nd Episodes, 6 Therapy Visits C3F1S2 1.7157
10313 1st and 2nd Episodes, 7 to 9 Therapy Visits C3F1S3 0.7492
10314 1st and 2nd Episodes, 10 Therapy Visits C3F1S4 0.8661
10315 1st and 2nd Episodes, 11 to 13 Therapy Visits C3F1S5 0.9830
10321 1st and 2nd Episodes, 0 to 5 Therapy Visits C3F2S1 1.0999
10322 1st and 2nd Episodes, 6 Therapy Visits C3F2S2 1.2169
10323 1st and 2nd Episodes, 7 to 9 Therapy Visits C3F2S3 1.3338
10324 1st and 2nd Episodes, 10 Therapy Visits C3F2S4 1.5400
10325 1st and 2nd Episodes, 11 to 13 Therapy Visits C3F2S5 1.7461
10331 1st and 2nd Episodes, 0 to 5 Therapy Visits C3F3S1 0.8079
10332 1st and 2nd Episodes, 6 Therapy Visits C3F3S2 0.9290
10333 1st and 2nd Episodes, 7 to 9 Therapy Visits C3F3S3 1.0501
10334 1st and 2nd Episodes, 10 Therapy Visits C3F3S4 1.1712
10335 1st and 2nd Episodes, 11 to 13 Therapy Visits C3F3S5 1.2923
21111 1st and 2nd Episodes, 14 to 15 Therapy Visits C1F1S1 1.4134
21112 1st and 2nd Episodes, 16 to 17 Therapy Visits C1F1S2 1.6167
21113 1st and 2nd Episodes, 18 to 19 Therapy Visits C1F1S3 1.8200
21121 1st and 2nd Episodes, 14 to 15 Therapy Visits C1F2S1 0.6876
21122 1st and 2nd Episodes, 16 to 17 Therapy Visits C1F2S2 0.8424
21123 1st and 2nd Episodes, 18 to 19 Therapy Visits C1F2S3 0.9973
21131 1st and 2nd Episodes, 14 to 15 Therapy Visits C1F3S1 1.1522
21132 1st and 2nd Episodes, 16 to 17 Therapy Visits C1F3S2 1.3071
21133 1st and 2nd Episodes, 18 to 19 Therapy Visits C1F3S3 1.4619
21211 1st and 2nd Episodes, 14 to 15 Therapy Visits C2F1S1 1.6962
21212 1st and 2nd Episodes, 16 to 17 Therapy Visits C2F1S2 1.9304
21213 1st and 2nd Episodes, 18 to 19 Therapy Visits C2F1S3 0.8029
21221 1st and 2nd Episodes, 14 to 15 Therapy Visits C2F2S1 0.9449
21222 1st and 2nd Episodes, 16 to 17 Therapy Visits C2F2S2 1.0868
21223 1st and 2nd Episodes, 18 to 19 Therapy Visits C2F2S3 1.2288
21231 1st and 2nd Episodes, 14 to 15 Therapy Visits C2F3S1 1.3707
21232 1st and 2nd Episodes, 16 to 17 Therapy Visits C2F3S2 1.5127
21233 1st and 2nd Episodes, 18 to 19 Therapy Visits C2F3S3 1.7368
21311 1st and 2nd Episodes, 14 to 15 Therapy Visits C3F1S1 1.9609
21312 1st and 2nd Episodes, 16 to 17 Therapy Visits C3F1S2 0.8616
21313 1st and 2nd Episodes, 18 to 19 Therapy Visits C3F1S3 1.0077
21321 1st and 2nd Episodes, 14 to 15 Therapy Visits C3F2S1 1.1539
21322 1st and 2nd Episodes, 16 to 17 Therapy Visits C3F2S2 1.3000
21323 1st and 2nd Episodes, 18 to 19 Therapy Visits C3F2S3 1.4462
21331 1st and 2nd Episodes, 14 to 15 Therapy Visits C3F3S1 1.5923
21332 1st and 2nd Episodes, 16 to 17 Therapy Visits C3F3S2 1.8135
21333 1st and 2nd Episodes, 18 to 19 Therapy Visits C3F3S3 2.0347
22111 3rd+ Episodes, 14 to 15 Therapy Visits C1F1S1 0.4805
22112 3rd+ Episodes, 16 to 17 Therapy Visits C1F1S2 0.6403
22113 3rd+ Episodes, 18 to 19 Therapy Visits C1F1S3 0.8001
22121 3rd+ Episodes, 14 to 15 Therapy Visits C1F2S1 0.9599
22122 3rd+ Episodes, 16 to 17 Therapy Visits C1F2S2 1.1197
22123 3rd+ Episodes, 18 to 19 Therapy Visits C1F2S3 1.2795
22131 3rd+ Episodes, 14 to 15 Therapy Visits C1F3S1 1.4633
22132 3rd+ Episodes, 16 to 17 Therapy Visits C1F3S2 1.6471
22133 3rd+ Episodes, 18 to 19 Therapy Visits C1F3S3 1.8309
22211 3rd+ Episodes, 14 to 15 Therapy Visits C2F1S1 0.5648
22212 3rd+ Episodes, 16 to 17 Therapy Visits C2F1S2 0.7109
22213 3rd+ Episodes, 18 to 19 Therapy Visits C2F1S3 0.8570
22221 3rd+ Episodes, 14 to 15 Therapy Visits C2F2S1 1.0031
22222 3rd+ Episodes, 16 to 17 Therapy Visits C2F2S2 1.1492
22223 3rd+ Episodes, 18 to 19 Therapy Visits C2F2S3 1.2952
22231 3rd+ Episodes, 14 to 15 Therapy Visits C2F3S1 1.4806
22232 3rd+ Episodes, 16 to 17 Therapy Visits C2F3S2 1.6659
22233 3rd+ Episodes, 18 to 19 Therapy Visits C2F3S3 1.8512
22311 3rd+ Episodes, 14 to 15 Therapy Visits C3F1S1 0.6114
22312 3rd+ Episodes, 16 to 17 Therapy Visits C3F1S2 0.7644
22313 3rd+ Episodes, 18 to 19 Therapy Visits C3F1S3 0.9173
22321 3rd+ Episodes, 14 to 15 Therapy Visits C3F2S1 1.0703
22322 3rd+ Episodes, 16 to 17 Therapy Visits C3F2S2 1.2232
22323 3rd+ Episodes, 18 to 19 Therapy Visits C3F2S3 1.3761
22331 3rd+ Episodes, 14 to 15 Therapy Visits C3F3S1 1.5581
22332 3rd+ Episodes, 16 to 17 Therapy Visits C3F3S2 1.7401
22333 3rd+ Episodes, 18 to 19 Therapy Visits C3F3S3 1.9222
30111 3rd+ Episodes, 0 to 5 Therapy Visits C1F1S1 0.4961
30112 3rd+ Episodes, 6 Therapy Visits C1F1S2 0.6700
30113 3rd+ Episodes, 7 to 9 Therapy Visits C1F1S3 0.8440
30114 3rd+ Episodes, 10 Therapy Visits C1F1S4 1.0180
30115 3rd+ Episodes, 11 to 13 Therapy Visits C1F1S5 1.1920
30121 3rd+ Episodes, 0 to 5 Therapy Visits C1F2S1 1.3660
30122 3rd+ Episodes, 6 Therapy Visits C1F2S2 1.5546
30123 3rd+ Episodes, 7 to 9 Therapy Visits C1F2S3 1.7433
30124 3rd+ Episodes, 10 Therapy Visits C1F2S4 1.9320
30125 3rd+ Episodes, 11 to 13 Therapy Visits C1F2S5 0.5803
30131 3rd+ Episodes, 0 to 5 Therapy Visits C1F3S1 0.7406
30132 3rd+ Episodes, 6 Therapy Visits C1F3S2 0.9009
30133 3rd+ Episodes, 7 to 9 Therapy Visits C1F3S3 1.0612
30134 3rd+ Episodes, 10 Therapy Visits C1F3S4 1.2214
30135 3rd+ Episodes, 11 to 13 Therapy Visits C1F3S5 1.3817
30211 3rd+ Episodes, 0 to 5 Therapy Visits C2F1S1 1.5719
30212 3rd+ Episodes, 6 Therapy Visits C2F1S2 1.7621
30213 3rd+ Episodes, 7 to 9 Therapy Visits C2F1S3 1.9523
30214 3rd+ Episodes, 10 Therapy Visits C2F1S4 0.6270
30215 3rd+ Episodes, 11 to 13 Therapy Visits C2F1S5 0.7941
30221 3rd+ Episodes, 0 to 5 Therapy Visits C2F2S1 0.9612
30222 3rd+ Episodes, 6 Therapy Visits C2F2S2 1.1284
30223 3rd+ Episodes, 7 to 9 Therapy Visits C2F2S3 1.2955
30224 3rd+ Episodes, 10 Therapy Visits C2F2S4 1.4626
30225 3rd+ Episodes, 11 to 13 Therapy Visits C2F2S5 1.6495
30231 3rd+ Episodes, 0 to 5 Therapy Visits C2F3S1 1.8364
30232 3rd+ Episodes, 6 Therapy Visits C2F3S2 2.0233
30233 3rd+ Episodes, 7 to 9 Therapy Visits C2F3S3 0.6211
30234 3rd+ Episodes, 10 Therapy Visits C2F3S4 0.8152
30235 3rd+ Episodes, 11 to 13 Therapy Visits C2F3S5 1.0093
30311 3rd+ Episodes, 0 to 5 Therapy Visits C3F1S1 1.2034
30312 3rd+ Episodes, 6 Therapy Visits C3F1S2 1.3975
30313 3rd+ Episodes, 7 to 9 Therapy Visits C3F1S3 1.5916
30314 3rd+ Episodes, 10 Therapy Visits C3F1S4 1.7826
30315 3rd+ Episodes, 11 to 13 Therapy Visits C3F1S5 1.9736
30321 3rd+ Episodes, 0 to 5 Therapy Visits C3F2S1 2.1647
30322 3rd+ Episodes, 6 Therapy Visits C3F2S2 0.7054
30323 3rd+ Episodes, 7 to 9 Therapy Visits C3F2S3 0.8858
30324 3rd+ Episodes, 10 Therapy Visits C3F2S4 1.0662
30325 3rd+ Episodes, 11 to 13 Therapy Visits C3F2S5 1.2466
30331 3rd+ Episodes, 0 to 5 Therapy Visits C3F3S1 1.4269
30332 3rd+ Episodes, 6 Therapy Visits C3F3S2 1.6073
30333 3rd+ Episodes, 7 to 9 Therapy Visits C3F3S3 1.7999
30334 3rd+ Episodes, 10 Therapy Visits C3F3S4 1.9924
30335 3rd+ Episodes, 11 to 13 Therapy Visits C3F3S5 2.1850
40111 All Episodes, 20+ Therapy Visits C1F1S1 0.7521
40121 All Episodes, 20+ Therapy Visits C1F2S1 0.9393
40131 All Episodes, 20+ Therapy Visits C1F3S1 1.1265
40211 All Episodes, 20+ Therapy Visits C2F1S1 1.3138
40221 All Episodes, 20+ Therapy Visits C2F2S1 1.5010
40231 All Episodes, 20+ Therapy Visits C2F3S1 1.6882
40311 All Episodes, 20+ Therapy Visits C3F1S1 1.8774
40321 All Episodes, 20+ Therapy Visits C3F2S1 2.0667
40331 All Episodes, 20+ Therapy Visits C3F3S1 2.2559

To ensure the changes to the HH PPS case-mix weights are implemented in a budget neutral manner, we would apply a case-mix budget neutrality factor to the CY 2016 national, standardized 60-day episode payment rate (see section III.B.1. of this proposed rule). The case-mix budget neutrality factor is calculated as the ratio of total payments when the CY 2016 HH PPS case-mix weights (developed using CY 2014 claims data) are applied to CY 2014 utilization (claims) data to total payments when CY 2015 HH PPS case-mix weights (developed using CY 2013 claims data) are applied to CY 2014 utilization data. This produces a case-mix budget neutrality factor for CY 2016 of 1.0141, based on CY 2014 claims data as of December 31, 2014.
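A schematic of the ratio described above, with an illustrative function name and placeholder inputs; the payment simulation itself is not shown.

    # Sketch of the case-mix weight budget neutrality factor: simulated aggregate payments on
    # CY 2014 claims under the CY 2016 weights divided by simulated aggregate payments on the
    # same claims under the CY 2015 weights.
    def case_mix_budget_neutrality_factor(payments_under_cy2016_weights,
                                          payments_under_cy2015_weights):
        return sum(payments_under_cy2016_weights) / sum(payments_under_cy2015_weights)

    # Per this proposed rule, the resulting factor for CY 2016 is 1.0141.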

2. Proposed Reduction to the National, Standardized 60-Day Episode Payment Rate To Account for Nominal Case-Mix Growth

Section 1895(b)(3)(B)(iv) of the Act gives the Secretary the authority to implement payment reductions for nominal case-mix growth (that is, case-mix growth unrelated to changes in patient acuity). Previously, we accounted for nominal case-mix growth through case-mix reductions implemented from 2008 through 2013 (76 FR 68528-68543). As stated in the 2013 final rule, the goal of the reductions for nominal case-mix growth is to better align payment with real changes in patient severity (77 FR 67077). Our analysis of data from CY 2000 through CY 2010 found that only 15.97 percent of the total case-mix change was real and 84.03 percent of total case-mix change was nominal (77 FR 41553). In the CY 2015 HH PPS final rule (79 FR 66032), we estimated that total case-mix increased by 2.76 percent between CY 2012 and CY 2013 and of that amount, we estimated that 2.32 percent was a result of nominal case-mix growth (2.76 − (2.76 × 0.1597)). However, for 2015, we did not implement a reduction to the 2015 national, standardized 60-day episode payment amount to account for nominal case-mix growth, but stated that we would continue to monitor case-mix growth and may consider proposing nominal case-mix reductions in the future. Since the publication of the CY 2015 HH PPS final rule (79 FR 66032), MedPAC reported on its assessment of the impact of the mandated rebasing adjustments on quality of and beneficiary access to home health care as required by section 3131(a) of the Affordable Care Act. As noted in section III.A.2 of this proposed rule, MedPAC concluded that quality of care and beneficiary access to care are unlikely to be negatively affected by the rebasing adjustments. We further estimate that case-mix increased by an additional 1.41 percent between CY 2013 and CY 2014 (as evidenced by the case-mix budget neutrality factor of 1.0141 described in section III.B.1 above). In applying the 15.97 percent estimate of real case-mix growth to the total estimated case-mix growth from CY 2013 to CY 2014 (1.41 percent), we estimate that case-mix increased by 1.18 percent (1.41 − (1.41 × 0.1597)) as a result of nominal case-mix growth (that is, case-mix growth unrelated to changes in patient acuity). Given the observed nominal case-mix growth of 2.32 percent in 2013 and 1.18 percent in 2014, the reduction to offset the nominal case-mix growth for these 2 years would be 3.41 percent (1 − 1/(1.0232 × 1.0118) = 0.0341).

We are proposing to implement this 3.41 percent reduction in equal increments over 2 years. Specifically, in addition to continuing our third year of implementation of the rebasing adjustments required under section 3131(a) of the Affordable Care Act, we are proposing to apply a 1.72 percent (1 − 1/(1.0232 × 1.0118)^(1/2) = 0.0172) reduction to the national, standardized 60-day episode payment rate each year for 2 years, CY 2016 and CY 2017, under the ongoing authority of section 1895(b)(3)(B)(iv) of the Act. These reductions would adjust the national, standardized 60-day episode payment rate to account for nominal case-mix growth between CY 2012 and CY 2014 built into the episode payment rate through the 2015 and 2016 budget neutrality factors. The reductions will result in Medicare paying more accurately for the delivery of home health services and are separate from the rebasing adjustments finalized in CY 2014 under section 1895(b)(3)(A)(iii) of the Act, which were calculated using CY 2012 claims and CY 2011 HHA cost report data (which was the most current, complete data at the time of the CY 2014 HH PPS proposed and final rules). We will continue to monitor case-mix growth and may consider whether to propose additional nominal case-mix reductions in future rulemaking.
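For reference, the arithmetic described in this section, restated as a short Python sketch using the values cited above:

    # Worked arithmetic for the nominal case-mix growth reduction described above.
    real_share = 0.1597                               # estimated share of case-mix growth that is real

    nominal_2013 = 2.76 * (1 - real_share)            # approximately 2.32 percent (CY 2012 to CY 2013)
    nominal_2014 = 1.41 * (1 - real_share)            # approximately 1.18 percent (CY 2013 to CY 2014)

    growth = 1.0232 * 1.0118                          # combined nominal case-mix growth factor
    total_reduction = 1 - 1 / growth                  # approximately 0.0341 (3.41 percent)
    per_year_reduction = 1 - 1 / growth ** 0.5        # approximately 0.0172 (1.72 percent in each of CY 2016 and CY 2017)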

We invite comments on the proposed reduction to the national, standardized 60-day episode payment amount of 1.72 percent in CY 2016 and 1.72 percent in CY 2017 to account for nominal case-mix growth from CY 2012 through CY 2014 and the associated changes in the regulations text at § 484.220.

C. CY 2016 Home Health Rate Update

1. CY 2016 Home Health Market Basket Update

Section 1895(b)(3)(B) of the Act requires that the standard prospective payment amounts for CY 2016 be increased by a factor equal to the applicable HH market basket update for those HHAs that submit quality data as required by the Secretary. The home health market basket was rebased and revised in CY 2013. A detailed description of how we derive the HHA market basket is available in the CY 2013 HH PPS final rule (77 FR 67080-67090).

Section 3401(e) of the Affordable Care Act, adding new section 1895(b)(3)(B)(vi) to the Act, requires that, in CY 2015 (and in subsequent calendar years), the market basket percentage under the HHA prospective payment system as described in section 1895(b)(3)(B) of the Act be annually adjusted by changes in economy-wide productivity. The statute defines the productivity adjustment, described in section 1886(b)(3)(B)(xi)(II) of the Act, to be equal to the 10-year moving average of change in annual economy-wide private nonfarm business multifactor productivity (MFP) (as projected by the Secretary for the 10-year period ending with the applicable fiscal year, calendar year, cost reporting period, or other annual period) (the “MFP adjustment”). The Bureau of Labor Statistics (BLS) is the agency that publishes the official measure of private nonfarm business MFP. Please see http://www.bls.gov/mfp to obtain the BLS historical published MFP data. We note that the proposed methodology for calculating and applying the MFP adjustment to the HHA payment update is similar to the methodology used in other Medicare provider payment systems as required by section 3401 of the Affordable Care Act.

Multifactor productivity is derived by subtracting the contribution of labor and capital input growth from output growth. The projections of the components of MFP are currently produced by IGI, a nationally recognized economic forecasting firm with which CMS contracts to forecast the components of the market basket and MFP. As described in the CY 2015 HH PPS proposed rule (79 FR 38384 through 38386), in order to generate a forecast of MFP, IGI replicated the MFP measure calculated by the BLS using a series of proxy variables derived from IGI's U.S. macroeconomic models. In the CY 2015 HH PPS proposed rule, we identified each of the major MFP component series employed by the BLS to measure MFP as well as provided the corresponding concepts determined to be the best available proxies for the BLS series.

Beginning with the CY 2016 rulemaking cycle, the MFP adjustment is calculated using a revised series developed by IGI to proxy the aggregate capital inputs. Specifically, IGI has replaced the Real Effective Capital Stock used for Full Employment GDP with a forecast of BLS aggregate capital inputs recently developed by IGI using a regression model. This series provides a better fit to the BLS capital inputs as measured by the differences between the actual BLS capital input growth rates and the estimated model growth rates over the historical time period. Therefore, we are using IGI's most recent forecast of the BLS capital inputs series in the MFP calculations beginning with the CY 2016 rulemaking cycle. A complete description of the MFP projection methodology is available on our Web site at http://www.cms.gov/Research-Statistics-Data-and-Systems/Statistics-Trends-and-Reports/MedicareProgramRatesStats/MarketBasketResearch.html. Although we discuss the IGI changes to the MFP proxy series in this proposed rule, in the future, when IGI makes changes to the MFP methodology, we will announce them on our Web site rather than in the annual rulemaking.

Using IGI's first quarter 2015 forecast, the MFP adjustment for CY 2016 (the 10-year moving average of MFP for the period ending CY 2016) is projected to be 0.6 percent. Thus, in accordance with section 1895(b)(3)(B)(iii) of the Act, we propose to base the CY 2016 market basket update, which is used to determine the applicable percentage increase for the HH payments, on the most recent estimate of the proposed 2010-based HH market basket (currently estimated to be 2.9 percent based on IGI's first quarter 2015 forecast). We propose to then reduce this percentage increase by the current estimate of the MFP adjustment for CY 2016 of 0.6 percentage point (the 10-year moving average of MFP for the period ending CY 2016 based on IGI's first quarter 2015 forecast), in accordance with section 1895(b)(3)(B)(vi) of the Act. Therefore, the current estimate of the CY 2016 HH update is 2.3 percent (2.9 percent market basket update, less 0.6 percentage point MFP adjustment). Furthermore, we note that if more recent data are subsequently available (for example, a more recent estimate of the market basket and MFP adjustment), we would use such data to determine the CY 2016 market basket update and MFP adjustment in the final rule.

Section 1895(b)(3)(B) of the Act requires that the home health update be decreased by 2 percentage points for those HHAs that do not submit quality data as required by the Secretary. For HHAs that do not submit the required quality data for CY 2016, the home health update would be 0.3 percent (2.3 percent minus 2 percentage points).
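For reference, the update arithmetic described above, restated as a short Python sketch:

    # CY 2016 HH payment update arithmetic.
    market_basket_update = 2.9      # proposed 2010-based HH market basket update, in percent
    mfp_adjustment = 0.6            # projected MFP adjustment, in percentage points

    update_with_quality_data = market_basket_update - mfp_adjustment      # 2.3 percent
    update_without_quality_data = update_with_quality_data - 2.0          # 0.3 percent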

2. CY 2016 Home Health Wage Index

a. Background

Sections 1895(b)(4)(A)(ii) and (b)(4)(C) of the Act require the Secretary to provide appropriate adjustments to the proportion of the payment amount under the HH PPS that account for area wage differences, using adjustment factors that reflect the relative level of wages and wage-related costs applicable to the furnishing of HH services. Since the inception of the HH PPS, we have used inpatient hospital wage data in developing a wage index to be applied to HH payments. We propose to continue this practice for CY 2016, as we continue to believe that, in the absence of HH-specific wage data, using inpatient hospital wage data is appropriate and reasonable for the HH PPS. Specifically, we propose to continue to use the pre-floor, pre-reclassified hospital wage index as the wage adjustment to the labor portion of the HH PPS rates. For CY 2016, the updated wage data are for hospital cost reporting periods beginning on or after October 1, 2011 and before October 1, 2012 (FY 2012 cost report data).

We would apply the appropriate wage index value to the labor portion of the HH PPS rates based on the site of service for the beneficiary (defined by section 1861(m) of the Act as the beneficiary's place of residence). Previously, we determined each HHA's labor market area based on definitions of metropolitan statistical areas (MSAs) issued by the Office of Management and Budget (OMB). In the CY 2006 HH PPS final rule (70 FR 68132), we adopted revised labor market area definitions as discussed in the OMB Bulletin No. 03-04 (June 6, 2003). This bulletin announced revised definitions for MSAs and the creation of micropolitan statistical areas and core-based statistical areas (CBSAs). The bulletin is available online at www.whitehouse.gov/omb/bulletins/b03-04.html. In adopting the CBSA geographic designations, we provided a one-year transition in CY 2006 with a blended wage index for all sites of service. For CY 2006, the wage index for each geographic area consisted of a blend of 50 percent of the CY 2006 MSA-based wage index and 50 percent of the CY 2006 CBSA-based wage index. We referred to the blended wage index as the CY 2006 HH PPS transition wage index. As discussed in the CY 2006 HH PPS final rule (70 FR 68132), since the expiration of this one-year transition on December 31, 2006, we have used the full CBSA-based wage index values.

In this proposed rule, we propose to continue to use the same methodology discussed in the CY 2007 HH PPS final rule (71 FR 65884) to address those geographic areas in which there are no inpatient hospitals, and thus, no hospital wage data on which to base the calculation of the CY 2016 HH PPS wage index. For rural areas that do not have inpatient hospitals, we would use the average wage index from all contiguous CBSAs as a reasonable proxy. For CY 2016, there are no rural geographic areas without hospitals for which we would apply this policy. For rural Puerto Rico, we would not apply this methodology due to the distinct economic circumstances that exist there (for example, due to the close proximity to one another of almost all of Puerto Rico's various urban and non-urban areas, this methodology would produce a wage index for rural Puerto Rico that is higher than that in half of its urban areas). Instead, we would continue to use the most recent wage index previously available for that area. For urban areas without inpatient hospitals, we would use the average wage index of all urban areas within the state as a reasonable proxy for the wage index for that CBSA. For CY 2016, the only urban area without inpatient hospital wage data is Hinesville, GA (CBSA 25980).

b. Update

On February 28, 2013, OMB issued Bulletin No. 13-01, announcing revisions to the delineations of MSAs, Micropolitan Statistical Areas, and CBSAs, and guidance on uses of the delineation of these areas. This bulletin is available online at http://www.whitehouse.gov/sites/default/files/omb/bulletins/2013/b-13-01.pdf. This bulletin states that it “provides the delineations of all Metropolitan Statistical Areas, Metropolitan Divisions, Micropolitan Statistical Areas, Combined Statistical Areas, and New England City and Town Areas in the United States and Puerto Rico based on the standards published on June 28, 2010, in the Federal Register (75 FR 37246-37252) and Census Bureau data.”

While the revisions OMB published on February 28, 2013 are not as sweeping as the changes made when we adopted the CBSA geographic designations for CY 2006, the February 28, 2013 bulletin does contain a number of significant changes. For example, there are new CBSAs, urban counties that have become rural, rural counties that have become urban, and existing CBSAs that have been split apart.

In the CY 2015 HH PPS final rule (79 FR 66085 through 66087), we finalized changes to the HH PPS wage index based on the newest OMB delineations, as described in OMB Bulletin No. 13-01, beginning in CY 2015, including a one-year transition with a blended wage index for CY 2015. Because the one-year transition period expires at the end of CY 2015, the proposed HH PPS wage index for CY 2016 is fully based on the revised OMB delineations adopted in CY 2015. The proposed CY 2016 wage index is available on the CMS Web site at http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/HomeHealthPPS/Home-Health-Prospective-Payment-System-Regulations-and-Notices.html.

3. CY 2016 Annual Payment Update

a. Background

The Medicare HH PPS has been in effect since October 1, 2000. As set forth in the July 3, 2000 final rule (65 FR 41128), the base unit of payment under the Medicare HH PPS is a national, standardized 60-day episode payment rate. As set forth in 42 CFR 484.220, we adjust the national, standardized 60-day episode payment rate by a case-mix relative weight and a wage index value based on the site of service for the beneficiary.

To provide appropriate adjustments to the proportion of the payment amount under the HH PPS to account for area wage differences, we apply the appropriate wage index value to the labor portion of the HH PPS rates. The labor-related share of the case-mix adjusted 60-day episode rate would continue to be 78.535 percent and the non-labor-related share would continue to be 21.465 percent as set out in the CY 2013 HH PPS final rule (77 FR 67068). The CY 2016 HH PPS rates would use the same case-mix methodology as set forth in the CY 2008 HH PPS final rule with comment period (72 FR 49762) and would be adjusted as described in section III.C. of this rule. The following are the steps we take to compute the case-mix and wage-adjusted 60-day episode rate:

1. Multiply the national 60-day episode rate by the patient's applicable case-mix weight.

2. Divide the case-mix adjusted amount into a labor (78.535 percent) and a non-labor portion (21.465 percent).

3. Multiply the labor portion by the applicable wage index based on the site of service of the beneficiary.

4. Add the wage-adjusted portion to the non-labor portion, yielding the case-mix and wage adjusted 60-day episode rate, subject to any additional applicable adjustments.
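For illustration only, the four steps above restated as a short Python sketch; the case-mix weight and wage index values in the example are hypothetical.

    # Illustrative walk-through of the four steps above (inputs are hypothetical).
    def case_mix_wage_adjusted_rate(national_rate, case_mix_weight, wage_index,
                                    labor_share=0.78535, nonlabor_share=0.21465):
        adjusted = national_rate * case_mix_weight        # step 1: apply the case-mix weight
        labor = adjusted * labor_share                    # step 2: split into a labor portion
        nonlabor = adjusted * nonlabor_share              #         and a non-labor portion
        labor = labor * wage_index                        # step 3: wage-adjust the labor portion
        return labor + nonlabor                           # step 4: recombine

    # Example with an illustrative case-mix weight of 1.0000 and wage index of 0.9500.
    print(round(case_mix_wage_adjusted_rate(2938.37, 1.0000, 0.9500), 2))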

In accordance with section 1895(b)(3)(B) of the Act, this document constitutes the annual update of the HH PPS rates. Section 484.225 sets forth the specific annual percentage update methodology. In accordance with § 484.225(i), for a HHA that does not submit HH quality data, as specified by the Secretary, the unadjusted national prospective 60-day episode rate is equal to the rate for the previous calendar year increased by the applicable HH market basket index amount minus two percentage points. Any reduction of the percentage change would apply only to the calendar year involved and would not be considered in computing the prospective payment amount for a subsequent calendar year.

Medicare pays the national, standardized 60-day case-mix and wage-adjusted episode payment on a split percentage payment approach. The split percentage payment approach includes an initial percentage payment and a final percentage payment as set forth in § 484.205(b)(1) and (b)(2). We may base the initial percentage payment on the submission of a request for anticipated payment (RAP) and the final percentage payment on the submission of the claim for the episode, as discussed in § 409.43. The claim for the episode that the HHA submits for the final percentage payment determines the total payment amount for the episode and whether we make an applicable adjustment to the 60-day case-mix and wage-adjusted episode payment. The end date of the 60-day episode as reported on the claim determines which calendar year rates Medicare would use to pay the claim.

We may also adjust the 60-day case-mix and wage-adjusted episode payment based on the information submitted on the claim to reflect the following:

  • A low-utilization payment adjustment (LUPA) is provided on a per-visit basis as set forth in § 484.205(c) and § 484.230.
  • A partial episode payment (PEP) adjustment as set forth in § 484.205(d) and § 484.235.
  • An outlier payment as set forth in § 484.205(e) and § 484.240.

b. Proposed CY 2016 National, Standardized 60-Day Episode Payment Rate

Section 1895(b)(3)(A)(i) of the Act required that the 60-day episode base rate and other applicable amounts be standardized in a manner that eliminates the effects of variations in relative case mix and area wage adjustments among different home health agencies in a budget neutral manner. To determine the CY 2016 national, standardized 60-day episode payment rate, we would apply a wage index standardization factor, a case-mix budget neutrality factor described in section III.B.1, a nominal case-mix growth adjustment described in section III.B.2, the rebasing adjustment described in section II.C, and the MFP-adjusted home health market basket update discussed in section III.C.1 of this proposed rule.

To calculate the wage index standardization factor, henceforth referred to as the wage index budget neutrality factor, we simulated total payments for non-LUPA episodes using the 2016 wage index and compared it to our simulation of total payments for non-LUPA episodes using the 2015 wage index. By dividing the total payments for non-LUPA episodes using the 2016 wage index by the total payments for non-LUPA episodes using the 2015 wage index, we obtain a wage index budget neutrality factor of 1.0006. We would apply the wage index budget neutrality factor of 1.0006 to the CY 2016 national, standardized 60-day episode rate.

As discussed in section III.B.1 of this proposed rule, to ensure the changes to the case-mix weights are implemented in a budget neutral manner, we would apply a case-mix weight budget neutrality factor to the CY 2016 national, standardized 60-day episode payment rate. The case-mix weight budget neutrality factor is calculated as the ratio of total payments when CY 2016 case-mix weights are applied to CY 2014 utilization (claims) data to total payments when CY 2015 case-mix weights are applied to CY 2014 utilization data. The case-mix budget neutrality factor for CY 2016 would be 1.0141 as described in section III.B.1 of this proposed rule.

Next, as discussed in section III.B.2 of this proposed rule, we would apply a reduction of 1.72 percent to the national, standardized 60-day episode payment rate in CY 2016 to account for nominal case-mix growth between CY 2012 and CY 2014. Then, we would apply the −$80.95 rebasing adjustment finalized in the CY 2014 HH PPS final rule (78 FR 72256) and discussed in section II.C. Lastly, we would update the payment rates by the CY 2016 HH payment update percentage of 2.3 percent (MFP-adjusted home health market basket update) as described in section III.C.1 of this proposed rule. The CY 2016 national, standardized 60-day episode payment rate is calculated in Table 10.

Table 10—CY 2016 National, Standardized 60-Day Episode Payment Amount

CY 2015 National, standardized 60-day episode payment Wage index budget neutrality factor Case-mix weights budget neutrality factor Nominal case-mix growth adjustment (1−0.0172) CY 2016 Rebasing adjustment CY 2016 HH Payment update percentage CY 2016 National, standardized 60-day episode payment
$2,961.38 × 1.0006 × 1.0141 × 0.9828 −$80.95 × 1.023 $2,938.37

The CY 2016 national, standardized 60-day episode payment rate for an HHA that does not submit the required quality data is updated by the CY 2016 HH payment update (2.3 percent) minus 2 percentage points and is shown in Table 11.

Table 11—For HHAs That Do Not Submit the Quality Data—CY 2016 National, Standardized 60-Day Episode Payment Amount

CY 2015 National, standardized 60-day episode payment Wage index budget neutrality factor Case-mix weights budget neutrality factor Nominal case-mix growth adjustment (1−0.0172) CY 2016 Rebasing adjustment CY 2016 HH Payment update percentage minus 2 percentage points CY 2016 National, standardized 60-day episode payment
$2,961.38 × 1.0006 × 1.0141 × 0.9828 −$80.95 × 1.003 $2,880.92
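For reference, a short Python sketch that reproduces the figures in Tables 10 and 11 from the factors described above; small differences can arise from intermediate rounding conventions.

    # Reproduce the CY 2016 national, standardized 60-day episode rates in Tables 10 and 11.
    cy2015_rate = 2961.38

    base = cy2015_rate * 1.0006 * 1.0141 * 0.9828 - 80.95   # budget neutrality factors,
                                                            # nominal case-mix reduction, rebasing
    with_quality_data = round(base * 1.023, 2)      # 2.3 percent update -> 2,938.37 (Table 10)
    without_quality_data = round(base * 1.003, 2)   # 0.3 percent update -> 2,880.92 (Table 11)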

c. CY 2016 National Per-Visit Rates

The national per-visit rates are used to pay LUPAs (episodes with four or fewer visits) and are also used to compute imputed costs in outlier calculations. The per-visit rates are paid by type of visit or HH discipline. The six HH disciplines are as follows:

  • Home health aide (HH aide);
  • Medical Social Services (MSS);
  • Occupational therapy (OT);
  • Physical therapy (PT);
  • Skilled nursing (SN); and
  • Speech-language pathology (SLP).

To calculate the CY 2016 national per-visit rates, we start with the CY 2015 national per-visit rates. We then apply a wage index budget neutrality factor to ensure budget neutrality for LUPA per-visit payments and increase each of the six per-visit rates by the maximum rebasing adjustments described in section II.C. of this rule. We calculate the wage index budget neutrality factor by simulating total payments for LUPA episodes using the 2016 wage index and comparing it to simulated total payments for LUPA episodes using the 2015 wage index. By dividing the total payments for LUPA episodes using the 2016 wage index by the total payments for LUPA episodes using the 2015 wage index, we obtain a wage index budget neutrality factor of 1.0006. We would apply the wage index budget neutrality factor of 1.0006 to the CY 2016 national per-visit rates.

The LUPA per-visit rates are not calculated using case-mix weights. Therefore, there is no case-mix weights budget neutrality factor needed to ensure budget neutrality for LUPA payments. Finally, the per-visit rates for each discipline are updated by the CY 2016 HH payment update percentage of 2.3 percent. The national per-visit rates are adjusted by the wage index based on the site of service of the beneficiary. The per-visit payments for LUPAs are separate from the LUPA add-on payment amount, which is paid for episodes that occur as the only episode or initial episode in a sequence of adjacent episodes. The CY 2016 national per-visit rates are shown in Tables 12 and 13.

Table 12—CY 2016 National Per-Visit Payment Amounts for HHAs That DO Submit the Required Quality Data

HH Discipline type CY 2015 Per-visit payment Wage index budget neutrality factor CY 2016 Rebasing adjustment CY 2016 HH Payment update percentage CY 2016 Per-visit payment
Home Health Aide $57.89 × 1.0006 + $1.79 × 1.023 $61.09
Medical Social Services 204.91 × 1.0006 + 6.34 × 1.023 216.23
Occupational Therapy 140.70 × 1.0006 + 4.35 × 1.023 148.47
Physical Therapy 139.75 × 1.0006 + 4.32 × 1.023 147.47
Skilled Nursing 127.83 × 1.0006 + 3.96 × 1.023 134.90
Speech-Language Pathology 151.88 × 1.0006 + 4.70 × 1.023 160.27

The CY 2016 per-visit payment rates for an HHA that does not submit the required quality data are updated by the CY 2016 HH payment update (2.3 percent) minus 2 percentage points and are shown in Table 13.

Table 13—CY 2016 National Per-Visit Payment Amounts for HHAs That DO NOT Submit the Required Quality Data

HH Discipline type CY 2015 Per-visit rates Wage index budget neutrality factor CY 2016 Rebasing adjustment CY 2016 HH Payment update percentage minus 2 percentage points CY 2016 Per-visit rates
Home Health Aide $57.89 × 1.0006 + $1.79 × 1.003 $59.89
Medical Social Services 204.91 × 1.0006 + 6.34 × 1.003 212.01
Occupational Therapy 140.70 × 1.0006 + 4.35 × 1.003 145.57
Physical Therapy 139.75 × 1.0006 + 4.32 × 1.003 144.59
Skilled Nursing 127.83 × 1.0006 + 3.96 × 1.003 132.26
Speech-Language Pathology 151.88 × 1.0006 + 4.70 × 1.003 157.14
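For reference, a short Python sketch that reproduces the per-visit figures in Tables 12 and 13 from the CY 2015 rates, the wage index budget neutrality factor, the rebasing add-ons, and the applicable payment update:

    # Reproduce the CY 2016 per-visit rates in Tables 12 and 13.
    cy2015_rates = {"HH aide": 57.89, "MSS": 204.91, "OT": 140.70,
                    "PT": 139.75, "SN": 127.83, "SLP": 151.88}
    rebasing_add_ons = {"HH aide": 1.79, "MSS": 6.34, "OT": 4.35,
                        "PT": 4.32, "SN": 3.96, "SLP": 4.70}

    for discipline, rate in cy2015_rates.items():
        adjusted = rate * 1.0006 + rebasing_add_ons[discipline]
        print(discipline,
              round(adjusted * 1.023, 2),   # HHAs that submit quality data (Table 12)
              round(adjusted * 1.003, 2))   # HHAs that do not submit quality data (Table 13)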

d. Low-Utilization Payment Adjustment (LUPA) Add-On Factors

LUPA episodes that occur as the only episode or as an initial episode in a sequence of adjacent episodes are adjusted by applying an additional amount to the LUPA payment before adjusting for area wage differences. In the CY 2014 HH PPS final rule, we changed the methodology for calculating the LUPA add-on amount by finalizing the use of three LUPA add-on factors: 1.8451 for SN; 1.6700 for PT; and 1.6266 for SLP (78 FR 72306). We multiply the per-visit payment amount for the first SN, PT, or SLP visit in LUPA episodes that occur as the only episode or an initial episode in a sequence of adjacent episodes by the appropriate factor to determine the LUPA add-on payment amount. For example, for such an episode in which the first skilled visit is SN, the payment for that visit would be $248.90 (1.8451 multiplied by $134.90), subject to area wage adjustment.
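For reference, the example above restated as a short Python sketch:

    # LUPA add-on for episodes that occur as the only episode or as an initial episode in a
    # sequence of adjacent episodes: the first SN, PT, or SLP visit is multiplied by its factor.
    LUPA_ADD_ON_FACTORS = {"SN": 1.8451, "PT": 1.6700, "SLP": 1.6266}

    first_sn_visit_payment = round(LUPA_ADD_ON_FACTORS["SN"] * 134.90, 2)   # -> 248.90, before wage adjustment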

e. CY 2016 Non-Routine Medical Supply (NRS) Payment Rates

Payments for NRS are computed by multiplying the relative weight for a particular severity level by the NRS conversion factor. To determine the CY 2016 NRS conversion factor, we start with the 2015 NRS conversion factor ($53.23) and apply the −2.82 percent rebasing adjustment described in section II.C. of this rule (1−0.0282 = 0.9718). We then update the conversion factor by the CY 2016 HH payment update percentage (2.3 percent). We do not apply a standardization factor as the NRS payment amount calculated from the conversion factor is not wage or case-mix adjusted when the final claim payment amount is computed. The NRS conversion factor for CY 2016 is shown in Table 14.

Table 14—CY 2016 NRS Conversion Factor for HHAs That DO Submit the Required Quality Data

CY 2015 NRS conversion factor CY 2016 Rebasing adjustment CY 2016 HH Payment update percentage CY 2016 NRS conversion factor
$53.23 × 0.9718 × 1.023 $52.92

Using the CY 2016 NRS conversion factor, the payment amounts for the six severity levels are shown in Table 15.

Table 15—CY 2016 NRS Payment Amounts for HHAs That DO Submit the Required Quality Data

Severity level Points (scoring) Relative weight CY 2016 NRS Payment amounts
1 0 0.2698 $14.28
2 1 to 14 0.9742 51.55
3 15 to 27 2.6712 141.36
4 28 to 48 3.9686 210.02
5 49 to 98 6.1198 323.86
6 99+ 10.5254 557.00
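For reference, a short Python sketch that reproduces the conversion factor and severity-level amounts in Tables 14 and 15:

    # Reproduce the CY 2016 NRS conversion factor and severity-level amounts.
    nrs_2016 = round(53.23 * 0.9718 * 1.023, 2)      # rebasing, then 2.3 percent update -> 52.92

    relative_weights = {1: 0.2698, 2: 0.9742, 3: 2.6712, 4: 3.9686, 5: 6.1198, 6: 10.5254}
    payment_amounts = {level: round(nrs_2016 * weight, 2)
                       for level, weight in relative_weights.items()}
    # Substituting 1.003 for 1.023 reproduces the non-submitter amounts in Tables 16 and 17.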

For HHAs that do not submit the required quality data, we again begin with the CY 2015 NRS conversion factor ($53.23) and apply the −2.82 percent rebasing adjustment discussed in section II.C of this proposed rule (1−0.0282= 0.9718). We then update the NRS conversion factor by the CY 2016 HH payment update percentage (2.3 percent) minus 2 percentage points. The CY 2016 NRS conversion factor for HHAs that do not submit quality data is shown in Table 16.

Table 16—CY 2016 NRS Conversion Factor for HHAs That DO NOT Submit the Required Quality Data

CY 2015 NRS Conversion factor CY 2016 Rebasing adjustment CY 2016 HH Payment update percentage minus 2 percentage points CY 2016 NRS Conversion factor
$53.23 × 0.9718 × 1.003 $51.88

The payment amounts for the various severity levels based on the updated conversion factor for HHAs that do not submit quality data are calculated in Table 17.

Table 17—CY 2016 NRS Payment Amounts for HHAs That DO NOT Submit the Required Quality Data

Severity level Points (scoring) Relative weight CY 2016 NRS Payment amounts
1 0 0.2698 $14.00
2 1 to 14 0.9742 50.54
3 15 to 27 2.6712 138.58
4 28 to 48 3.9686 205.89
5 49 to 98 6.1198 317.50
6 99+ 10.5254 546.06

f. Rural Add-On

Section 421(a) of the MMA required, for HH services furnished in a rural area (as defined in section 1886(d)(2)(D) of the Act), for episodes or visits ending on or after April 1, 2004, and before April 1, 2005, that the Secretary increase the payment amount that otherwise would have been made under section 1895 of the Act for the services by 5 percent.

Section 5201 of the DRA amended section 421(a) of the MMA. The amended section 421(a) of the MMA required, for HH services furnished in a rural area (as defined in section 1886(d)(2)(D) of the Act), on or after January 1, 2006 and before January 1, 2007, that the Secretary increase the payment amount otherwise made under section 1895 of the Act for those services by 5 percent.

Section 3131(c) of the Affordable Care Act amended section 421(a) of the MMA to provide an increase of 3 percent of the payment amount otherwise made under section 1895 of the Act for HH services furnished in a rural area (as defined in section 1886(d)(2)(D) of the Act), for episodes and visits ending on or after April 1, 2010, and before January 1, 2016.

Section 210 of the Medicare Access and CHIP Reauthorization Act of 2015 (MACRA) (Pub. L. 114-10) amended section 421(a) of the MMA to extend the rural add-on by providing an increase of 3 percent of the payment amount otherwise made under section 1895 of the Act for HH services provided in a rural area (as defined in section 1886(d)(2)(D) of the Act), for episodes and visits ending before January 1, 2018.

Section 421 of the MMA, as amended, waives budget neutrality related to this provision, as the statute specifically states that the Secretary shall not reduce the standard prospective payment amount (or amounts) under section 1895 of the Act applicable to HH services furnished during a period to offset the increase in payments resulting from the application of this section of the statute.

For CY 2016, home health payment rates for services provided to beneficiaries in areas that are defined as rural under the OMB delineations would be increased by 3 percent as mandated by section 210 of the MACRA. The 3 percent rural add-on is applied to the national, standardized 60-day episode payment rate, national per visit rates, and NRS conversion factor when HH services are provided in rural (non-CBSA) areas. Refer to Tables 18 through 21 for these payment rates.
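For reference, a short Python sketch of the 3 percent rural add-on applied to each payment component, reproducing selected figures from Tables 18 through 20:

    # The 3 percent rural add-on applied to each payment component.
    RURAL_ADD_ON = 1.03

    rural_episode_rate = round(2938.37 * RURAL_ADD_ON, 2)   # -> 3,026.52 (Table 18)
    rural_sn_per_visit = round(134.90 * RURAL_ADD_ON, 2)    # -> 138.95 (Table 19)
    rural_nrs_factor = round(52.92 * RURAL_ADD_ON, 2)       # -> 54.51 (Table 20)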

Table 18—CY 2016 Payment Amounts for 60-Day Episodes for Services Provided in a Rural Area

For HHAs that DO submit quality data For HHAs that DO NOT submit quality data
CY 2016 National, standardized 60-day episode payment rate Multiply by the 3 percent rural add-on CY 2016 Rural national, standardized 60-day episode payment rate CY 2016 National, standardized 60-day episode payment rate Multiply by the 3 percent rural add-on CY 2016 Rural national, standardized 60-day episode payment rate
$2,938.37 × 1.03 $3,026.52 $2,880.92 × 1.03 $2,967.35

Table 19—CY 2016 Per-Visit Amounts for Services Provided in a Rural Area

HH Discipline type For HHAs that DO submit quality data For HHAs that DO NOT submit quality data
CY 2016 Per-visit rate Multiply by the 3 percent rural add-on CY 2016 Rural per-visit rates CY 2016 Per-visit rate Multiply by the 3 percent rural add-on CY 2016 Rural per-visit rates
HH Aide $61.09 × 1.03 $62.92 $59.89 × 1.03 $61.69
MSS 216.23 × 1.03 222.72 212.01 × 1.03 218.37
OT 148.47 × 1.03 152.92 145.57 × 1.03 149.94
PT 147.47 × 1.03 151.89 144.59 × 1.03 148.93
SN 134.90 × 1.03 138.95 132.26 × 1.03 136.23
SLP 160.27 × 1.03 165.08 157.14 × 1.03 161.85

Table 20—CY 2016 NRS Conversion Factor for Services Provided in Rural Areas

For HHAs that DO submit quality data For HHAs that DO NOT submit quality data
CY 2016 Conversion factor Multiply by the 3 percent rural add-on CY 2016 Rural NRS conversion factor CY 2016 Conversion factor Multiply by the 3 percent rural add-on CY 2016 Rural NRS conversion factor
$52.92 × 1.03 $54.51 $51.88 × 1.03 $53.44

Table 21—CY 2016 NRS Payment Amounts for Services Provided in Rural Areas

Severity level Points (scoring) For HHAs that DO submit quality data (CY 2016 NRS conversion factor = $54.51) For HHAs that DO NOT submit quality data (CY 2016 NRS conversion factor = $53.44)
Relative weight CY 2016 NRS Payment amounts for rural areas Relative weight CY 2016 NRS Payment amounts for rural areas
1 0 0.2698 $14.71 0.2698 $14.42
2 1 to 14 0.9742 53.10 0.9742 52.06
3 15 to 27 2.6712 145.61 2.6712 142.75
4 28 to 48 3.9686 216.33 3.9686 212.08
5 49 to 98 6.1198 333.59 6.1198 327.04
6 99+ 10.5254 573.74 10.5254 562.48

D. Payments for High-Cost Outliers Under the HH PPS

1. Background

Section 1895(b)(5) of the Act allows for the provision of an addition or adjustment to the national, standardized 60-day case-mix and wage-adjusted episode payment amounts in the case of episodes that incur unusually high costs due to patient care needs. Prior to the enactment of the Affordable Care Act, section 1895(b)(5) of the Act stipulated that projected total outlier payments could not exceed 5 percent of total projected or estimated HH payments in a given year. In the July 3, 2000 Medicare Program; Prospective Payment System for Home Health Agencies final rule (65 FR 41188 through 41190), we described the method for determining outlier payments. Under this system, outlier payments are made for episodes whose estimated costs exceed a threshold amount for each HH Resource Group (HHRG). The episode's estimated cost is the sum of the national wage-adjusted per-visit payment amounts for all visits delivered during the episode. The outlier threshold for each case-mix group or Partial Episode Payment (PEP) adjustment is defined as the 60-day episode payment or PEP adjustment for that group plus a fixed-dollar loss (FDL) amount. The outlier payment is defined to be a proportion of the wage-adjusted estimated cost beyond the wage-adjusted threshold. The threshold amount is the sum of the wage and case-mix adjusted PPS episode amount and wage-adjusted FDL amount. The proportion of additional costs over the outlier threshold amount paid as outlier payments is referred to as the loss-sharing ratio.
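For illustration only, the outlier mechanics described above restated as a short Python sketch; the function name and all inputs are hypothetical.

    # Sketch of the outlier calculation described above (all inputs are hypothetical).
    def outlier_payment(imputed_cost, case_mix_wage_adjusted_payment,
                        wage_adjusted_fdl_amount, loss_sharing_ratio):
        # Imputed cost: sum of the wage-adjusted per-visit amounts for all visits in the episode.
        # Threshold: the episode payment (or PEP adjustment) plus the wage-adjusted FDL amount.
        threshold = case_mix_wage_adjusted_payment + wage_adjusted_fdl_amount
        return max(0.0, loss_sharing_ratio * (imputed_cost - threshold))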

In the CY 2010 HH PPS final rule (74 FR 58080 through 58087), we discussed excessive growth in outlier payments, primarily the result of unusually high outlier payments in a few areas of the country. Despite program integrity efforts associated with excessive outlier payments in targeted areas of the country, we discovered that outlier expenditures still exceeded the 5 percent target and, in the absence of corrective measures, would continue to do so. Consequently, we assessed the appropriateness of taking action to curb outlier abuse. To mitigate possible billing vulnerabilities associated with excessive outlier payments and adhere to our statutory limit on outlier payments, we adopted an outlier policy that included a 10 percent agency-level cap on outlier payments. This cap was implemented in concert with a reduced FDL ratio of 0.67. These policies resulted in a projected target outlier pool of approximately 2.5 percent (the previous outlier pool was 5 percent of total HH expenditures). For CY 2010, we first returned the 5 percent held for the previous target outlier pool to the national, standardized 60-day episode rates, the national per-visit rates, the LUPA add-on payment amount, and the NRS conversion factor. Then, we reduced the CY 2010 rates by 2.5 percent to account for the new outlier pool of 2.5 percent. This outlier policy was adopted for CY 2010 only.

As we noted in the CY 2011 HH PPS final rule (75 FR 70397 through 70399), section 3131(b)(1) of the Affordable Care Act amended section 1895(b)(3)(C) of the Act, and requires the Secretary to reduce the HH PPS payment rates such that aggregate HH PPS payments are reduced by 5 percent. In addition, section 3131(b)(2) of the Affordable Care Act amended section 1895(b)(5) of the Act by re-designating the existing language as section 1895(b)(5)(A) of the Act, and revising it to state that the Secretary may provide for an addition or adjustment to the payment amount for outlier episodes because of their unusual variation in the type or amount of medically necessary care. The total amount of the additional payments or payment adjustments for outlier episodes may not exceed 2.5 percent of the estimated total HH PPS payments for that year and outlier payments as a percent of total payments are capped for each HHA at 10 percent.

As such, beginning in CY 2011, our HH PPS outlier policy is that we reduce payment rates by 5 percent and target up to 2.5 percent of total estimated HH PPS payments to be paid as outliers. To do so, we first returned the 2.5 percent held for the target CY 2010 outlier pool to the national, standardized 60-day episode rates, the national per-visit rates, the LUPA add-on payment amount, and the NRS conversion factor for CY 2010. We then reduced the rates by 5 percent as required by section 1895(b)(3)(C) of the Act, as amended by section 3131(b)(1) of the Affordable Care Act. For CY 2011 and subsequent calendar years, we target up to 2.5 percent of estimated total payments to be paid as outlier payments and apply a 10 percent agency-level outlier cap.

2. Fixed Dollar Loss (FDL) Ratio and Loss-Sharing Ratio

For a given level of outlier payments, there is a trade-off between the values selected for the FDL ratio and the loss-sharing ratio. A high FDL ratio reduces the number of episodes that can receive outlier payments, but makes it possible to select a higher loss-sharing ratio, and therefore, increase outlier payments for qualifying outlier episodes. Alternatively, a lower FDL ratio means that more episodes can qualify for outlier payments, but outlier payments per episode must then be lower.

The FDL ratio and the loss-sharing ratio must be selected so that the estimated total outlier payments do not exceed the 2.5 percent aggregate level (as required by section 1895(b)(5)(A) of the Act). Historically, we have used a value of 0.80 for the loss-sharing ratio which, we believe, preserves incentives for agencies to attempt to provide care efficiently for outlier cases. With a loss-sharing ratio of 0.80, Medicare pays 80 percent of the additional estimated costs above the outlier threshold amount.

In the CY 2011 HH PPS final rule (75 FR 70398), in targeting total outlier payments as 2.5 percent of total HH PPS payments, we implemented an FDL ratio of 0.67, and we maintained that ratio in CY 2012. Simulations based on CY 2010 claims data completed for the CY 2013 HH PPS final rule showed that outlier payments were estimated to comprise approximately 2.18 percent of total HH PPS payments in CY 2013, and as such, we lowered the FDL ratio from 0.67 to 0.45. We stated that lowering the FDL ratio to 0.45, while maintaining a loss-sharing ratio of 0.80, struck an effective balance of compensating for high-cost episodes while allowing more episodes to qualify for outlier payments (77 FR 67080). The national, standardized 60-day episode payment amount is multiplied by the FDL ratio. That amount is wage-adjusted to derive the wage-adjusted FDL amount, which is added to the case-mix and wage-adjusted 60-day episode payment amount to determine the outlier threshold amount that costs must exceed before Medicare pays 80 percent of the additional estimated costs.
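
The sketch below walks through the outlier calculation described above. It is illustrative only and is not CMS's claims processing logic; for simplicity it applies the wage index to the entire FDL amount rather than only the labor portion, and it omits details such as the 10 percent agency-level outlier cap and PEP adjustments.

```python
# Simplified, illustrative sketch of the HH PPS outlier calculation described above.
# Not CMS's claims processing logic.

FDL_RATIO = 0.45           # fixed-dollar loss ratio
LOSS_SHARING_RATIO = 0.80  # Medicare pays 80 percent of costs above the threshold

def outlier_payment(case_mix_wage_adjusted_episode_payment,
                    national_standardized_episode_payment,
                    wage_index,
                    wage_adjusted_estimated_cost):
    """Return the outlier payment for a single episode (0 if the threshold is not exceeded)."""
    # Wage-adjusted fixed-dollar loss amount (simplified: wage index applied to the full amount).
    fdl_amount = national_standardized_episode_payment * FDL_RATIO * wage_index
    # Outlier threshold: case-mix and wage-adjusted episode payment plus the FDL amount.
    threshold = case_mix_wage_adjusted_episode_payment + fdl_amount
    excess = wage_adjusted_estimated_cost - threshold
    return LOSS_SHARING_RATIO * excess if excess > 0 else 0.0
```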

For this proposed rule, simulating payments using preliminary CY 2014 claims data (as of December 31, 2014) and the CY 2015 payment rates (79 FR 66088 through 66092), we estimate that outlier payments in CY 2015 would comprise 2.02 percent of total payments. Based on simulations using CY 2014 claims data and the CY 2016 payment rates in section III.C.3 of this proposed rule, we estimate that outlier payments would comprise approximately 2.34 percent of total HH PPS payments in CY 2016, a percent change of almost 16 percent. This increase is attributable to the increase in the national per-visit amounts through the rebasing adjustments and the decrease in the national, standardized 60-day episode payment amount as a result of the rebasing adjustment and the nominal case-mix growth reduction. Given that similar rebasing adjustments and a nominal case-mix growth reduction would also occur in CY 2017, and hence a similar increase in outlier payments is anticipated, we estimate that outlier payments as a percent of total HH PPS payments would exceed 2.5 percent in CY 2017.
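
The percent change cited above can be verified with simple arithmetic:

```python
# Arithmetic behind the "almost 16 percent" change in the estimated outlier share.
cy2015_outlier_share = 2.02  # estimated percent of total HH PPS payments, CY 2015
cy2016_outlier_share = 2.34  # estimated percent of total HH PPS payments, CY 2016
percent_change = (cy2016_outlier_share - cy2015_outlier_share) / cy2015_outlier_share * 100
print(round(percent_change, 1))  # -> 15.8
```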

At this time, we are not proposing a change to the FDL ratio or the loss-sharing ratio for CY 2016. We believe that maintaining an FDL ratio of 0.45 and a loss-sharing ratio of 0.80 is appropriate, given that the percentage of outlier payments is estimated to increase as a result of the increase in the national per-visit amounts through the rebasing adjustments and the decrease in the national, standardized 60-day episode payment amount as a result of the rebasing adjustment and the nominal case-mix growth reduction. In the final rule, we will update our estimate of outlier payments as a percent of total HH PPS payments using the most current and complete year of HH PPS data (CY 2014 claims data as of June 30, 2015). We would continue to monitor the percent of total HH PPS payments paid as outlier payments to determine whether future adjustments to either the FDL ratio or the loss-sharing ratio are warranted.

E. Report to Congress on the Home Health Study Required by Section 3131(d) of the Affordable Care Act and an Update on Subsequent Research and Analysis

The current home health prospective payment system (HH PPS) pays a predetermined amount for a 60-day episode of care, adjusted for case mix using 153 home health resource groups (HHRGs). The 153 HHRGs are determined based on the amount of therapy provided, the episode's timing in a sequence of episodes, and the patient's clinical and functional status determined from data reported on the Outcome and Assessment Information Set (OASIS). There has been criticism that home health providers have responded to Medicare's payment policy by altering the level of service provided to patients. A review of the literature increasingly indicates that the current HH PPS payment model drives HHA resource allocation and practice decisions. Specifically, research has highlighted the need to examine whether there are vulnerabilities in the current HH PPS model that create disincentives for serving the most clinically complex and vulnerable beneficiaries who receive home health care while incentivizing providers to furnish more therapy services than needed in order to increase reimbursement. There is increasing concern that the current home health payment system encourages home health providers to deliver the maximum volume of therapy services while restricting the number of skilled nursing and home health aide services because of the therapy payment thresholds.

Rosati, R., Russell, D., Peng, T., Brickner, C., Kurowski, D., Christopher, M.A., Sheehan, K. (2014). Medicare Home Health Payment Reform May Jeopardize Access for Clinically Complex and Socially Vulnerable Patients. Health Affairs. 33(6), 946-956. Doi: 10.1377/hlthaff.2013.1159

Cabin, W. (2009). Evidence-based Research Challenges Home Care PPS Patient Benefits, Costs, and Payment Structure. Home Health Care Management and Practice. 21(4), 240-245. Doi: 10.1177/10848223088328325

Ibid.

Rosati, R., Russell, D., Peng, T., Brickner, C., Kurowski, D., Christopher, M.A., Sheehan, K. (2014). Medicare Home Health Payment Reform May Jeopardize Access for Clinically Complex and Socially Vulnerable Patients. Health Affairs. 33(6), 946-956. Doi: 10.1377/hlthaff.2013.1159

This raises the question of whether there is a disparity in payment for patients with clinically complex and/or poorly controlled chronic conditions who do not qualify for therapy but require a large number of skilled nursing visits.

Ibid.

Section 3131(d) of the Affordable Care Act directed the Secretary to conduct a study on HHA costs involved with providing ongoing access to care to low-income Medicare beneficiaries or beneficiaries in medically underserved areas, and in treating beneficiaries with high levels of severity of illness. To examine access to Medicare home health services and payment, relative to cost, for vulnerable patient populations, we awarded a contract to L&M Policy Research to perform extensive analysis of both survey and administrative data. Specifically, L&M collected survey data from physicians and HHAs to examine factors associated with potential access to care issues. The surveys provided information on whether, and why, patients were not placed or admitted for home health services or experienced delays in receiving home health services, and on the characteristics of patients who may have experienced access issues. L&M also analyzed administrative data through descriptive and regression analyses to examine the relationship between patient characteristics and estimated financial margin (the difference between payment and estimated cost). The study focused on margins because margin differences, particularly those associated with patient characteristics, indicate that financial incentives may exist in the HH PPS to provide home health care for certain types of patients over others. Lower margins, if systematically associated with care for vulnerable patient populations, may indicate financial disincentives for HHAs to admit these patients and may create access to care issues for them.
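
The following is a hedged sketch of the type of margin analysis described above: margin is computed as payment minus estimated cost and regressed on patient characteristics. The column names and the use of ordinary least squares are illustrative assumptions, not the study's actual specification.

```python
# Illustrative sketch of a margin regression (margin = payment - estimated cost).
# Column names are hypothetical; this is not L&M's actual analysis code.
import pandas as pd
import statsmodels.api as sm

def margin_regression(episodes: pd.DataFrame, characteristics: list):
    """Regress estimated financial margin on patient characteristic indicators."""
    margin = episodes["payment"] - episodes["estimated_cost"]
    X = sm.add_constant(episodes[characteristics])
    return sm.OLS(margin, X).fit()

# Hypothetical usage:
# results = margin_regression(df, ["parenteral_nutrition", "traumatic_wound", "dual_eligible"])
# print(results.summary())
```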

The results of the survey revealed that over 80 percent of HHAs and over 90 percent of physicians reported that access to home health care for Medicare fee-for-service beneficiaries in their local area was excellent or good. When survey respondents reported access issues, specifically their inability to place or admit Medicare fee-for-service patients into home health, the most common reason reported was that the patients did not qualify for the Medicare home health benefit. HHAs and physicians also cited family or caregiver issues as an important contributing factor in the inability to admit or place patients. About 17.2 percent of HHAs and 16.7 percent of physicians reported insufficient payment as an important contributing factor in the inability to admit or place patients. The survey results suggest that much of the variation in access to Medicare home health services is associated with social and personal conditions and therefore CMS' ability to improve access for certain vulnerable patient populations through payment policy may be limited.

Analysis of CY 2010 HHA payment and cost data suggests that margins may differ substantially across the HH PPS case-mix groups. In addition, particular beneficiary characteristics appear to be strongly associated with margin, and thus may create financial incentives to select certain patients over others. Margins were estimated to be lower in CY 2010 for patients who required parenteral nutrition, who had traumatic wounds or ulcers, or who required substantial assistance in bathing. Given that these variables are already included in the HH PPS case-mix system, the results indicate that modifications to the case-mix system may be needed. Furthermore, in CY 2010, beneficiaries admitted after acute or post-acute stays or who had high Hierarchical Condition Category scores or certain poorly-controlled clinical conditions, such as poorly-controlled pulmonary disorders, were also associated with substantially lower home health margins. In addition, other characteristics, such as those describing assistance by informal caregivers for ADL needs and those describing socioeconomic status, such as dual eligibility for Medicare and Medicaid, were strongly associated with lower margins. Exploration of potential payment methodology changes indicated that accounting for additional variables in HH PPS payment may decrease the difference in estimated margin between individuals in specific vulnerable subgroups and those not in the subgroups, thereby potentially decreasing financial incentives to select certain types of patients over others.

CMS awarded a follow-on contract to Abt Associates to further explore margin differences across patient characteristics and possible payment methodology changes suggested by the results of the home health study. Additionally, we have heard from various stakeholders that the current payment system methodology is overly complex and does not fully reflect the range of services provided under the home health benefit, and thus this follow-on study would look at these aspects of the current payment system as well.

Under the follow-on contract, Abt Associates convened a Clinical Workgroup meeting on June 25, 2014 to gain clinical insight from industry regarding the current HH PPS. Based upon the feedback provided during the Clinical Workgroup meeting, as well as CMS concerns about the current model given the findings from the Home Health Study, Abt Associates was tasked with developing model options for consideration and discussion. In September 2014, Abt Associates presented several payment model options for CMS consideration, which were also presented at a Technical Expert Panel meeting held on January 8, 2015.

  • Diagnosis on Top Model:

The first model option, referred to as the “Diagnosis on Top” (DOT) model, combines diagnosis groups with a regression model to create separate weights for patients with different diagnoses. Abt had explored the possibility of a DOT model for the home health payment system in its “Studies in Home Health Case Mix” project design report (January 7, 2002). At that time, it was decided that the potential gains in payment accuracy from a DOT model were offset by the added complexity and burden to providers, because such a model would require providers to classify each patient with a single diagnosis that would be used to determine payment. For the present reform efforts, Abt revisited the DOT model with more current data and in the context of other potential changes to the payment system that a DOT model might complement. In this analysis, the therapy variable is removed, allowing us to explore new ideas and re-explore previously rejected ones to see how we can increase the statistical power of the model without that variable. In this most recent analysis, each episode is grouped into the following diagnosis groups based on the primary ICD-9-CM diagnosis code reported on the OASIS: (1) Orthopedic; (2) neurological; (3) diabetes; (4) cancer; (5) skin wounds & lesions; (6) cardiovascular; (7) pulmonary; (8) gastrointestinal; (9) genito-urinary; (10) mental/emotional disorders; (11) other diagnoses; (12) case-mix V-codes; and (13) non-case-mix V-codes. Unlike the current HH PPS case-mix system, the DOT model does not include any therapy thresholds. Under the DOT model, episodes are first divided into diagnosis groups, prior to the determination of the clinical and functional levels, and payment model regressions would be run separately for each diagnosis group. This approach is intended to maximize the statistical performance of the payment system. Abt Associates' work also included OASIS and non-OASIS items (such as whether the patient was admitted from an acute or post-acute care setting, and hierarchical condition categories) that are not used in the current payment system but have been shown to correlate with resource use. In many ways, the regression component of the DOT model is very similar to the current four-equation model, except that, in later versions of Abt's work, the clinical and functional levels are replaced with an overall severity level. This change allows the DOT model to account for a richer set of variables than the clinical and functional levels in the current payment system.
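
As a rough illustration of the structure described above, the sketch below assigns each episode to a diagnosis group and fits a separate resource-use regression within each group. The variable names and the use of ordinary least squares are illustrative assumptions and do not represent Abt Associates' actual specification.

```python
# Hedged sketch of the DOT structure: group episodes by primary diagnosis,
# then estimate a separate payment regression within each group.
import pandas as pd
import statsmodels.api as sm

DIAGNOSIS_GROUPS = [
    "orthopedic", "neurological", "diabetes", "cancer", "skin wounds & lesions",
    "cardiovascular", "pulmonary", "gastrointestinal", "genito-urinary",
    "mental/emotional disorders", "other diagnoses",
    "case-mix V-codes", "non-case-mix V-codes",
]

def fit_dot_regressions(episodes: pd.DataFrame, predictors: list):
    """Fit one resource-use regression per diagnosis group.

    `episodes` is assumed to contain a `diagnosis_group` column, a
    `resource_use` column, and the predictor columns (for example, an overall
    severity level and admission-source indicators).
    """
    models = {}
    for group, subset in episodes.groupby("diagnosis_group"):
        X = sm.add_constant(subset[predictors])
        models[group] = sm.OLS(subset["resource_use"], X).fit()
    return models
```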

  • Predicted Therapy Model:

The second model option is referred to as the “Predicted Therapy Model.” The basic structure of this model is similar to that of the current payment model. In this model option, the actual therapy visits used in the current HH PPS model are replaced with predicted therapy visits in order to develop case-mix weights and payment amounts based on the predicted number of visits. The weights are constructed via a two-part model. The first part of the model uses a logistic regression to estimate whether or not the episode had any therapy visits. The second part uses a truncated binomial regression (truncated at zero) to estimate the number of therapy visits, conditional on having any therapy visits. This “hurdle” model is commonly used in health economics to describe medical utilization or expenditures where observing zero health care use during the sample period is common. We also looked at estimating the two-part model separately for each of the diagnosis groups in the DOT model referenced above. The predicted therapy model retains the four-equation model, the payment regression, and the 153 HHRGs used in the current payment model.

“Modeling Health Care Costs and Counts,” ASHE conference course by Partha Deb, Willard Manning and Edward Norton, http://web.harrisschool.uchicago.edu/sites/default/files/ASHE2012_Minicourse_Cost_Use_slides_corrected.pdf
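
A hedged sketch of the two-part structure described above follows. For simplicity it fits an ordinary Poisson regression to the positive counts rather than the zero-truncated count model used in the actual analysis; all variable names are illustrative assumptions.

```python
# Hedged sketch of a two-part ("hurdle") model of therapy visits:
# part 1 models whether an episode has any therapy visits; part 2 models the
# number of visits among episodes with at least one. The zero-truncated count
# model is approximated here by a Poisson regression on the positive counts.
import pandas as pd
import statsmodels.api as sm

def fit_two_part_therapy_model(episodes: pd.DataFrame, predictors: list):
    X = sm.add_constant(episodes[predictors])
    any_therapy = (episodes["therapy_visits"] > 0).astype(int)

    # Part 1: probability that the episode has any therapy visits.
    part1 = sm.Logit(any_therapy, X).fit(disp=False)

    # Part 2: expected visits among episodes with at least one therapy visit.
    positives = episodes[episodes["therapy_visits"] > 0]
    X_pos = sm.add_constant(positives[predictors])
    part2 = sm.GLM(positives["therapy_visits"], X_pos,
                   family=sm.families.Poisson()).fit()

    # Predicted therapy visits = P(any visits) * E[visits | any visits].
    predicted_visits = part1.predict(X) * part2.predict(X)
    return part1, part2, predicted_visits
```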

  • Home Health Groupings Model:

The third model is referred to as the “Home Health Groupings” (HHG) model. The premise of this type of model is that it starts with a clinical foundation. The HHG model groups home health episodes by diagnosis and by the expected types of home health interventions required. Using expert clinical judgment, each ICD-9 code is assigned to one of seven groups based on the intervention expected to be required. The seven groups are: (1) Musculoskeletal Rehabilitation; (2) Neuro/Stroke Rehabilitation; (3) Skin/Non-Surgical Wound Care; (4) Post-Op Wound Aftercare; (5) Behavioral Health Care; (6) Complex Medical Care; and (7) Medication Management, Teaching, and Assessment. Unlike the current HH PPS case-mix system, the HHG model does not include any therapy thresholds. Abt Associates is currently further delineating the seven groups listed above using OASIS and non-OASIS items (such as whether the patient was admitted from an acute or post-acute care setting, and hierarchical condition categories) that are not used in the current payment system but have been shown to correlate with resource use. The HHG model groups home health episodes in a way that mirrors how clinicians would differentiate between different types of beneficiaries and would help explain why the beneficiary is receiving home health care, something the current HH PPS case-mix system may lack. MedPAC noted that policymakers have faced challenges in defining the role of home health. We believe that the HHG model may be one way to better define the types of care that patients receive under the home health benefit and thus the role of home care.

Medicare Payment Advisory Commission (MedPAC), “Report to the Congress: Medicare Payment Policy”. March 2015. P. 219. Washington, DC. Accessed on 5/5/2015 at: http://medpac.gov/documents/reports/march-2015-report-to-the-congress-medicare-payment-policy.pdf?sfvrsn=0 .
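
The basic data structure behind such a groupings model can be sketched as a diagnosis-to-group lookup. The example entries below are hypothetical placeholders; in the actual model, each ICD-9 code is assigned to a group using expert clinical judgment.

```python
# Hedged sketch of an ICD-9-to-clinical-group lookup for the HHG model.
# The example assignments and the default group are hypothetical placeholders.

HHG_GROUPS = (
    "Musculoskeletal Rehabilitation",
    "Neuro/Stroke Rehabilitation",
    "Skin/Non-Surgical Wound Care",
    "Post-Op Wound Aftercare",
    "Behavioral Health Care",
    "Complex Medical Care",
    "Medication Management, Teaching, and Assessment",
)

# Hypothetical example entries only; not a clinical crosswalk.
ICD9_TO_HHG = {
    "715.16": "Musculoskeletal Rehabilitation",
    "434.91": "Neuro/Stroke Rehabilitation",
    "707.05": "Skin/Non-Surgical Wound Care",
}

def assign_hhg_group(primary_icd9, default="Complex Medical Care"):
    """Return the clinical group for an episode's primary diagnosis code."""
    return ICD9_TO_HHG.get(primary_icd9, default)
```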

To inform the model options discussed above, Abt Associates also reviewed other Medicare prospective payment systems to identify alternative methods used in classifying patients and to better understand components of each system. In the future, we plan to issue a technical report under our contract with Abt Associates that would further describe and analyze the three model options. We also plan to reconvene the Clinical Workgroup and the Technical Expert Panel in the near future to help further inform CMS on the various model options developed and next steps.

F. Technical Regulations Text Changes

First, we propose to make several technical corrections in part 484 to better align the payment requirements with recent statutory and regulatory changes for home health services. We propose to make changes to § 484.205(e) to state that estimated total outlier payments for a given calendar year are limited to no more than 2.5 percent of total outlays under the HH PPS, rather than 5 percent of total outlays, as required by section 1895(b)(5)(A) of the Act as amended by section 3131(b)(2)(B) of the Affordable Care Act. Similarly, we also propose to specify in § 484.240(e) that the fixed-dollar loss and loss-sharing amounts are chosen so that the estimated total outlier payment is no more than 2.5 percent of total payments under the HH PPS, rather than 5 percent of total payments under the HH PPS, as required by section 1895(b)(5)(A) of the Act as amended by section 3131(b)(2)(B) of the Affordable Care Act. We also propose to describe in § 484.240(f) that the estimated total amount of outlier payments to an HHA in a given year may not exceed 10 percent of the estimated total payments to the specific agency under the HH PPS in that year. This update aligns the regulations text at § 484.240(f) with the statutory requirement in section 1895(b)(5)(A) of the Act as amended by section 3131(b)(2)(B) of the Affordable Care Act. Finally, we propose a minor editorial change in § 484.240(b) to specify that the outlier threshold for each case-mix group is the episode payment amount for that group, or the PEP adjustment amount for the episode, plus a fixed-dollar loss amount that is the same for all case-mix groups.

Second, in addition to the proposed changes to the regulations text pertaining to outlier payments under the HH PPS, we also propose to amend § 409.43(e)(iii) and to add language to § 484.205(d) to clarify the frequency of review of the plan of care and the provision of Partial Episode Payments (PEP) under the HH PPS as a result of a regulations text change in § 424.22(b) that was finalized in the CY 2015 HH PPS final rule (79 FR 66032). Specifically, we propose to change the definition of an intervening event to include transfers and instances where a patient is discharged and returns to home health during a 60-day episode, rather than a discharge and return to the same HHA during a 60-day episode. In § 484.220, we propose to update the regulations text to reflect the downward adjustments to the 60-day episode payment rate due to changes in the coding or classification of different units of service that do not reflect real changes in case-mix (nominal case-mix growth) applied to calendar years 2012 and 2013, which were finalized in the CY 2012 HH PPS final rule (76 FR 68532). This also includes updating the CY 2011 adjustment to 3.79 percent as finalized in the CY 2011 HH PPS final rule (75 FR 70461). In § 484.225, we propose to eliminate references to outdated market basket index factors by removing paragraphs (b), (c), (d), (e), (f), and (g). In § 484.230, we propose to delete the last sentence as a result of a change from a separate LUPA add-on amount to a LUPA add-on factor finalized in the CY 2014 HH PPS final rule (78 FR 72256). Finally, we propose to delete and reserve § 484.245 because we believe that this language, which was meant to facilitate the transition to the original PPS established in CY 2000, is no longer applicable under the HH PPS.

Lastly, we propose to make one technical correction in § 424.22 to re-designate paragraph (a)(1)(v)(B)(1) as (a)(2).

We invite comments on these technical corrections and the associated changes in the regulations at 42 CFR parts 409, 424, and 484.

IV. Proposed Home Health Value-Based Purchasing (HHVBP) Model

A. Background

In the CY 2015 Home Health Prospective Payment System (HH PPS) final rule titled “Medicare and Medicaid Programs; CY 2015 Home Health Prospective Payment System Rate Update; Home Health Quality Reporting Requirements; and Survey and Enforcement Requirements for Home Health Agencies” (79 FR 66032-66118), we indicated that we were considering the development of a home health value-based purchasing (HHVBP) model. We sought comments on a future HHVBP model, including elements of the model; the size of the payment incentives and the percentage of payments that would need to be placed at risk in order to spur home health agencies (HHAs) to make the necessary investments to improve the quality of care for Medicare beneficiaries; the timing of the payment adjustments; and how performance payments should be distributed. We also sought comments on the best approach for selecting states for participation in this model. We noted that if the decision was made to move forward with the implementation of an HHVBP model in CY 2016, we would solicit additional comments on a more detailed model proposal to be included in future rulemaking.

In the CY 2015 HH PPS final rule, we indicated that we received a number of comments related to the magnitude of the percentage payment adjustments; evaluation criteria; payment features; a beneficiary risk adjustment strategy; state selection methodology; and the approach to selecting Medicare-certified HHAs. A number of commenters supported the development of a value-based purchasing model in the home health industry, in whole or in part, with consideration of the design parameters provided. No commenters provided strong counterpoints or alternative design options that dissuaded CMS from moving forward with the general design and framework of the HHVBP model as discussed in the CY 2015 HH PPS proposed rule. All comments were considered in our decision to develop an HHVBP model for implementation beginning January 1, 2016. Therefore, in this proposed rule, we are proposing to implement an HHVBP model, which includes a randomized state selection methodology; the reporting framework; the payment adjustment methodology; the payment adjustment schedule by performance year and payment adjustment percentage; the quality measures selection methodology, classifications, and weighting; the measures for performance year one, including the reporting of New Measures, and the framework for proposing to adopt measures for subsequent performance years; the performance scoring methodology, which includes performance based on achievement and improvement; the review and recalculation period; and the evaluation framework.

Medicare and Medicaid Programs; CY 2015 Home Health Prospective Payment System Rate Update; Home Health Quality Reporting Requirements; and Survey and Enforcement Requirements for Home Health Agencies, 79 FR 66105-66106 (November 6, 2014).

The basis for developing this proposed value-based purchasing (VBP) model, as described in the proposed regulations at § 484.300 et seq., stems from several important areas of consideration. First, we expect that tying quality to payment through a system of value-based purchasing will improve the beneficiaries' experience and outcomes. In turn, we expect payment adjustments that both reward improved quality and penalize poor performance will incentivize quality improvement and encourage efficiency, leading to a more sustainable payment system.

Second, section 3006(b) of the Affordable Care Act directed the Secretary of the Department of Health and Human Services (the Secretary) to develop a plan to implement a VBP program for payments under the Medicare Program for HHAs and the Secretary issued an associated Report to Congress in March of 2012 (2012 Report). The 2012 Report included a roadmap for implementation of an HHVBP model and outlined the need to develop an HHVBP program that aligns with other Medicare programs and coordinates incentives to improve quality. The 2012 Report also indicated that a HHVBP program should build on and refine existing quality measurement tools and processes. In addition, the 2012 Report indicated that one of the ways that such a program could link payment to quality would be to tie payments to overall quality performance.

CMS, “Report to Congress: Plan to Implement a Medicare Home Health Agency Value-Based Purchasing Program” (March 15, 2012) available at http://www.cms.gov/Medicare/Medicare-Fee-for-Service-Payment/HomeHealthPPS/downloads/stage-2-NPRM.PDF.

Third, section 402(a)(1)(A) of the Social Security Amendments of 1967 (as amended) (42 U.S.C. 1395b-1(a)(1)(A)) provided authority for us to conduct the Home Health Pay-for-Performance (HHPFP) Demonstration that ran from 2008 to 2010. That Demonstration found modest quality improvement in certain measures when comparing the quality of care furnished by Demonstration participants to the quality of care furnished by the control group. One important lesson learned from the HHPFP Demonstration was the need to link HHAs' quality improvement efforts with the incentives. HHAs in three of the four regions generated enough savings to receive incentive payments in the first year of the Demonstration, but the size of the payments was unknown until after the conclusion of the Demonstration. Also, the time lag between quality performance and payment incentives was too long to provide sufficient motivation for HHAs to take the necessary steps to improve quality. The results of the Demonstration, published in a comprehensive evaluation report, suggest that future models could benefit from ensuring that incentives are reliable enough, of sufficient magnitude, and paid in a timely fashion to encourage HHAs to be fully engaged in the quality of care initiative.

“CMS Report on Home Health Agency Value-Based Purchasing Program” (February of 2012) available at https://www.cms.gov/Research-Statistics-Data-and-Systems/Statistics-Trends-and-Reports/Reports/Downloads/HHP4P_Demo_Eval_Final_Vol1.pdf.

Furthermore, the President's FY 2015 and FY 2016 Budgets proposed that VBP be extended to additional providers, including skilled nursing facilities, home health agencies, ambulatory surgical centers, and hospital outpatient departments. The FY 2015 Budget called for at least 2 percent of payments to be tied to the quality and efficiency of care on a budget-neutral basis. The FY 2016 Budget outlines a program that would tie at least 2 percent of Medicare payments to the quality and efficiency of care in the first 2 years of implementation beginning in 2017, and at least 5 percent beginning in 2019, without any impact on the budget. We propose in this HHVBP model to also follow a graduated payment adjustment strategy within certain selected states beginning January 1, 2016.

The Secretary has also set two overall delivery system reform goals for CMS. First, we seek to tie 30 percent of traditional, or fee-for-service, Medicare payments to quality or value through alternative payment models by the end of 2016, and to tie 50 percent of payments to these models by the end of 2018. Second, we seek to tie 85 percent of all traditional Medicare payments to quality or value by 2016 and 90 percent by 2018. To support these efforts, the Health Care Payment Learning and Action Network was recently launched to help advance the work being done across sectors to increase the adoption of value-based payments and alternative payment models. We believe that testing the HHVBP model would support these goals.

Content of this announcement can be found at http://www.hhs.gov/news/press/2015pres/01/20150126a.html.

Finally, we have already successfully implemented the Hospital Value-Based Purchasing (HVBP) program, under which value-based incentive payments are made in a fiscal year to hospitals that meet performance standards established for a performance period with respect to measures for that fiscal year. The percentage of a participating hospital's base-operating DRG payment amount for FY 2015 discharges that is at risk, based on the hospital's performance under the program for that fiscal year, is 1.5 percent. That percentage will increase to 2.0 percent by FY 2017. We are proposing an HHVBP model that builds on the lessons learned and guidance from the HVBP program and other applicable demonstrations as discussed above, as well as from the evaluation report discussed earlier.

The proposed HHVBP model presents an opportunity to improve the quality of care furnished to Medicare beneficiaries and study what incentives are sufficiently significant to encourage HHAs to provide high quality care. The HHVBP model being proposed would offer both a greater potential reward for high performing HHAs as well as a greater potential downside risk for low performing HHAs. If implemented, the model would begin on January 1, 2016, and include an array of measures that would capture the multiple dimensions of care that HHAs furnish.

The proposed model would be tested by CMS's Center for Medicare and Medicaid Innovation (CMMI) under section 1115A of the Act. Under section 1115A(d)(1) of the Act, the Secretary may waive such requirements of Titles XI and XVIII and of sections 1902(a)(1), 1902(a)(13), and 1903(m)(2)(A)(iii) as may be necessary solely for purposes of carrying out section 1115A with respect to testing models described in section 1115A(b). The Secretary is not issuing any waivers of the fraud and abuse provisions in sections 1128A, 1128B, and 1877 of the SSA or any other Medicare or Medicaid fraud and abuse laws for this model. Thus, notwithstanding any other provisions of this proposed rule, all providers and suppliers participating in the HHVBP model must comply with all applicable fraud and abuse laws and regulations.

We are proposing to use the section 1115A(d)(1) waiver authority to apply a reduction or increase of up to 8 percent to current Medicare payments to Medicare-certified HHAs delivering care to beneficiaries within the boundaries of certain states, depending on the HHA's performance on specified quality measures relative to its peers. Specifically, the HHVBP model proposes to utilize the waiver authority to adjust Medicare payment rates under section 1895(b) of the Act. In accordance with the authority granted to the Secretary in section 1115A(d)(1) of the Act, we would waive section 1895(b)(4) of the Act only to the extent necessary to adjust payment amounts to reflect the value-based payment adjustments under this proposed model for Medicare-certified HHAs in specified states selected in accordance with CMS's proposed selection methodology. We are not proposing to implement this model under the authority granted by the Affordable Care Act under section 3131 (“Payment Adjustments for Home Health Care”).

The defined population would include all Medicare beneficiaries being provided care by any Medicare-certified HHA delivering care within the selected states. Medicare-certified HHAs that are delivering care within the boundaries of selected states are considered `Competing Medicare-certified Home Health Agencies' within the scope of this HHVBP model. If care is delivered outside the boundaries of the selected states, or inside the boundaries of a non-selected state that does not have a reciprocal agreement with a selected state, payments for those beneficiaries would not be considered within the scope of the model because we are basing participation in the model on state-specific CMS Certification Numbers (CCNs). Payment adjustments for each year of the model would be calculated based on a comparison of how well each competing Medicare-certified HHA performed during the performance period for that year (proposed below to be one year in length, starting in CY 2016) with its performance on the same measures in 2015 (proposed below to be the baseline data year).

The first performance year would be CY 2016, the second would be CY 2017, the third would be CY 2018, the fourth would be CY 2019, and the fifth would be CY 2020. Performance periods are outlined in further detail in section D—Performance Assessment and Payment Periods. This model would test whether subjecting competing Medicare-certified HHAs to significant payment adjustments to the Medicare payment amounts that would otherwise be made to them results in statistically significant improvements in the quality of care being delivered to this specific population of Medicare beneficiaries.

We propose to identify Medicare-certified HHAs for participation in this model using state borders as boundaries. We do so under the authority granted in section 1115A(a)(5) of the Act to elect to limit testing of a model to certain geographic areas. This decision is influenced by the 2012 Report to Congress mandated under section 3006(b) of the Affordable Care Act. This Report stated that HHAs which participated in previous value-based purchasing demonstrations “uniformly believed that all Medicare-certified HHAs should be required to participate in future VBP programs so all agencies experience the potential burdens and benefits of the program” and some HHAs expressed concern that absent mandatory participation, “low-performing agencies in areas with limited competition may not choose to pursue quality improvement.”

See the Recommendations section of the U.S. Department of Health and Human Services, “Report to Congress: Plan to Implement a Medicare Home Health Agency Value-Based Purchasing Program” (March 2012), p. 28.

Section 1115A(b)(2)(A) of the Act requires that the Secretary select models to be tested where the Secretary determines that there is evidence that the model addresses a defined population for which there are deficits in care leading to poor clinical outcomes or potentially avoidable expenditures. The HHVBP model was developed to improve care for Medicare patients receiving care from HHAs based on evidence in the March 2014 MedPAC Report to Congress citing quality and cost concerns in the home health sector. According to MedPAC, “about 29 percent of post-hospital home health stays result in readmission, and there is tremendous variation in performance among providers within and across geographic regions.” The same report cited limited improvement in quality based on existing measures, and noted that the data on quality “are collected only for beneficiaries who do not have their home health care stays terminated by a hospitalization,” skewing the results in favor of a healthier segment of the Medicare population. This model would test the use of adjustments to Medicare HH PPS rates by tying payment to quality performance with the goal of achieving the highest possible quality and efficiency.

See full citation at note 11. MedPAC Report to Congress (March 2014), p. 215.

MedPAC Report to Congress (March 2014), p. 226.

B. Overview

In § 484.305, we propose definitions for “applicable percent”, “applicable measure”, “benchmark”, “home health prospective payment system”, “larger-volume cohort”, “linear exchange function”, “Medicare-certified home health agency”, “New Measures”, “payment adjustment”, “performance period”, “smaller-volume cohort”, “selected states”, “starter set”, “Total Performance Score”, and “value-based purchasing” as they pertain to this subpart. The HHVBP model is proposed to encompass five performance years, to be implemented beginning January 1, 2016, and to conclude on December 31, 2022. Payment and service delivery models are developed by CMMI in accordance with the requirements of section 1115A of the Act. During the development of new models, CMMI builds on the ideas received from internal and external stakeholders and consults with clinical and analytical experts.

In this proposed rule, we are outlining an HHVBP model for public notice and comment that has an overall purpose of improving the quality of home health care and delivering it to the Medicare population in a more efficient manner. The specific goals of the proposed model are to:

1. Incentivize HHAs to provide better quality care with greater efficiency;

2. Study new potential quality and efficiency measures for appropriateness in the home health setting; and,

3. Enhance current public reporting processes.

We are proposing that the HHVBP model would adjust Medicare HHA payments over the course of the model by up to 8 percent depending on the applicable performance year and the degree of quality performance demonstrated by each competing Medicare-certified HHA. The proposed model would reduce the HH PPS final claim payment amount to an HHA for each episode in a calendar year by an amount up to the applicable percentage defined in proposed § 484.305. The timeline of payment adjustments as they apply to each performance year is described in greater detail in the section entitled “Payment Adjustment Timeline.”
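
As a rough illustration only, the sketch below shows how a bounded percentage adjustment might be applied to an episode payment. The actual adjustment percentage would be determined by the scoring methodology described later in this proposed rule; the function and its parameters are assumptions for illustration.

```python
# Hedged sketch: applying an upward or downward payment adjustment bounded by
# the applicable percent for the performance year. Illustrative only; the
# adjustment percentage would come from the HHVBP scoring methodology.

def apply_hhvbp_adjustment(episode_payment, adjustment_pct, applicable_pct=8.0):
    """Apply a percentage adjustment, capped at +/- the applicable percent."""
    bounded_pct = max(-applicable_pct, min(applicable_pct, adjustment_pct))
    return round(episode_payment * (1 + bounded_pct / 100), 2)

# Example: a 3 percent downward adjustment on a $2,938.37 episode payment.
print(apply_hhvbp_adjustment(2938.37, -3.0))  # -> 2850.22
```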

The model would apply to all Medicare-certified HHAs in each of the selected states, which means that all HHAs in the selected states would be required to compete. We propose to codify this policy at 42 CFR 484.310. Furthermore, a competing Medicare-certified HHA would only be measured on performance for care delivered to Medicare beneficiaries within selected states (with rare exceptions given for care delivered when a reciprocal agreement exists between states). The distribution of payment adjustments would be based on quality performance, as measured by both achievement and improvement, across a proposed set of quality measures rigorously constructed to minimize burden as much as possible and improve care. Competing Medicare-certified HHAs that demonstrate they can deliver higher quality of care in comparison to their peers (as defined by the volume of services delivered within the selected state), or their own past performance, could have their payment for each episode of care adjusted higher than the amount that would otherwise be paid under section 1895 of the Act. Competing Medicare-certified HHAs that do not perform as well as other competing Medicare-certified HHAs of the same size in the same state might have their payments reduced, and those competing Medicare-certified HHAs that perform similarly to others of similar size in the same state might have no payment adjustment made. This operational concept is similar in practice to what is used in the HVBP program.

We expect that the risk of having payments adjusted in this manner would provide an incentive among all competing Medicare-certified HHAs delivering care within the boundaries of selected states to provide significantly better quality through improved planning, coordination, and management of care. The degree of the payment adjustment would be dependent on the level of quality achieved or improved from the baseline year, with the highest upward performance adjustments going to competing Medicare-certified HHAs with the highest overall level of performance based on either achievement or improvement in quality. The size of a Medicare-certified HHA's payment adjustment for each year under the model would be dependent upon that HHA's performance with respect to that calendar year relative to other competing Medicare-certified HHAs of similar size in the same state and relative to its own performance during the baseline year.

We are proposing that states would be selected randomly from nine regional groupings for model participation. A competing Medicare-certified HHA would only be measured on performance for care delivered to Medicare beneficiaries within the boundaries of selected states, and only payments for HHA services provided to Medicare beneficiaries within the boundaries of selected states would be subject to adjustment under the proposed model. Requiring all Medicare-certified HHAs within the boundaries of selected states to compete in the model would ensure that: (1) There is no self-selection bias, (2) competing HHAs are representative of HHAs nationally, and (3) there is sufficient participation to generate meaningful results. We believe it is necessary to require all HHAs delivering care within the boundaries of selected states to be included in the model because, in our experience, Medicare providers are generally reluctant to participate voluntarily in models in which their Medicare payments could be subject to possible reduction. This reluctance to participate in voluntary models has been shown to cause self-selection bias in statistical assessments and thus may present challenges to our ability to evaluate the model. In addition, state boundaries represent a natural demarcation in how quality is currently being assessed through OASIS measures on Home Health Compare (HHC).

C. Selection Methodology

1. Identifying a Geographic Demarcation Area

We are proposing to adopt a methodology that uses state borders as boundaries for demarcating which Medicare-certified HHAs will be required to compete in the model. We are proposing to select nine states from nine geographically-defined groupings of five or six states. Groupings were also defined in order to ensure that the successful implementation of the model would produce robust and generalizable results, as discussed later in this section.

We took into account five key factors when deciding to propose selection at the state-level for this model. First, if we required some, but not all, Medicare-certified HHAs that deliver care within the boundaries of a selected state to participate in the model, we believe the HHA market for the state could be disrupted because HHAs in the model would be competing against HHAs not in the model (herein referenced as either `non-model HHAs' or `non-competing HHAs'). Second, we wanted to ensure that the distribution of payment adjustments based on performance under the model could be extrapolated to the entire country. Statistically, the larger the sample to which payment adjustments are applied, the smaller the variance of the sampling distribution and the greater the likelihood that the distribution accurately predicts what would transpire if the methodology were applied to the full population of HHAs. Third, we considered the need to align with other HHA quality program initiatives including HHC. The HHC Web site presently provides the public and HHAs a state- and national-level comparison of quality. We expect that aligning performance with the HHVBP benchmark and the achievement score would support how measures are currently being reported on HHC. Fourth, there is a need to align with CMS regulations which require that each HHA have a unique CMS Certification Number (CCN) for each state in which the HHA provides service. Fifth, we wanted to ensure sufficient sample size and the ability to meet the rigorous evaluation requirements for CMMI models. These five factors are important for the successful implementation and evaluation of this model.

We expect that when there is a risk for a downside payment adjustment based on quality performance measures, the use of a self-contained, mandatory cohort of HHA participants will create a stronger incentive to deliver greater quality among competing Medicare-certified HHAs. Specifically, it is possible the market would become distorted if non-model HHAs are delivering care within the same market as competing Medicare-certified HHAs because competition, on the whole, becomes unfair when payment is predicated on quality for one group and volume for the other group. In addition, we expect that evaluation efforts might be negatively impacted because some HHAs would be competing on quality and others on volume within the same market.

We are proposing the use of state boundaries after careful consideration of several alternative selection approaches, including randomly selecting HHAs from all HHAs across the country, and requiring participation from smaller geographic regions including the county; the Combined Statistical Area (CSA); the Core-Based Statistical Area (CBSA); rural provider level; and the Hospital Referral Region (HRR) level.

A methodology using a national sample of HHAs that are randomly selected from all HHAs across the country could be designed to include enough HHAs to ensure robust payment adjustment distribution and a sufficient sample size for the evaluation; however, this approach may present significant limitations when compared with the state boundaries selection methodology proposed in this model. Of primary concern with randomly selecting at the provider-level across the nation is the issue with market distortions created by having competing Medicare-certified HHAs operating in the same market as non-model HHAs.

Using smaller geographic areas than states, such as counties, CSAs, CBSAs, rural areas, and HRRs, could also present challenges for this model. These smaller geographic areas were considered as alternate selection options; however, their use could result in too small a sample size of potential competing HHAs. As a result, we expect the distribution of payment adjustments could become highly divergent among fewer HHA competitors. In addition, the ability to evaluate the model could become more complex, and the results may be less generalizable to the full population of Medicare-certified HHAs and the beneficiaries they serve across the nation. Further, the use of geographic areas smaller than states could increase the proportion of Medicare-certified HHAs that fall into groupings with too few agencies to generate a stable distribution of payment adjustments. Thus, if we were to define geographic areas based on CSAs, CBSAs, counties, or HRRs, we would need to develop an approach for consolidating smaller regions into larger regions.

Home health care is unique when compared to other Medicare provider types. In general, HHAs deliver care in beneficiaries' homes, as opposed to other provider types that traditionally deliver care at a brick-and-mortar institution within beneficiaries' respective communities. As a result, the HHVBP model needs to be designed to account for the unique way that HHA care is provided in order to ensure that the results are generalizable to the population. HHAs are limited to providing care to beneficiaries in the state in which they have a CCN; however, HHAs are not restricted from providing services in a county, CSA, CBSA, or HRR in which they are not located (as long as that county/CSA/CBSA/HRR is in the same state in which the HHA is certified). As a result, using geographic areas smaller than state boundaries could result in market distortion and evaluation confounders similar to those created by selecting providers from a randomized national sample, because HHAs in adjacent counties/CSAs/CBSAs/HRRs might not be in the model yet would be directly competing for services in the same markets or geographic regions. Competing HHAs delivering care in the same market area as non-competing HHAs could generate a spillover effect in which non-model HHAs would be vying for the same beneficiaries as competing HHAs. This spillover effect presents several issues for evaluation, as the dependent variable (quality) becomes confounded by external influences created by the non-competing HHAs. These unintended external influences on competing HHAs could become apparent if non-competing HHAs were incentivized to generate greater volume at the expense of the quality delivered to the beneficiaries they serve and at the expense of competing HHAs that are paid on quality instead of volume. Further, the ability to extrapolate the results to the full population of HHAs and the beneficiaries they serve would be confounded by this artifact of the model, and inferences would be limited by an inability to replicate the results. While these concerns would diminish as larger regions are considered, the only way to eliminate them entirely is to define participation among Medicare-certified HHAs at the state level.

In addition, home health quality data currently displayed on HHC allows users to compare HHA services furnished within a single state. Selecting HHAs using other geographic regions that are smaller and/or cross state lines could require the model to deviate from the established process for reporting quality. For these reasons, we believe a selection methodology based on the use of Medicare-certified HHAs delivering care within state boundaries would be the most appropriate for the successful implementation and evaluation of this model.

While, for the reasons described above, we are proposing that the geographic basis of selection remain at the state level, we nevertheless seek comment on potential alternatives that might use smaller geographic areas. In considering alternatives, the public should reference the five aforementioned key factors used to support selection at the state level as they relate to the evaluative framework and operational feasibility of this model. In particular, one potential alternative would be to split states into sub-state regions using a combination of CSAs and metropolitan statistical areas (MSAs), a type of CBSA. For example, regions might be defined using the following process:

  • Step 1: Define one sub-state region corresponding to each CSA that contains an MSA (but not for CSAs that do not include an MSA) and one sub-state region corresponding to each MSA that is not part of a CSA. In cases where a CSA or MSA crossed state boundaries, only the portion of the CSA or MSA that falls inside the state boundaries would be included in the sub-state region.
  • Step 2: Any portions of a state that were not included in a sub-state region based on a CSA or an MSA defined in Step 1 would be consolidated in a single “remainder of state” sub-state region.
  • Step 3: To ensure that all sub-state regions have a sufficient number of HHAs to permit stable distribution of payment adjustments, sub-state regions based on CSAs or MSAs that contained fewer than 25 HHAs would be consolidated into the “remainder of state” sub-state region.
  • Step 4: If a “remainder of state” sub-state region had fewer than 25 HHAs, that sub-state region would be consolidated with the geographically closest sub-state region based on a CSA or MSA.

We note that algorithms like this one may generate more than 100 total sub-state regions and over 200 unique competing cohorts of Medicare-certified HHAs.
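
A hedged sketch of the four-step process outlined above follows. It assumes each HHA record carries its state and its CSA or MSA assignment (if any); the geographic-proximity merge in Step 4 is only noted, not implemented.

```python
# Hedged sketch of the sub-state region algorithm described in Steps 1-4 above.
from collections import defaultdict

MIN_HHAS_PER_REGION = 25

def build_substate_regions(hhas):
    """hhas: iterable of dicts with keys 'id', 'state', and 'csa_or_msa' (None if neither)."""
    # Steps 1-2: one region per in-state CSA/MSA, plus a "remainder of state" region.
    regions = defaultdict(list)
    for hha in hhas:
        area = hha["csa_or_msa"] or "remainder of state"
        regions[(hha["state"], area)].append(hha["id"])

    # Step 3: fold CSA/MSA regions with fewer than 25 HHAs into "remainder of state".
    for (state, area), members in list(regions.items()):
        if area != "remainder of state" and len(members) < MIN_HHAS_PER_REGION:
            regions[(state, "remainder of state")].extend(members)
            del regions[(state, area)]

    # Step 4 (not implemented here): a "remainder of state" region with fewer than
    # 25 HHAs would be merged with the geographically closest CSA/MSA region.
    return dict(regions)
```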

We seek comment on advantages and disadvantages of this approach relative to defining regions based on state boundaries. In particular, we note that because this approach would generate a larger number of regions, it could increase the statistical power of the model evaluation, and might improve our ability to determine what effects the model has on the quality of home health care, as well as other outcomes of interest. However, we note that because regions would no longer line up with full states in most cases, the regions selected to participate in the model would no longer align directly with those displayed on HHC and therefore, quality data would have to be recalculated and displayed differently from what is currently being reported on HHC. In addition, using sub-state regions could, as noted above, lead to undesirable spillover effects between participating and non-participating HHAs. These spillover concerns would be mitigated by the fact that none of the sub-state regions defined under this approach would cross state lines and the fact that the sub-state regions would be larger than under some approaches to defining sub-state regions (for example, at the county level). Nevertheless, it is unclear how severe these evaluation and operational concerns would be in practice and how the extent of these concerns would depend on the different characteristics of the selected regions. We welcome public comment on these proposed state selection methodologies.

2. Overview of the Randomized Selection Methodology for States

We are requesting comments on the following proposed methodology for selecting states. The selection methodology employed will need to provide the strongest evidence of producing meaningful results representative of the national population of Medicare-certified HHAs and, in turn, meet the evaluation requirements of section 1115A(b)(4) of the Act.

The state selections listed in proposed § 484.310 are based on the described proposed randomized selection methodology and are subject to change in the CY 2016 HH PPS final rule as a result of any changes that may be made to the proposed randomized methodology in response to comments. However, if the final methodology differs from what we are proposing here, we will apply the final methodology and identify the states selected under the final methodology in the final rule. We propose to group states by each state's geographic proximity to one another and by accounting for key evaluation characteristics (that is, proportionality of service utilization, proportionality of organizations with similar tax-exempt status and HHA size, and proportionality of beneficiaries that are dually-eligible for Medicare and Medicaid).

Based on an analysis of OASIS quality data and Medicare claims data, we believe the use of nine geographic groupings is necessary to ensure that the model accounts for the diversity of beneficiary demographics, rural and urban status, and cost and quality variations, among other criteria. To provide for comparable and equitable selection probabilities, these separate geographic groupings each include a comparable number of states. We are not proposing to adopt census-based geographic groupings or the CMS Medicare Administrative Contractor (MAC) jurisdictions because those groupings would not permit an equal opportunity of selection of Medicare-certified HHAs by state or an assurance that we would be able to test the model among a diversity of agencies such as are found across the nation. Following this logic, under our proposed methodology, groupings are based on states' geographic proximity to one another, on a comparable number of states in each grouping to provide an equal opportunity of selection if randomized, and on similarities in key characteristics that would be considered in the evaluation study because those attributes represent different types of HHAs, regulatory oversight, and beneficiaries served. This is necessary to ensure that the evaluation study remains objective and unbiased and that the results of this study best represent the entire population of Medicare-certified HHAs across the nation.

Several of the key characteristics we used for grouping states into clusters for selection into the model are also used in the impact analysis of our annual HHA payment updates, a fact that reinforces their relevance for evaluation. The additional proposed standards for grouping (level of utilization and socioeconomic status of patients) are also important to consider when evaluating the model because of their current policy relevance. Large variations in the level of utilization of the home health benefit have received attention from policymakers concerned with achieving high-value health care and curbing fraud and abuse. Policymakers' concerns about the role of beneficiary-level characteristics as determinants of resource use and health care quality were highlighted in the Affordable Care Act, which mandated a study of access to home health care for vulnerable populations; more recently, the Improving Medicare Post-Acute Care Transformation (IMPACT) Act of 2014 required the Secretary to study the relationship between individuals' socioeconomic status and resource use or quality. The parameters used to define each geographic grouping are further described in the next three sections.

See MedPAC Report to Congress: Medicare Payment Policy (March 2014, Chapter 9) available at http://medpac.gov/documents/reports/mar14_entirereport.pdf. See also the Institute of Medicine Interim Report of the Committee on Geographic Variation in Health Care Spending and Promotion of High-Value Health Care: Preliminary Committee Observations (March 2013) available at http://iom.edu/Reports/2013/Geographic-Variation-in-Health-Care-Spending-and-Promotion-of-High-Care-Value-Interim-Report.aspx.

Section 3131(d) of the Affordable Care Act.

Improving Medicare Post-acute Care Transformation (IMPACT) Act of 2014 (Public Law 113-185).

a. Geographic Proximity

Under the proposed methodology, in order to ensure that the Medicare-certified HHAs that would be required to participate in the model are not all in one region of the country, the states in each grouping are adjacent to each other whenever possible while still creating logical groupings of states based on the common characteristics described above. Specifically, analysis of quality data and claims data found that HHAs in neighboring states tend to hold certain characteristics in common, including similar patterns of utilization, proportions of non-profit agencies, and types of beneficiaries served (for example, severity, number and type of co-morbidities, and socio-economic status). Therefore, the proposed groupings of states are delineated according to states' geographic proximity to one another and common characteristics as a means of permitting greater comparability. In addition, each of the groupings retains similar types of characteristics when compared to any other grouping of states.

b. Comparable Number of States in Each Grouping

Under our proposed randomized selection methodology, each geographic region, or grouping, has a similar number of states. Because each grouping contains either five or six states and one state would be selected from each grouping, all states would have a 16.7 percent (one in six) to 20 percent (one in five) chance of being selected under our proposed methodology, and Medicare-certified HHAs would have a similar likelihood of being required to compete in the model under this sampling design. We assert that this sampling design would ensure that no single entity is singled out for selection, since all states and Medicare-certified HHAs would have approximately the same chance of being selected. In addition, this sampling approach would mitigate the opportunity for HHAs to self-select into the model and thereby bias the results of the test.

c. Characteristics of State Groupings

Without sacrificing an equal opportunity for selection, the proposed state groupings are intended to ensure that important characteristics of Medicare-certified HHAs that deliver care within state boundaries can be used to evaluate the primary intervention with greater generalizability and representativeness of the entire population of Medicare-certified HHAs in the nation. Data analysis of these characteristics employed the full data set of Medicare claims and OASIS quality data. Although some characteristics, such as beneficiary age and case-mix, show some variation from one state to another, other important characteristics vary substantially and could influence how HHAs respond to the incentives of the model. Specifically, home health services utilization rates, tax-exemption status of the provider, the socioeconomic status of beneficiaries (as measured by the proportion of dually-eligible beneficiaries), and agency size (as measured by average number of episodes of care per HHA) are important characteristics that could influence outcomes of the model. Accordingly, we intend to study the impacts of these characteristics for purposes of designing future value-based purchasing models and programs. These characteristics and expected variations must be considered in the evaluation study to enable us to avoid erroneous inferences about how different types of HHAs will respond to HHVBP incentives.

Under this proposed state selection methodology, state groupings reflect regional variations that enhance the generalizability of the model. In line with this methodology, each grouping includes states that are similar in at least one important aforementioned characteristic while being geographically located in close proximity to one another. Using the criteria described above, the following geographic groupings were identified using Medicare claims-based data from calendar years 2013-2014. Each of the 50 states was assigned to one of the following geographic groups:

  • Group #1: (VT, MA, ME, CT, RI, NH)

States in this group tend to have larger HHAs and have average utilization relative to other states.

  • Group #2: (DE, NJ, MD, PA, NY)

States in this group tend to have larger HHAs, have lower utilization, and provide care to an average number of dually-eligible beneficiaries relative to other states.

  • Group #3: (AL, GA, SC, NC, VA)

States in this group tend to have larger HHAs, have average utilization rates, and provide care to a high proportion of minorities relative to other states.

  • Group #4: (TX, FL, OK, LA, MS)

States in this group have HHAs that tend to be for-profit, have very high utilization rates, and have a higher proportion of dually-eligible beneficiaries relative to other states.

  • Group #5: (WA, OR, AK, HI, WY, ID)

States in this group tend to have smaller HHAs, have average utilization rates, and are more rural relative to other states.

  • Group #6: (NM, CA, NV, UT, CO, AZ)

States in this group tend to have smaller HHAs, have average utilization rates, and provide care to a high proportion of minorities relative to other states.

  • Group #7: (ND, SD, MT, WI, MN, IA)

States in this group tend to have smaller HHAs, have very low utilization rates, and are more rural relative to other states.

  • Group #8: (OH, WV, IN, MO, NE, KS)

States in this group tend to have HHAs that are of average size, have average utilization rates, and provide care to a higher proportion of dually-eligible beneficiaries relative to other states.

  • Group #9: (IL, KY, AR, MI, TN)

States in this group tend to have HHAs with higher utilization rates relative to other states.

d. Randomized Selection of States

Upon careful consideration of the aforementioned alternative selection methodologies, including selecting states on a non-random basis, we are proposing to use a selection methodology based on a randomized sampling of states within each of the nine regional groupings described above. We examined data on the evaluation elements listed in this section to determine whether specific states could be identified to fulfill the needs of the evaluation. After careful review, we determined that each evaluation element could be measured by more than one state. As a result, we determined that it was necessary to apply a fair method of selection in which each state would have a comparable opportunity of being selected and which would fulfill the need for a robust evaluation. The proposed nine groupings of states described in this section permit the model to capture the essential elements of the evaluation, including demographic, geographic, and market factors.

The randomized sampling of states is conducted without bias toward the characteristics of any single state within any specific regional grouping; no states are excluded, and no state appears more than once across any of the groupings. The randomized selection of states was completed using a scientifically accepted computer algorithm designed for randomized sampling. The selection was run on each of the previously described regional groupings using exactly the same process and, therefore, reflects a commonly accepted method of randomized sampling. The algorithm employs the aforementioned sampling parameters necessary for randomized sampling and involves no human interaction once it is run.

Based on this sampling methodology, SAS Enterprise Guide (SAS EG) 5.1 software was used to run the computer algorithm that randomly selected states from each grouping. SAS EG 5.1 represents an industry standard for generating advanced analytics and provided a rigorous, standardized tool with which to satisfy the requirements of randomized selection. The key SAS commands employed include a “PROC SURVEYSELECT” statement coupled with the “METHOD=SRS” option, which specifies simple random sampling as the sample selection method. A random number seed, generated from the time of day on the computer's clock, was used for random number generation. Note that no stratification was used within any of the nine geographically diverse groupings, to ensure that there is an equal probability of selection within each grouping. For more information on this procedure and the underlying statistical methodology, please reference SAS support documentation at: http://support.sas.com/documentation/cdl/en/statug/63033/HTML/default/viewer.htm#statug_surveyselect_sect003.htm/.
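
For illustration only, the following Python sketch shows the equivalent logic of drawing one state at random from each of the nine groupings with equal probability within each grouping. It is not the SAS program described above, which used a PROC SURVEYSELECT statement with the METHOD=SRS option and a clock-based seed; the function and variable names here are assumptions used solely to demonstrate the sampling design.

import random
import time

# The nine proposed geographic groupings described in section c above.
GROUPS = {
    1: ['VT', 'MA', 'ME', 'CT', 'RI', 'NH'],
    2: ['DE', 'NJ', 'MD', 'PA', 'NY'],
    3: ['AL', 'GA', 'SC', 'NC', 'VA'],
    4: ['TX', 'FL', 'OK', 'LA', 'MS'],
    5: ['WA', 'OR', 'AK', 'HI', 'WY', 'ID'],
    6: ['NM', 'CA', 'NV', 'UT', 'CO', 'AZ'],
    7: ['ND', 'SD', 'MT', 'WI', 'MN', 'IA'],
    8: ['OH', 'WV', 'IN', 'MO', 'NE', 'KS'],
    9: ['IL', 'KY', 'AR', 'MI', 'TN'],
}

def select_states(groups=GROUPS, seed=None):
    """Draw one state from each grouping by simple random sampling:
    equal probability within a grouping, no stratification beyond the
    grouping itself, and no human interaction once the draw is run."""
    rng = random.Random(time.time() if seed is None else seed)
    return {g: rng.choice(states) for g, states in sorted(groups.items())}

print(select_states())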

In § 484.310, we propose to codify the names of the states selected utilizing this proposed methodology, where one state from each of the nine groupings was selected. For each of these groupings, we propose to use state borders to demarcate which Medicare-certified HHAs would be required to compete in this model: Massachusetts was randomly selected from Group 1, Maryland was randomly selected from Group 2, North Carolina was randomly selected from Group 3, Florida was randomly selected from Group 4, Washington was randomly selected from Group 5, Arizona was randomly selected from Group 6, Iowa was randomly selected from Group 7, Nebraska was randomly selected from Group 8, and Tennessee was randomly selected from Group 9. Thus, if our methodology is finalized as proposed, all Medicare-certified HHAs that provide services in Massachusetts, Maryland, North Carolina, Florida, Washington, Arizona, Iowa, Nebraska, and Tennessee will be required to compete in this model.

However, should the methodology we propose in this rule change as a result of comments received during the rulemaking process, it could result in different states being selected for the model. In such an event, we would apply the final methodology and announce the selected states in the final rule. We therefore seek comment from all interested parties in every state on the randomized selection methodology proposed above and codified at § 484.310.

Based on the comments received on this proposed rule, the selection methodology for participation in the model may change from one based on state boundaries to an approach based on sub-state regions built from CSAs/MSAs, CBSAs, rural provider-level areas, or HRRs. In that case, the goals of the model would remain the same, and we would therefore expect to take a broadly similar approach to selecting participating regions as the approach that would be taken when regions are defined based on state boundaries. Specifically, as with the selection methodology outlined above, we would anticipate grouping sub-state regions together based on geographic proximity and other characteristics into groups of approximately equal size and then selecting some number of sub-state regions to participate from each group. The number of selected participants would depend on the selection methodology. We welcome public comment on these proposed selection methodologies.

e. Use of CMS Certification Numbers (CCNs)

We are proposing that Total Performance Scores (TPS) and payment adjustments would be calculated based on an HHA's CCN and, therefore, based only on services provided in the selected states. The exception to this methodology is where an HHA provides services in a selected state that has a reciprocal agreement with another state; services provided by the HHA to beneficiaries who reside in the other state would be included in the TPS and subject to payment adjustments. A reciprocal agreement between states allows an HHA to provide services to a beneficiary across state lines using its original CCN. Reciprocal agreements are rare; based on the most recent Medicare claims data from 2014, less than 0.1 percent of beneficiaries receiving home health services were served by CCNs operating under reciprocal agreements across state lines. Due to the very low number of beneficiaries served across state borders as a result of these agreements, we expect there to be an inconsequential impact if we were to include these beneficiaries in the model.

HHAs are required to report OASIS data and any other quality measures under their own unique CMS Certification Numbers (CCNs) as defined under Title 42, Chapter IV, Subchapter G, part 484.20, available at http://www.ecfr.gov/cgi-bin/text-idx?tpl=/ecfrbrowse/Title42/42cfr484_main_02.tpl.

See Chapter 2 of the State Operations Manual (SOM), Section 2184—Operation of HHAs Cross State Lines, stating “When an HHA provides services across State lines, it must be certified by the State in which its CCN is based, and its personnel must be qualified in all States in which they provide services. The appropriate SA completes the certification activities. The involved States must have a written reciprocal agreement permitting the HHA to provide services in this manner.”

D. Performance Assessment and Payment Periods

1. Performance Reports

We are proposing the use of quarterly performance reports, annual payment adjustment reports, and annual publicly available performance reports as a means of developing greater transparency of Medicare quality data and aligning competitive forces within the market to deliver care based on value over volume. The publicly available reports would inform home health industry stakeholders (consumers, physicians, hospitals) as well as all competing HHAs delivering care to Medicare beneficiaries within selected state boundaries of their level of quality relative to both their peers and their own past performance.

Competing HHAs would be scored for the quality of care delivered under the model based on their performance on measures compared to both the performance of their peers, defined by the same size cohort (either smaller- or larger-volume cohorts as defined in § 484.305), and their own past performance on the measures. We propose in § 484.305 to define larger-volume cohort to mean the group of Medicare-certified HHAs within the boundaries of a selected state that are participating in HHCAHPS in accordance with § 484.250, and to define smaller-volume cohort to mean the group of HHAs within the boundaries of a selected state that are exempt from participation in HHCAHPS in accordance with § 484.250. Where there are too few HHAs in the smaller-volume cohort in a state to compete in a fair manner (that is, when there are only one or two HHAs competing within that cohort), those HHAs would be included in the larger-volume cohort for purposes of calculating the total performance score and payment adjustment, without being measured on HHCAHPS. We are requesting comments on this proposed methodology.
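
The following is a minimal, illustrative Python sketch of the cohort assignment just described: HHAs participating in HHCAHPS form the larger-volume cohort, HHAs exempt from HHCAHPS form the smaller-volume cohort, and a smaller-volume cohort of only one or two HHAs is folded into the larger-volume cohort without being measured on HHCAHPS. The data structure and field names are assumptions made for illustration; the actual determination would follow the definitions proposed in § 484.305.

def assign_cohorts(state_hhas, min_smaller_cohort=3):
    """Split a selected state's competing HHAs into larger- and
    smaller-volume cohorts (illustrative only).

    state_hhas: iterable of dicts with keys 'ccn' and
                'hhcahps_participant' (True if the HHA participates in
                HHCAHPS in accordance with 484.250).
    """
    larger = [h for h in state_hhas if h['hhcahps_participant']]
    smaller = [h for h in state_hhas if not h['hhcahps_participant']]

    # A smaller-volume cohort of only one or two HHAs is folded into
    # the larger-volume cohort, without HHCAHPS measurement.
    if 0 < len(smaller) < min_smaller_cohort:
        for h in smaller:
            h['measured_on_hhcahps'] = False
        larger.extend(smaller)
        smaller = []

    return larger, smaller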

Quality performance scores and relative peer rankings would be determined through the use of a baseline year (calendar year 2015) and subsequent performance periods for each competing HHA. Further, these reports would provide competing HHAs with an opportunity to track their quality performance relative to their peers and their own past performance. Using these reports provides a convenient and timely means for competing HHAs to assess and track their own respective performance as capacity is developed to improve or sustain quality over time.

Beginning with the data collected during the first quarter of CY 2016 (that is, data for the period January 1, 2016 to March 31, 2016), and for every quarter of the model thereafter, we are proposing to provide each Medicare-certified HHA with a quarterly report that contains information on its performance during the quarter. We expect to make the first quarterly report available in July 2016, and to make performance reports for subsequent quarters available in October, January, and April. The final quarterly report would be made available in April 2021. The quarterly reports would include a competing HHA's model-specific performance results with a comparison to other competing HHAs within its cohort (larger- or smaller-volume) within the state boundary. These model-specific performance results would complement the quality data already provided through the QIES system and any other quality tracking systems that HHAs may employ. We note that all performance measures that Medicare-certified HHAs will report through the QIES system are also already made available in the CASPER Reporting application. The primary difference between the two reports (CASPER reports and the model-specific performance report) is that the model-specific performance report we are proposing here consolidates the applicable performance measures used in the HHVBP model and provides a peer ranking relative to other competing Medicare-certified HHAs within the same state and size cohort. In addition, CASPER reports would provide quality data earlier than model-specific performance reports because CASPER reports are not limited by a quarterly run-out of data and the calculation of competing peer rankings. For more information on the accessibility and functionality of the CASPER system, please reference the CASPER Provider Reporting Guide.

The model-specific quarterly performance report would be made available to each HHA through a dedicated CMMI model-specific platform for data dissemination and include each HHA's relative ranking amongst its peers along with measurement scores and overall performance rankings.

We are proposing that a separate payment adjustment report would be provided once a year to each of the competing HHAs. This report would focus primarily on the payment adjustment percentage and include an explanation of when the adjustment would be applied and how this adjustment was determined relative to performance scores. Each competing HHA would receive its own payment adjustment report viewable only to that HHA.

We are also proposing a separate, annual, publicly available quality report that would provide home health industry stakeholders, including providers and suppliers that refer their patients to HHAs, with an opportunity to ensure that the beneficiaries they are referring for home health services are being provided the best possible quality of care available. We seek public comment on the proposed reporting framework described above.

2. Payment Adjustment Timeline

We propose at § 484.325 that Medicare-certified HHAs would be subject to upward or downward payment adjustments based on performance on quality measures. We propose that this model would consist of 5 performance years, where each performance year would link performance to the opportunity and risk of a payment adjustment up to an applicable percent as defined in proposed 42 CFR 484.305. The first performance year would run from January 1, 2016 through December 31, 2016, and all subsequent performance years would be assessed on an annual basis through 2020, unless modified through rulemaking. The first payment adjustment would begin January 1, 2018 and would apply to that calendar year based on CY 2016 performance data. All subsequent payment adjustments would be made on an annual basis through the conclusion of the model, unless modified through rulemaking. We are proposing that payment adjustments would increase incrementally over the course of the model, with a maximum payment adjustment of 5 percent (upward or downward) in 2018 and 2019, a maximum payment adjustment of 6 percent (upward or downward) in 2020, and a maximum payment adjustment of 8 percent (upward or downward) in 2021 and 2022. We propose to implement this model over a total of 7 years, beginning on January 1, 2016 and ending on December 31, 2022.
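
For illustration, the following Python sketch shows how a performance-based adjustment, capped at the maximum applicable percent for a given calendar year, might be applied to a payment amount. The function name, the example payment amount, and the example earned adjustments are hypothetical; the actual adjustment an HHA earns would be derived from the Total Performance Score methodology described in this proposed rule.

# Proposed maximum applicable percent, by payment adjustment year.
MAX_ADJUSTMENT = {2018: 0.05, 2019: 0.05, 2020: 0.06, 2021: 0.08, 2022: 0.08}

def apply_adjustment(payment_amount, earned_adjustment, year):
    """Apply an upward or downward HHVBP payment adjustment, capped at
    the applicable percent for the given calendar year. The earned
    adjustment itself would come from the TPS methodology (not shown)."""
    cap = MAX_ADJUSTMENT[year]
    adjustment = max(-cap, min(cap, earned_adjustment))
    return round(payment_amount * (1 + adjustment), 2)

# Hypothetical examples: a $2,500 payment with a -3 percent adjustment
# in 2018, and a +10 percent earned adjustment in 2020 (capped at 6 percent).
print(apply_adjustment(2500.00, -0.03, 2018))  # 2425.0
print(apply_adjustment(2500.00, 0.10, 2020))   # 2650.0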

The baseline year would run from January 1, 2015 through December 31, 2015 and provide a basis against which each HHA's performance would be measured in each of the performance years. Data related to performance on quality measures would continue to be provided, from the baseline year through the end of the model, using a dedicated HHVBP web-based platform specifically designed to disseminate data in this model (this “portal” would present and archive the previously described quarterly and annual quality reports). Further, HHAs would provide performance data on the four proposed New Measures through this platform as well. Any new measures employed during the model, subject to rulemaking, would use data from the previous calendar year as the baseline.

New market entrants (specifically, new Medicare-certified HHAs delivering care within the boundaries of selected states) would also be measured beginning with their first full calendar year of services in the state, which would be treated as baseline data for subsequent performance years under this model. The delivery of services would be measured by the number of episodes of care for Medicare beneficiaries and used to determine whether an HHA falls into the smaller- or larger-volume cohort. These new market entrants would begin competing under the HHVBP model in the first full calendar year following the full calendar year baseline period.

HHAs would be notified in advance of their first performance level and payment adjustment being finalized, based on the 2016 performance period (January 1, 2016 to December 31, 2016), with the first payment adjustment to be applied January 1, 2018 through December 31, 2018. Each HHA would be notified of this first pending payment adjustment on August 1, 2017, and a preview period would run for 10 days, through August 11, 2017. This preview period would provide each competing HHA an opportunity to reconcile any performance assessment issues relating to the calculation of scores before the payment adjustment takes effect, in accordance with the process proposed in section H—Preview and Period to Request Recalculation. Once the preview period ends, any changes would be reconciled and a report finalized no later than November 1, 2017 (60 days prior to the payment adjustment taking effect).

Subsequent payment adjustments would be calculated based on the applicable full calendar year of performance data from the quarterly reports, with HHAs notified and payments adjusted every year thereafter. As a sequential example, the second payment adjustment would occur January 1, 2019 based on a full 12 months of the CY 2017 performance period; notification of the adjustment would occur on August 1, 2018, with the preview period running through August 11, 2018, followed by reconciliation through September 10, 2018. Subsequent payment adjustments would continue to follow a similar timeline and process. We seek public comment on this payment adjustment schedule.

Beginning in CY 2019, we may consider revising this payment adjustment schedule and updating the payment adjustment more frequently than once each year if we determine that applying the adjustment closer in time to the performance improvement efforts made over the course of a calendar year would generate increased improvement in quality measures. Specifically, we would expect that scheduling payment adjustments closer together, through more frequent performance periods, would accelerate improvement in quality measures because HHAs would be able to justify earlier investments in quality efforts and be rewarded for improvements sooner. In effect, this concept could be operationalized to create a smoothing effect in which payment adjustments are based on overlapping 12-month performance periods that begin every 6 months rather than annually. As an example, the normal 12-month performance period occurring from January 1, 2020 to December 31, 2020 might have an overlapping 12-month performance period occurring from July 1, 2020 to June 30, 2021. Following the regularly scheduled January 1, 2022 payment adjustments, the next adjustments could be applied to payments beginning on July 1, 2022 through December 31, 2022. Depending on whether and when more frequent payment adjustments would be applied, performance would be calculated based on the applicable 12 months of performance data, with HHAs notified and payments adjusted every six months thereafter, until the conclusion of the model. As a result, separate performance periods would overlap by 6 months through the conclusion of the model. HHAs would be notified through rulemaking and given the opportunity to comment on any proposed changes to the frequency of payment adjustments. We seek public comment on the proposed payment adjustment schedule described above.
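
The following Python sketch illustrates, for the potential smoothing approach only, how overlapping 12-month performance periods beginning every 6 months could be generated; it reproduces the CY 2020 example above. This is an illustration of the concept and not a proposal of specific dates beyond those already described.

from datetime import date, timedelta

def overlapping_periods(first_start, last_start):
    """Generate 12-month performance periods that begin every 6 months,
    so consecutive periods overlap by 6 months (illustrative only;
    assumes period start dates fall on the first of a month)."""
    periods = []
    start = first_start
    while start <= last_start:
        end = date(start.year + 1, start.month, 1) - timedelta(days=1)
        periods.append((start, end))
        next_month = start.month + 6
        start = date(start.year + (next_month - 1) // 12,
                     (next_month - 1) % 12 + 1, 1)
    return periods

# Reproduces the example above: the CY 2020 period and the overlapping
# period running from July 1, 2020 through June 30, 2021.
for begin, end in overlapping_periods(date(2020, 1, 1), date(2020, 7, 1)):
    print(begin, end)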

E. Quality Measures

1. Objectives

Initially, we propose that the measures for the HHVBP model would be drawn predominantly from the current Outcome and Assessment Information Set (OASIS), which is familiar to the home health industry and readily available for use in the proposed model. In addition, the HHVBP model provides us with an opportunity to examine a broad array of quality measures that address critical gaps in care. A recent comprehensive review of the VBP experience over the past decade, sponsored by the Office of the Assistant Secretary for Planning and Evaluation (ASPE), identified several near- and long-term objectives for HHVBP measures. The recommended objectives emphasize measuring patient outcomes and functional status; appropriateness of care; and incentives for providers to build infrastructure that facilitates measurement within the quality framework. The following seven objectives derived from this study served as guiding principles for the selection of the proposed measures for the HHVBP model:

For detailed information on OASIS see the official CMS OASIS web resource available at http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/OASIS/index.html?redirect=/oasis. See also the industry resource available at http://www.oasisanswers.com/index.htm, specifically the updated OASIS component information available at www.oasisanswers.com/LiteratureRetrieve.aspx?ID=215074.

Id.

1. Use a broad measure set that captures the complexity of the HHA service provided;

2. Incorporate the flexibility to include measures proposed under the Improving Medicare Post-Acute Care Transformation (IMPACT) Act of 2014 that are cross-cutting among post-acute care settings;

3. Develop second-generation measures of patient outcomes, health and functional status, shared decision making, and patient activation;

4. Include a balance of process, outcome, and patient experience measures;

5. Advance the ability to measure cost and value;

6. Add measures for appropriateness or overuse; and,

7. Promote infrastructure investments.

2. Proposed Methodology for Selection of Quality Measures

a. Direct Alignment With National Quality Strategy Priorities

A central driver of the proposed measure selection process was incorporating innovative thinking from the field while simultaneously drawing on the most current evidence-based literature and documented best practices. Broadly, we propose measures that have a high impact on care delivery and support the combined priorities of HHS and CMS to improve health outcomes, quality, safety, efficiency, and experience of care for patients. To frame the selection process, we utilized the domains described in the CMS Quality Strategy that map to the six National Quality Strategy (NQS) priority areas (see Figure 3 for CMS domains).

b. Referenced Quality Measure Authorities

We propose at § 484.315 that Medicare-certified HHAs would be evaluated using a starter set of quality measures (“starter set” refers to the proposed quality measures for the first year of this model) designed to encompass multiple NQS domains, and provide future flexibility to incorporate and study newly developed measures over time. New and evolving measures would be considered for inclusion in subsequent years of this model and proposed through future rulemaking.

To create the proposed starter set, we began by researching the current set of OASIS measures that are being used within the home health environment. We then searched for endorsed quality measures using the National Quality Forum (NQF) Quality Positioning System (QPS), selecting measures that address all possible NQS domains. We further examined measures on the CMS-generated Measures Under Consideration (MUC) list, and reviewed other relevant measures used within the health care industry but not currently used in the home health setting, as well as proposed measures required by the IMPACT Act of 2014. Finally, we searched the National Quality Measures Clearinghouse (NQMC) to identify evidence-based measures and measure sets.

All data for the starter set measures, not including New Measures, is currently collected from HHAs under §§ 484.20 and 484.210.

The NQF Quality Positioning System is available at http://www.qualityforum.org/QPS.

c. Key Policy Considerations and Data Sources

To ensure the proposed measures for the HHVBP model take a more holistic view of the patient beyond a particular disease state or care setting, we are proposing measures, including both outcome and process measures, that have the potential to follow patients across multiple settings, reflect a multi-faceted approach, and foster the intersection of health care delivery and population health. A key consideration behind this approach is to use, in performance year one (PY1) of the model, proven measures that are readily available and meet a high-impact need, and in subsequent model years to augment this starter set with innovative measures that have the potential to be impactful and fill critical measure gap areas. All substantive changes or additions to the proposed starter set or new measures would be proposed for inclusion through future rulemaking. This approach to quality measure selection aims to balance the burden of collecting data with the inclusion of new and important measures. We carefully considered the potential burden on HHAs to report the measure data when developing the proposed starter set, and prioritized proposed measures that would draw both from claims data and from data already collected in OASIS.

The majority of the proposed measures in this model would use OASIS data currently being reported to CMS and linked to state-specific CCNs for selected states, in order to promote consistency and to reduce the data collection burden for providers. Utilizing primarily OASIS data would allow the model to leverage reporting structures already in place to evaluate performance and identify weaknesses in care delivery. This model would also afford the opportunity to study measures developed in other care settings and new to the home health industry (hereinafter referred to as “New Measures”). Many of the proposed New Measures have been used in other health care settings and are readily applicable to the home health environment (for example, influenza vaccination coverage for health care personnel). The proposed New Measures for PY1 are described in detail below. We propose in PY1 to collect data on these New Measures, which have already been tested for validity, reliability, usability/feasibility, and sensitivity in other health care settings but have not yet been validated within the home health setting. The HHVBP model will study whether their use in the home health setting meets criteria for validity, reliability, usability/feasibility, and sensitivity to statistical variation. For PY1, we propose that HHAs would earn points toward the Total Performance Score (TPS) simply for reporting data on the New Measures (see Section—Performance Scoring Methodology). To the extent we determine that one or more of the proposed New Measures is valid and reliable for the home health setting, we will consider proposing in future rulemaking to score Medicare-certified HHAs on their actual performance on the measure.

3. Proposed Measures

The initial set of measures proposed for PY1 of the model utilizes data collected via OASIS, Medicare claims, HHCAHPS survey data, and data reported directly from the HHAs to CMS. In total there are 10 process measures and 15 outcome measures (see Figure 4a), plus the four New Measures (see Figure 4b). Process measures evaluate the rate at which HHAs use specific evidence-based processes of care. Outcome measures illustrate the end result of care delivered to HHA patients. When available, NQF-endorsed measures would be used. This set of measures would be subject to change or retirement during subsequent model years and would be revised through the rulemaking process. For example, we may propose in future rulemaking to remove one or more of these measures if, based on the evidence, we conclude that it is no longer appropriate for the model because, for example, performance on it has topped out. We would also consider proposing to update the measure set if new measures that address gaps within the NQS domains become available, and proposing adjustments to the measure set based on lessons learned during the course of the model. For instance, in light of the passage of the IMPACT Act of 2014, which mandates the collection and use of standardized post-acute care assessment data, we would consider proposing in future rulemaking to adopt measures that meet the requirements of the IMPACT Act as soon as they become available.

We seek public comment on the methodology for constructing the proposed starter set of quality measures and on the proposed selected measures.

Figure 4a—PY1 Proposed Measures

For more detailed information on the proposed measures utilizing OASIS refer to the OASIS-C1/ICD-9, Changed Items & Data Collection Resources dated September 3, 2014 available at www.oasisanswers.com/LiteratureRetrieve.aspx?ID=215074. For NQF endorsed measures see The NQF Quality Positioning System available at http://www.qualityforum.org/QPS. For non-NQF measures using OASIS see links for data tables related to OASIS measures at http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HomeHealthQualityInits/HHQIQualityMeasures.html. For information on HHCAHPS measures see https://homehealthcahps.org/SurveyandProtocols/SurveyMaterials.aspx.

NQS domains Measure title Measure type Identifier Data source Numerator Denominator
Clinical Quality of Care Improvement in Ambulation-Locomotion Outcome NQF0167 OASIS (M1860) Number of home health episodes of care where the value recorded on the discharge assessment indicates less impairment in ambulation/locomotion at discharge than at the start (or resumption) of care Number of home health episodes of care ending with a discharge during the reporting period, other than those covered by generic or measure-specific exclusions.
Clinical Quality of Care Improvement in Bed Transferring Outcome NQF0175 OASIS (M1850) Number of home health episodes of care where the value recorded on the discharge assessment indicates less impairment in bed transferring at discharge than at the start (or resumption) of care Number of home health episodes of care ending with a discharge during the reporting period, other than those covered by generic or measure-specific exclusions.
Clinical Quality of Care Improvement in Bathing Outcome NQF0174 OASIS (M1830) Number of home health episodes of care where the value recorded on the discharge assessment indicates less impairment in bathing at discharge than at the start (or resumption) of care Number of home health episodes of care ending with a discharge during the reporting period, other than those covered by generic or measure-specific exclusions.
Clinical Quality of Care Improvement in Dyspnea Outcome NA OASIS (M1400) Number of home health episodes of care where the discharge assessment indicates less dyspnea at discharge than at start (or resumption) of care Number of home health episodes of care ending with a discharge during the reporting period, other than those covered by generic or measure-specific exclusions.
Clinical Quality of Care Timely Initiation of Care Process NQF0526 OASIS (M0102; M0030) Number of home health episodes of care in which the start or resumption of care date was either on the Physician-specified date or within 2 days of their referral date or inpatient discharge date whichever is later. For resumption of care, per the Medicare Condition of Participation, the patient must be seen within 2 days of inpatient discharge, even if the physician specifies a later date Number of home health episodes of care ending with discharge, death, or transfer to inpatient facility during the reporting period, other than those covered by generic or measure-specific exclusions.
Communication & Care Coordination Discharged to Community Outcome NA OASIS (M2420) Number of home health episodes where the assessment completed at the discharge indicates the patient remained in the community after discharge Number of home health episodes of care ending with discharge or transfer to inpatient facility during the reporting period, other than those covered by generic or measure-specific exclusions.
Communication & Care Coordination Care Management: Types and Sources of Assistance Process NA OASIS (M2102) Multiple data elements Multiple data elements.
Efficiency & Cost Reduction Acute Care Hospitalization: Unplanned Hospitalization during first 60 days of Home Health; Hospitalization during first 30 days of Home Health Outcome NQF0171; NQF2380 (Under review for Home Health) CCW (Claims) Number of home health stays for patients who have a Medicare claim for an admission to an acute care hospital in the 60 days following the start of the home health stay Number of home health stays that begin during the 12-month observation period. A home health stay is a sequence of home health payment episodes separated from other home health payment episodes by at least 60 days.
Efficiency & Cost Reduction Emergency Department Use without Hospitalization Outcome NQF0173 CCW (Claims) Number of home health stays for patients who have a Medicare claim for outpatient emergency department use and no claims for acute care hospitalization in the 60 days following the start of the home health stay Number of home health stays that begin during the 12-month observation period. A home health stay is a sequence of home health payment episodes separated from other home health payment episodes by at least 60 days.
Patient Safety Pressure Ulcer Prevention and Care Process NQF0538 OASIS (M1300; M2400) Number of home health episodes during which interventions to prevent pressure ulcers were included in the Physician-ordered plan of care and implemented (since the previous OASIS assessment) Number of home health episodes of care ending with discharge, or transfer to inpatient facility during the reporting period, other than those covered by generic or measure-specific exclusions.
Patient Safety Improvement in Pain Interfering with Activity Outcome NQF0177 OASIS (M1242) Number of home health episodes of care where the value recorded on the discharge assessment indicates less frequent pain at discharge than at the start (or resumption) of care Number of home health episodes of care ending with a discharge during the reporting period, other than those covered by generic or measure-specific exclusions.
Patient Safety Improvement in Management of Oral Medications Outcome NQF0176 OASIS (M2020) Number of home health episodes of care where the value recorded on the discharge assessment indicates less impairment in taking oral medications correctly at discharge than at start (or resumption) of care Number of home health episodes of care ending with a discharge during the reporting period, other than those covered by generic or measure-specific exclusions
Patient Safety Multifactor Fall Risk Assessment Conducted for All Patients who Can Ambulate Process NQF0537 OASIS (M1910) Number of home health episodes in which patients had a multi-factor fall risk assessment at start/resumption of care Number of home health episodes of care ending with discharge, death, or transfer to inpatient facility during the reporting period, other than those covered by generic or measure-specific exclusions.
Patient Safety Prior Functioning ADL/IADL Outcome NQF0430 OASIS (M1900) The number (or proportion) of a clinician's patients in a particular risk adjusted diagnostic category who meet a target threshold of improvement in Daily Activity (that is, ADL and IADL) functioning All patients in a risk adjusted diagnostic category with a Daily Activity goal for an episode of care. Cases to be included in the denominator could be identified based on ICD-9 codes or, alternatively, based on CPT codes relevant to treatment goals focused on Daily Activity function.
Patient & Caregiver-Centered Experience Care of Patients Outcome CAHPS NA NA.
Patient & Caregiver-Centered Experience Communications between Providers and Patients Outcome CAHPS NA NA.
Patient & Caregiver-Centered Experience Specific Care Issues Outcome CAHPS NA NA.
Patient & Caregiver-Centered Experience Overall rating of home health care Outcome CAHPS NA NA.
Patient & Caregiver-Centered Experience Willingness to recommend the agency Outcome CAHPS NA NA.
Population/Community Health Depression Assessment Conducted Process NQF0518 OASIS (M1730) Number of home health episodes in which patients were screened for depression (using a standardized depression screening tool) at start/resumption of care Number of home health episodes of care ending with discharge, death, or transfer to inpatient facility during the reporting period, other than those covered by generic or measure-specific exclusions.
Population/Community Health Influenza Vaccine Data Collection Period: Does this episode of care include any dates on or between October 1 and March 31? Process NA OASIS (M1041) NA NA.
Population/Community Health Influenza Immunization Received for Current Flu Season Process NQF0522 OASIS (M1046) Number of home health episodes during which patients (a) received vaccination from the HHA or (b) had received vaccination from HHA during earlier episode of care, or (c) was determined to have received vaccination from another provider Number of home health episodes of care ending with discharge, or transfer to inpatient facility during the reporting period, other than those covered by generic or measure-specific exclusions.
Population/Community Health Pneumococcal Polysaccharide Vaccine Ever Received Process NQF0525 OASIS (M1051) Number of home health episodes during which patients were determined to have ever received Pneumococcal Polysaccharide Vaccine (PPV) Number of home health episodes of care ending with discharge or transfer to inpatient facility during the reporting period, other than those covered by generic or measure-specific exclusions.
Population/Community Health Reason Pneumococcal vaccine not received Process NA OASIS (M1056) NA NA.
Clinical Quality of Care Drug Education on All Medications Provided to Patient/Caregiver during all Episodes of Care Process NA OASIS (M2015) Number of home health episodes of care during which patient/caregiver was instructed on how to monitor the effectiveness of drug therapy, how to recognize potential adverse effects, and how and when to report problems (since the previous OASIS assessment) Number of home health episodes of care ending with a discharge or transfer to inpatient facility during the reporting period, other than those covered by generic or measure-specific exclusions.


Figure 4b—PY1 Proposed New Measures

NQS domains Measure title Measure type Identifier Data source Numerator Denominator
Patient Safety Adverse Event for Improper Medication Administration and/or Side Effects Outcome NA Reported by HHAs through Web Portal Number of home health episodes of care where the discharge/transfer assessment indicated the patient required emergency treatment from a hospital emergency department related to improper administration or medication side effects (adverse drug reactions) Number of home health episodes of care ending with a discharge during the reporting period, other than those covered by generic or measure-specific exclusions.
Population/Community Health Influenza Vaccination Coverage for Home Health Care Personnel Process NQF0431 (Used in other care settings, not Home Health) Reported by HHAs through Web Portal Healthcare personnel in the denominator population who during the time from October 1 (or when the vaccine became available) through March 31 of the following year: (a) Received an influenza vaccination administered at the healthcare facility, or reported in writing or provided documentation that influenza vaccination was received elsewhere: Or (b) were determined to have a medical contraindication/condition of severe allergic reaction to eggs or to other components of the vaccine or history of Guillain-Barre Syndrome within 6 weeks after a previous influenza vaccination; or (c) declined influenza vaccination; or (d) persons with unknown vaccination status or who do not otherwise meet any of the definitions of the above-mentioned numerator categories Number of healthcare personnel who are working in the healthcare facility for at least 1 working day between October 1 and March 31 of the following year, regardless of clinical responsibility or patient contact.
Population/Community Health Herpes zoster (Shingles) vaccination: Has the patient ever received the shingles vaccination? Process NA Reported by HHAs through Web Portal Total number of Medicare beneficiaries aged 60 years and over who report having ever received zoster vaccine (shingles vaccine) Total number of Medicare beneficiaries aged 60 years and over receiving services from the HHA.
Communication & Care Coordination Advanced Care Plan Process NQF0326 Reported by HHAs through Web Portal Patients who have an advance care plan or surrogate decision maker documented in the medical record or documentation in the medical record that an advanced care plan was discussed but the patient did not wish or was not able to name a surrogate decision maker or provide an advance care plan All patients aged 65 years and older.

4. Additional Information on HHCAHPS

Figure 5 provides details on the elements of the Home Health Care Consumer Assessment of Healthcare Providers and Systems Survey (HHCAHPS) that we propose to include in the PY1 starter set. The HHVBP model would not alter the current HHCAHPS scoring methodology or participation requirements in any way. Details on participation requirements for HHCAHPS can be found at 42 CFR 484.250, and details on the HHCAHPS scoring methodology are available at https://homehealthcahps.org/SurveyandProtocols/SurveyMaterials.aspx.

76 FR 68606, Nov. 4, 2011, as amended at 77 FR 67164, Nov. 8, 2012; 79 FR 66118, Nov. 6, 2014.

Detailed scoring information is contained in the Protocols and Guidelines manual posted on the HHCAHPS Web site and available at https://homehealthcahps.org/Portals/0/PandGManual_NOAPPS.pdf.

Figure 5—Home Health Care Consumer Assessment of Healthcare Providers and Systems Survey (HHCAHPS) Composites

Response categories
Care of Patients:
Q9. In the last 2 months of care, how often did home health providers from this agency seem informed and up-to-date about all the care or treatment you got at home? Never, Sometimes, Usually, Always.
Q16. In the last 2 months of care, how often did home health providers from this agency treat you as gently as possible? Never, Sometimes, Usually, Always.
Q19. In the last 2 months of care, how often did home health providers from this agency treat you with courtesy and respect? Never, Sometimes, Usually, Always.
Q24. In the last 2 months of care, did you have any problems with the care you got through this agency? Yes, No.
Communications Between Providers & Patients:
Q2. When you first started getting home health care from this agency, did someone from the agency tell you what care and services you would get? Yes, No.
Q15. In the past 2 months of care, how often did home health providers from this agency keep you informed about when they would arrive at your home? Never, Sometimes, Usually, Always.
Q17. In the past 2 months of care, how often did home health providers from this agency explain things in a way that was easy to understand? Never, Sometimes, Usually, Always.
Q18. In the past 2 months of care, how often did home health providers from this agency listen carefully to you? Never, Sometimes, Usually, Always.
Q22. In the past 2 months of care, when you contacted this agency's office did you get the help or advice you needed? Yes, No.
Q23. When you contacted this agency's office, how long did it take for you to get the help or advice you needed? Same day; 1 to 5 days; 6 to 14 days; More than 14 days.
Specific Care Issues:
Q3. When you first started getting home health care from this agency, did someone from the agency talk with you about how to set up your home so you can move around safely? Yes, No.
Q4. When you started getting home health care from this agency, did someone from the agency talk with you about all the prescription medicines you are taking? Yes, No.
Q5. When you started getting home health care from this agency, did someone from the agency ask to see all the prescription medicines you were taking? Yes, No.
Q10. In the past 2 months of care, did you and a home health provider from this agency talk about pain? Yes, No.
Q12. In the past 2 months of care, did home health providers from this agency talk with you about the purpose for taking your new or changed prescription medicines? Yes, No.
Q13. In the last 2 months of care, did home health providers from this agency talk with you about when to take these medicines? Yes, No.
Q14. In the last 2 months of care, did home health providers from this agency talk with you about the important side effects of these medicines? Yes, No.
Global Type Measures:
What is your overall rating of your home health care? Use a rating scale (1-10).
Would you be willing to recommend this home health agency to family and friends? Never, Sometimes, Usually, Always.

5. New Measures

As discussed in the previous section, the New Measures we propose are not currently reported by Medicare-certified HHAs to CMS, but we believe they fill gaps in the NQS domains not completely covered by existing measures in the home health setting. All Medicare-certified HHAs in selected states, regardless of cohort size or number of episodes, will be required to submit data on the New Measures for all Medicare beneficiaries to whom they provide home health services within the state (unless an exception applies). We propose at § 484.315 that HHAs will be required to report data on these New Measures. Competing Medicare-certified HHAs would submit data through a dedicated HHVBP web-based platform, which would function as a means to collect information from, and distribute information to, competing Medicare-certified HHAs. Also, for those HHAs with a sufficient number of episodes of care to be subject to a payment adjustment, the New Measures scores included in the final TPS for PY1 would be based only on whether the HHA has submitted data to the HHVBP web-based platform. We are proposing the following New Measures for competing Medicare-certified HHAs:

  • Advance Care Planning;
  • Adverse Event for Improper Medication Administration and/or Side Effects;
  • Influenza Vaccination Coverage for Home Health Care Personnel; and,
  • Herpes Zoster (Shingles) Vaccination received by HHA patients.

a. Advance Care Planning

Advance Care Planning is an NQF-endorsed process measure in the NQS domain of Person- and Caregiver-centered experience and outcomes (see Figure 3). This measure is currently endorsed at the group practice/individual clinician level of analysis. We believe its adoption under the HHVBP model represents an opportunity to study this measure in the home health setting. This is an especially pertinent measure for home health care to ensure that the wishes of the patient regarding their medical, emotional, or social needs are met across care settings. The Advance Care Planning measure would focus on Medicare beneficiaries, including dually-eligible beneficiaries.

The measure would be numerically expressed by a ratio whose numerator and denominator are as follows:

Numerator: The measure would calculate the percentage of patients age 18 years and older served by the HHA that have an advance care plan or surrogate decision maker documented in the clinical record or documentation in the clinical record that an advance care plan was discussed, but the patient did not wish or was not able to name a surrogate decision maker or provide an advance care plan.

A surrogate decision maker, also known as a health care proxy or agent, advocates for a patient who is unable to make decisions or speak for himself or herself about personal health care and who therefore needs someone else to provide direction in decision-making.

Denominator: All patients aged 65 years and older admitted to the HHA.

Information on this numerator and denominator would be reported by HHAs through the HHVBP web-based platform, in addition to other information related to this measure as the Secretary deems appropriate.
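
For illustration only, the following minimal Python sketch shows how such a rate could be computed from reported records. The record fields and the helper function are hypothetical assumptions for this example, not data elements of this proposed rule; for simplicity, the sketch applies the denominator's age criterion to both counts.

    # Hypothetical sketch of the Advance Care Planning measure ratio.
    # Field names are illustrative assumptions, not OASIS or HHVBP data elements.
    def advance_care_planning_rate(patients):
        """Percentage of admitted patients meeting any numerator condition:
        an advance care plan, a documented surrogate decision maker, or a
        documented discussion without a plan or surrogate."""
        denominator = [p for p in patients if p["age"] >= 65 and p["admitted_to_hha"]]
        numerator = [
            p for p in denominator
            if p.get("advance_care_plan_documented")
            or p.get("surrogate_decision_maker_documented")
            or p.get("acp_discussed_no_plan_or_surrogate")
        ]
        if not denominator:
            return None  # measure not calculable without eligible patients
        return 100.0 * len(numerator) / len(denominator)

    example = [
        {"age": 72, "admitted_to_hha": True, "advance_care_plan_documented": True},
        {"age": 80, "admitted_to_hha": True, "acp_discussed_no_plan_or_surrogate": True},
        {"age": 68, "admitted_to_hha": True},
    ]
    print(round(advance_care_planning_rate(example), 1))  # 66.7 (2 of 3 patients)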

Advance care planning ensures that the health care plan is consistent with the patient's wishes and preferences. Therefore, studying this measure within the HHA environment allows for further analysis of planning for the “what ifs” that may occur during the patient's lifetime. In addition, the use of this measure is expected to result in an increase in the number of patients with advance care plans. Increased advance care planning among the elderly is expected to result in enhanced patient autonomy and reduced hospitalizations and in-hospital deaths.

Lauren Hersch Nicholas, Ph.D., MPP et al. Regional Variation in the Association Between Advance Directives and End-of-Life Medicare Expenditures. JAMA. 2011; 306(13): 1447-1453. doi:10.1001/jama.2011.1410.

We welcome public comments on this measure's proposed adoption under the HHVBP model.

b. Adverse Event for Improper Medication Administration and/or Side Effects

Adverse Event for Improper Medication Administration and/or Side Effects is a measure that aligns with the NQS domain of Safety (specifically “medication safety”—see Figure 3) with the goal of making care safer by reducing harm caused in the delivery of care.

An adverse drug event (ADE) is an injury related to medication use. More specifically, it is “an injury resulting from medical intervention related to a drug” and “encompasses harms that occur during medical care that are directly caused by the drug including but not limited to medication errors, adverse drug reactions and overdoses.” A medication error is a mishap “that occur[s] during prescribing, transcribing, dispensing, administering, adherence, or monitoring a drug” and should be distinguished from an adverse drug reaction, which is harm directly caused by the drug at normal doses, during normal use. The National Quality Forum has included ADEs as a Serious Reportable Event (SRE) in the category of Care Management, defining said event as a “patient death or serious injury associated with a medication error (for example, errors involving the wrong drug, wrong dose, wrong patient, wrong time, wrong rate, wrong preparation, or wrong route of administration)”, noting that “. . . the high rate of medication errors resulting in injury and death makes this event important to endorse again.”

Reporting of Adverse Drug Events: Examination of a Hospital Incident Reporting System. Radhika Desikan, Melissa J. Krauss, W. Claiborne Dunagan, Erin Christensen Rachmiel, Thomas Bailey, Victoria J. Fraser. Available at: http://www.ahrq.gov/professionals/quality-patient-safety/patient-safety-resources/resources/advances-in-patient-safety/vol1/Desikan.pdf.

The Office of Disease Prevention and Health Promotion (ODPHP), National Action Plan for ADE Prevention, available at: http://www.health.gov/hai/pdfs/ADE-Action-Plan-Executive-Summary.pdf, citing VA Center for Medication Safety and VHA Pharmacy Benefits Management Strategic Healthcare Group and the Medical Advisory Panel, Adverse Drug Events, Adverse Drug Reactions and Medication Errors Frequently Asked Questions (November 2006), available at: http://www.va.gov/ms/professionals/medications/adverse_drug_reaction_faq.pdf.

VA Center for Medication Safety and VHA Pharmacy Benefits Management Strategic Healthcare Group and the Medical Advisory Panel, Adverse Drug Events, Adverse Drug Reactions and Medication Errors Frequently Asked Questions (November 2006), available at: http://www.va.gov/ms/professionals/medications/adverse_drug_reaction_faq.pdf. Note that this VA document urges that the term Adverse Drug Reaction should generally be used rather than the term “side effect” because the latter “tends to normalize the concept of injury from drugs.” This approach has been adopted in the National Action Plan for ADE Prevention, in which the term “side effects” does not appear. See: The Office of Disease Prevention and Health Promotion (ODPHP), National Action Plan for ADE Prevention, available at: http://www.health.gov/hai/pdfs/ADE-Action-Plan-Executive-Summary.pdf.

The annual incidence of ADEs in health care in the United States is high; authoritative estimates indicate that each year 400,000 preventable ADEs occur in hospitals, 800,000 in long term care settings and in excess of 500,000 among Medicare patients in outpatient settings. The cost of ADEs occurring in hospitals alone has been estimated at $5.6 billion. Older patients are particularly vulnerable to adverse drug reactions and are seven times as likely as younger persons to experience an adverse drug event requiring hospitalization. Further, we are specifically concerned that “Analyses of cost data indicate that Medicare patients experience significantly higher rates of ADEs than both privately insured and Medicaid-covered patients.” Prevention of ADEs is a national Patient Safety Priority pursuant to the ADE National Action Plan, which focuses on vulnerable population groups, one of which is the elderly. Most work on ADEs has taken place in the hospital setting. There is little available data regarding the incidence and types of ADEs occurring in home health care for the elderly under Medicare. We believe there is a critical need for such information with regard to patient safety, and we are proposing this measure to address that need.

The Institute of Medicine, Preventing Medication Errors (2006), at 5. Available at: http://books.nap.edu/openbook.php?record_id=11623&page=5.

National Quality Forum, NQF-Endorsed Measures for Patient Safety DRAFT REPORT FOR COMMENT (May 28, 2014), at 6. Available at: www.qualityforum.org/WorkArea/linkit.aspx?LinkIdentifier=id.

Emergency Hospitalizations for Adverse Drug Events in Older Americans. Daniel S. Budnitz, M.D., M.P.H., Maribeth C. Lovegrove, M.P.H., Nadine Shehab, Pharm.D., M.P.H., and Chesley L. Richards, M.D., M.P.H. N Engl J Med 2011; 365:2002-2012. Available at: http://www.nejm.org/doi/full/10.1056/NEJMsa1103053.

The Office of Disease Prevention and Health Promotion (ODPHP), National Action Plan for ADE Prevention, available at: http://www.health.gov/hai/pdfs/ADE-Action-Plan-Executive-Summary.pdf.

The measure would be numerically expressed by a ratio whose numerator and denominator are as follows:

Numerator: Number of home health episodes of care where the discharge/transfer assessment indicated the patient required emergency treatment from a hospital emergency department related to improper administration or medication side effects (adverse drug reactions).

Denominator: Number of home health episodes of care ending with a discharge during the performance period. Numbers to be specifically excluded from the ratio as a measure-specific exclusion are those relating to home health episodes of care for which emergency department use, or the reason for emergency department use, is unknown at transfer or discharge. Stated otherwise, the measure would be expressed by a ratio indicating the relationship between (i) the number of emergency treatments that transferring or discharged patients sought or received for OASIS-C M2310 response 1, “Improper medication administration, adverse drug reactions, medication side effects, toxicity, anaphylaxis,” and (ii) the number of emergency treatments sought or received for one of the other reasons identified by OASIS-C M2310. Neither number would include (a) incidents where the reason checked on M2310 is “UK-Reason unknown” or (b) incidents where use of the emergency department was unknown at transfer or discharge. Data for this measure would be reported by HHAs through the dedicated HHVBP web-based platform based on OASIS-C/ICD 9/10 Items M2300 Emergent Care and M2310 Reasons for Emergent Care, in addition to other information related to this measure as the Secretary deems appropriate.
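
A minimal Python sketch of this calculation follows, assuming hypothetical episode records; the field names and value codes are illustrative only, and response “1” stands in for the M2310 response quoted above.

    # Hypothetical sketch of the adverse-event measure ratio.
    def adverse_event_rate(episodes):
        """Percentage of discharged/transferred episodes with emergent care
        related to improper medication administration or side effects
        (M2310 response 1), excluding episodes where emergency department
        use, or the reason for it, is unknown."""
        eligible = []
        for ep in episodes:
            if not ep["ended_with_discharge_or_transfer"]:
                continue
            if ep.get("m2300_emergent_care") == "UK":   # ED use unknown
                continue
            if "UK" in ep.get("m2310_reasons", []):     # reason unknown
                continue
            eligible.append(ep)
        numerator = [ep for ep in eligible if "1" in ep.get("m2310_reasons", [])]
        return 100.0 * len(numerator) / len(eligible) if eligible else None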

We welcome public comments on this measure's proposed adoption under the HHVBP model.

c. Influenza Vaccination Coverage for Home Health Care Personnel

Staff Immunizations (Influenza Vaccination Coverage among Health Care Personnel) (NQF #0431) is an NQF-endorsed measure that addresses the NQS domain of Population Health (see Figure 3). The measure is currently endorsed in Ambulatory Care: Ambulatory Surgery Center (ASC); Ambulatory Care: Clinician Office/Clinic; Dialysis Facility; Hospital/Acute Care Facility; Post-Acute/Long Term Care Facility: Inpatient Rehabilitation Facility; Post-Acute/Long Term Care Facility: Long Term Acute Care Hospital; and Post-Acute/Long Term Care Facility: Nursing Home/Skilled Nursing Facility. Home health care is one of the few remaining settings for which the measure has not been endorsed. We believe the proposed HHVBP model presents an opportunity to study this measure in the home health setting. This measure is currently reported in multiple CMS quality reporting programs, including Ambulatory Surgical Center Quality Reporting, Hospital Inpatient Quality Reporting, and Long-Term Care Hospital Quality Reporting; we believe its adoption under the proposed HHVBP model presents an opportunity for alignment across our quality programs. The documentation of staff immunizations is also a standard required by many HHA accrediting organizations. We believe that this measure would be appropriate for HHVBP because it addresses total population health across settings of care by reducing the exposure of individuals to a potentially avoidable virus.

The measure would be numerically expressed by a ratio whose numerator and denominator are as follows:

Numerator: The measure would calculate the percentage of home health care personnel who receive the influenza vaccine and would document those who do not receive the vaccine using the categories articulated below:

(1) Received an influenza vaccination administered at the health care agency, or reported in writing (paper or electronic) or provided documentation that influenza vaccination was received elsewhere; or

(2) Were determined to have a medical contraindication/condition of severe allergic reaction to eggs or to other component(s) of the vaccine, or history of Guillain-Barré Syndrome within 6 weeks after a previous influenza vaccination; or

(3) Declined influenza vaccination; or

(4) Persons with unknown vaccination status or who do not otherwise meet any of the definitions of the above-mentioned numerator categories.

Each of the above groups would be divided by the number of health care personnel who are working in the HHA for at least one working day between October 1 and March 31 of the following year, regardless of clinical responsibility or patient contact.

Denominator: This measure counts the number of home health care personnel working at the HHA during the flu season. Denominators are to be calculated separately for the following three groups:

Flu season is generally October 1 (or when the vaccine became available) through March 31 of the following year. See URL http://www.cdc.gov/flu/about/season/flu-season.htm for detailed information.

1. Employees: All persons who receive a direct paycheck from the reporting HHA (that is, on the agency's payroll);

2. Licensed independent practitioners: Include only physicians (MD, DO), advanced practice nurses, and physician assistants who are affiliated with the reporting agency and who do not receive a direct paycheck from the reporting HHA; and

3. Adult students/trainees and volunteers: Include all adult students/trainees and volunteers who do not receive a direct paycheck from the reporting HHA.
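
The following Python sketch illustrates, under hypothetical personnel records, how the four numerator categories could be tallied against each of the three denominator groups; the field names and category labels are assumptions for this example only.

    # Hypothetical sketch of influenza vaccination coverage reporting.
    from collections import Counter

    GROUPS = ("employee", "licensed_independent_practitioner", "student_trainee_volunteer")
    CATEGORIES = ("vaccinated", "medical_contraindication", "declined", "unknown")

    def coverage_by_group(personnel):
        """For each denominator group (personnel working at least one day
        between October 1 and March 31), report the percentage of personnel
        falling into each numerator category."""
        results = {}
        for group in GROUPS:
            members = [p for p in personnel
                       if p["group"] == group and p["worked_during_flu_season"]]
            counts = Counter(p["status"] for p in members)
            total = len(members)
            results[group] = {
                category: (100.0 * counts.get(category, 0) / total if total else None)
                for category in CATEGORIES
            }
        return results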

This proposed measure for the HHVBP model is expected to result in increased influenza vaccination among home health professionals. Reporting health care personnel influenza vaccination status would allow HHAs to better identify and target unvaccinated personnel. Increased influenza vaccination coverage among HHA personnel would be expected to result in reduced morbidity and mortality related to influenza virus infection among patients, especially elderly and vulnerable populations.

Carman W.F., Elder A.G., Wallace L.A., et al. Effects of influenza vaccination of health-care workers on mortality of elderly people in long-term care: A randomized controlled trial. Lancet 2000; 355:93-97.

Information on the above numerator and denominator would be reported by HHAs through the HHVBP web-based platform, in addition to other information related to this measure as the Secretary deems appropriate. We welcome public comments on this measure's proposed adoption under the HHVBP model.

d. Herpes Zoster Vaccine (Shingles Vaccine) for Patients

We are proposing to adopt this measure for the HHVBP model because it aligns with the NQS goal to Promote Effective Prevention & Treatment of Chronic Disease. Currently, this proposed measure is not endorsed by NQF or collected in OASIS. However, due to the severe physical consequences of the symptoms associated with shingles, we view its adoption under the HHVBP model as an opportunity to further study this measure. The results of this analysis could provide the necessary data to meet NQF endorsement criteria. The measure would calculate the percentage of home health patients who receive the shingles vaccine and would collect the number of patients who did not receive the vaccine.

For detailed information on Shingles incidences and known complications associated with this condition see CDC information available at http://www.cdc.gov/shingles/about/overview.html.

Numerator: Equals the total number of Medicare beneficiaries aged 60 years and over who report having ever received herpes zoster vaccine (shingles vaccine) during the home health episode of care.

Denominator: Equals the total number of Medicare beneficiaries aged 60 years and over receiving services from the HHA.

The Food and Drug Administration (FDA) has approved the use of herpes zoster vaccine in adults age 50 and older. In addition, the Advisory Committee on Immunization Practices (ACIP) currently recommends that herpes zoster vaccine be routinely administered to adults age 60 years and older. In 2013, 24.2 percent of adults 60 years and older reported receiving herpes zoster vaccine to prevent shingles, an increase from 20.1 percent in 2012, yet still below the targets recommended in the HHS Healthy People 2020 initiative.

CDC. Morbidity and Mortality Weekly Report 2011; 60(44):1528.

CDC. Morbidity and Mortality Weekly Report 2015; 64(04):95-102.

Healthy People 2020: Objectives and targets for immunization and infectious diseases. Available at https://www.healthypeople.gov/2020/topics-objectives/topic/immunization-and-infectious-diseases/objectives.

The incidence of herpes zoster increases as people age, with a significant increase after age 50. Older people are more likely to experience the severe nerve pain known as post-herpetic neuralgia (PHN), the primary acute symptom of shingles infection, as well as non-pain complications, hospitalizations, and interference with activities of daily living. Studies have shown that, for adults aged 60 years or older, the vaccine's efficacy is 51.3 percent for the prevention of herpes zoster and 66.5 percent for the prevention of PHN for up to 4.9 years after vaccination. The Short-Term Persistence Substudy (STPS) followed patients 4 to 7 years after vaccination and found a vaccine efficacy of 39.6 percent for the prevention of herpes zoster and 60.1 percent for the prevention of PHN. The majority of patients reporting PHN are over age 70; vaccination of this older population would prevent most cases, followed by vaccination at age 60 and then age 50.

Yawn B.P., Saddier P., Wollan P.C., St. Sauver J.L., Kurland M.J., Sy L.S. A population-based study of the incidence and complication rate of herpes zoster before zoster vaccine introduction. Mayo Clin Proc 2007; 82:1341-9.

Lin F., Hadler J.L. Epidemiology of primary varicella and herpes zoster hospitalizations: The pre-varicella vaccine era. J Infect Dis 2000; 181:1897-905.

Schmader K.E., Johnson G.R., Saddier P., et al. Effect of a zoster vaccine on herpes zoster-related interference with functional status and health-related quality-of-life measures in older adults. J Am Geriatr Soc 2010; 58:1634-41.

Schmader K.E., Johnson G.R., Saddier P., et al. Effect of a zoster vaccine on herpes zoster-related interference with functional status and health-related quality-of-life measures in older adults. J Am Geriatr Soc 2010; 58:1634-41.

Schmader K.E., Oxman M.N., Levin M.J., Johnson G., Zhang J.H., Betts R., Morrison V.A., Gelb L., Guatelli J.C., Harbecke R., Pachucki C., Keay S., Menzies B., Griffin M.R., Kauffman C., Marques A., Toney J., Keller P.M., Li X., Chan I.S.F., Annunziato P. Persistence of the Efficacy of Zoster Vaccine in the Shingles Prevention Study and the Short-Term Persistence Substudy. Clinical Infectious Diseases 2012; 55:1320-8.

Studying this measure in the home health setting presents an ideal opportunity to address an at-risk population that would benefit greatly from this vaccination strategy. For example, receiving the vaccine will often reduce the course and severity of the disease and reduce the risk of post-herpetic neuralgia.

Information on the above numerator and denominator would be reported by HHAs through the HHVBP web-based platform, in addition to other information related to this measure as the Secretary deems appropriate. We welcome public comments on this measure's proposed adoption under the HHVBP model.

6. HHVBP Model's Four Classifications

As previously stated, the quality measures that we are proposing to use in the performance years are aligned with the six NQS domains: Patient and Caregiver-centered experience and outcomes; Clinical quality of care; Care coordination; Population Health; Efficiency and cost reduction; and, Safety (see Figure 6).

We propose to filter these NQS domains and the proposed HHVBP quality measures into four classifications to align directly with the measure weighting utilized in calculating payment adjustments. The four HHVBP classifications we are proposing are: Clinical Quality of Care, Outcome and Efficiency, Person- and Caregiver-Centered Experience, and New Measures reported by the HHAs.

These four classifications capture the multi-dimensional nature of health care provided by the HHA. These classifications are further defined as:

  • Classification I—Clinical Quality of Care: Measures the quality of health care services provided by eligible professionals and paraprofessionals within the home health environment.
  • Classification II—Outcome and Efficiency: Outcomes measure the end result of care provided to the beneficiary. Efficiencies measure maximizing quality and minimizing use of resources.
  • Classification III—Person- and Caregiver-Centered Experience: Measures the beneficiary and their caregivers' experience of care.
  • Classification IV—New Measures: Measures not currently reported by Medicare-certified HHAs to CMS, but that may fill gaps in the NQS Domains not completely covered by existing measures in the home health setting.

We seek public comment on our proposed measure classifications for the HHVBP model.

7. Weighting

We propose that measures within each classification will be weighted the same for purposes of the payment adjustment. We are weighting at the individual measure level and not the classification level; classifications are for organizational purposes only. We selected this approach because we did not want any one measure within a classification to be more important than another measure. This approach also ensures that a measure's weight will remain the same even if some of the measures within a classification group have no available data. Weighting will be re-examined in subsequent years of the model and will be subject to the rulemaking process.

We welcome public comments on this proposed weighting methodology under the HHVBP model.

F. Performance Scoring Methodology

1. Performance Calculation Parameters

The methodology we are proposing for assessing each HHA's total annual performance is based on a score calculated using the proposed starter set of quality measures that apply to the HHA (based on a minimum number of cases, as discussed herein). The methodology we propose would provide an assessment on a quarterly basis for each HHA and would result in an annual distribution of value-based payment adjustments among HHAs so that HHAs achieving the highest performance scores would receive the largest upward payment adjustment. The methodology we are proposing includes three primary features:

  • The HHA's Total Performance Score (TPS) would be determined using the higher of an HHA's achievement or improvement score for each measure;
  • All measures in the Clinical Quality of Care, Outcome and Efficiency, and Person and Caregiver-Centered Experience classifications will have equal weight and will account for 90 percent of the TPS (see section 2 below) regardless of the number of measures in the three classifications. Points for New Measures are awarded for submission of data on the New Measures via the HHVBP web-based platform, and withheld if data is not submitted. Data reporting for each New Measure will have equal weight and will account for 10 percent of the TPS for the first performance year; and,
  • The HHA performance score would reflect all of the measures that apply to the HHA based on a minimum number of cases defined below.

2. Considerations for Calculating the Total Performance Score

In § 484.320 we propose to calculate the TPS by adding together points awarded to Medicare-certified HHAs on the starter set of measures, including the New Measures. We considered several factors when developing the proposed performance scoring methodology for the HHVBP model. First, we believe it is important that the performance scoring methodology be straightforward and transparent to HHAs, patients, and other stakeholders. HHAs must be able to clearly understand performance scoring methods and performance expectations to maximize quality improvement efforts. The public must understand performance score methods to utilize publicly-reported information when choosing HHAs.

Second, we believe the proposed performance scoring methodology for the HHVBP model should be aligned appropriately with the quality measurements adopted for other Medicare value-based purchasing programs including those introduced in the hospital and skilled nursing home settings. This alignment would facilitate the public's understanding of quality measurement information disseminated in these programs and foster more informed consumer decision-making about their health care choices.

Third, we believe that differences in performance scores must reflect true differences in quality performance. To ensure that this point is addressed in the proposed performance scoring methodology for the HHVBP model, we assessed quantitative characteristics of the measures, including the current state of measure development, number of measures, and the number and grouping of measure classifications.

Fourth, we believe that both quality achievement and improvement must be measured appropriately in the performance scoring methodology for the HHVBP model. The proposed methodology specifies that performance scores under the HHVBP model are calculated utilizing the higher of achievement or improvement scores for each measure. The impact of performance scores utilizing achievement and improvement on HHAs' behavior and the resulting payment implications was also considered. Using the higher of achievement or improvement scores allows the model to recognize HHAs that have made great improvements, though their measured performance score may still be relatively lower in comparison to other HHAs.

Fifth, through careful measure selection we intend to eliminate, or at least control for, unintended consequences such as undermining better outcomes to patients or rewarding inappropriate care. As discussed above, when available, NQF endorsed measures would be used. In addition we propose to adopt measures that we believe are closely associated with better outcomes in the HHA setting in order to incentivize genuine improvements and sustain positive achievement while retaining the integrity of the model.

Sixth, we intend to ensure the model utilizes the most currently available data to assess HHA performance. We recognize that these data would not be available instantaneously due to the time required to process quality measurement information accurately; however, we intend to make every effort to process data in the timeliest fashion. Using more current data would result in a more accurate performance score while recognizing that HHAs need time to report measure data.

3. Additional Considerations for the Proposed HHVBP Total Performance Scores

Many of the key elements of the proposed HHVBP model performance scoring methodology would be aligned with the scoring methodology of the Hospital Value-Based Purchasing Program (HVBP) in order to leverage the rigorous analysis and review underpinning that Program's approach to value-based purchasing in the hospital sector. The HVBP Program includes as one of its core elements the scoring methodology included in the 2007 Report to Congress “Plan to Implement a Medicare Hospital Value-Based Purchasing Program” (hereinafter referred to as “The 2007 HVBP Report”). The 2007 HVBP Report describes a Performance Assessment Model with core elements that can easily be replicated for other value-based purchasing programs or models, including the HHVBP.

In the HVBP Program, the Performance Assessment Model aggregates points on the individual quality measures across different quality measurement domains to calculate a hospital's TPS. Similarly, the proposed HHVBP model would aggregate points on individual measures across four measure classifications derived from the 6 CMS/NQS domains as described above (see Figure 3) to calculate the HHA's TPS. In addition, the proposed HHVBP payment methodology is also aligned with the HVBP Program with respect to evaluating an HHA's performance on each quality measure based on the higher of an achievement or improvement score in the performance period. The proposed model is not only designed to provide incentives for HHAs to provide the highest level of quality, but also to provide incentives for HHAs to improve the care they provide to Medicare beneficiaries. By rewarding HHAs that provide high quality and/or high improvement, we believe the proposed HHVBP model would ensure that all HHAs would be incentivized to commit the resources necessary to make the organizational changes that would result in better quality.

Under the proposed model an HHA would be awarded points only for “applicable measures.” An “applicable measure” is one for which the HHA has provided at least 20 home health episodes of care per year. Points awarded for each applicable measure would be aggregated to generate a TPS. As described in the benchmark section below, HHAs would have the opportunity to receive 0 to 10 points for each measure in the Clinical Quality of Care, Outcome and Efficiency, and Person and Caregiver-Centered Experience classifications. Each measure would have equal weight regardless of the total number of measures in each of the first three classifications. In contrast, we propose to score the New Measures in a different way. For each New Measure, HHAs would receive 10 points if they report the New Measure or 0 points if they do not report the measure during the performance year. In total, the New Measures would account for 10 percent of the TPS regardless of the number of measures applied to an HHA in the other three classifications.

We propose to calculate the TPS for the HHVBP methodology similarly to the TPS calculation that has been finalized under the HVBP program. The performance scoring methodology for the HHVBP model would include determining performance standards (benchmarks and thresholds) using quality measure data from the CY 2015 baseline period, scoring HHAs based on their achievement and/or improvement with respect to those performance standards, and weighting each of the classifications by the number of measures employed, as presented in further detail in Section G below.

4. Setting Performance Benchmarks and Thresholds

For scoring HHAs' performance on measures in the proposed Clinical Quality of Care, Outcome and Efficiency, and Person and Caregiver-Centered Experience classifications, we propose that the HHVBP model would adopt an approach using several key elements from the scoring methodology set forth in the 2007 HVBP Report and the successfully implemented HVBP Program including allocating points based on achievement or improvement, and calculating those points based on industry benchmarks and thresholds.

For detailed information on HVBP scoring see http://www.medicare.gov/hospitalcompare/data/hospital-vbp.html.

In determining the achievement points for each measure, HHAs would receive points along an achievement range, which is a scale between the achievement threshold and a benchmark. We propose to calculate the achievement threshold as the median of all HHAs' performance on the specified quality measure during the baseline period and to calculate the benchmark as the mean of the top decile of all HHAs' performance on the specified quality measure during the baseline period. Unlike the HVBP Program, which uses a national sample, this model would calculate both the achievement threshold and the benchmark separately for each selected state and for each HHA cohort size. Under this proposed methodology, we would have benchmarks and achievement thresholds for both the larger-volume cohort and the smaller-volume cohort of HHAs (cohorts defined in each state based on the baseline period, which we propose to run from January 1, 2015 through December 31, 2015). Another way the HHVBP model differs from the Hospital VBP Program is that this model uses only 2015 as the baseline year for the measures included in the proposed starter set; 2015 will consistently be used as the baseline period in order to evaluate the degree of change that may occur over the multiple years of the model. In determining improvement points for each measure, we propose that HHAs would receive points along an improvement range, which is a scale indicating change between an HHA's performance during the performance period and the baseline period. In addition, as in the achievement calculation, the benchmark and threshold would be calculated separately for each state and for each HHA cohort size to ensure that HHAs would only be competing with those HHAs in their state and their size cohort. Grouping HHAs by state and size is another way that the HHVBP payment methodology differs from the HVBP.
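
As a minimal illustration of these definitions, the following Python sketch computes an achievement threshold and a benchmark from a list of baseline-period measure rates for one state-and-cohort group; exactly how the top decile would be identified is an assumption made here for illustration, not a detail specified in this proposed rule.

    # Sketch of the proposed performance standards for one state/cohort group.
    import statistics

    def achievement_threshold(baseline_rates):
        """Median of all HHAs' baseline-period performance on the measure."""
        return statistics.median(baseline_rates)

    def benchmark(baseline_rates):
        """Mean of the top decile of all HHAs' baseline-period performance.
        The top decile is approximated here as the highest 10 percent of HHAs."""
        rates = sorted(baseline_rates, reverse=True)
        top_count = max(1, len(rates) // 10)
        return statistics.mean(rates[:top_count])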

5. Calculating Achievement and Improvement Points

a. Achievement Scoring

We are proposing that achievement scoring under the HHVBP model would be based on the Performance Assessment Model set forth in the 2007 HVBP Report and as implemented under the HVBP Program. An HHA would earn 0-10 points for achievement for each measure in the Clinical Quality of Care, Outcome and Efficiency, and Person and Caregiver-Centered Experience classifications based on where its performance during the performance period falls relative to the achievement threshold and the benchmark, according to the following formula:

Achievement points = [9 * ((HHA's performance period score − achievement threshold)/(benchmark − achievement threshold))] + 0.5

All achievement points would be rounded up or down to the nearest point (for example, an achievement score of 4.555 would be rounded to 5). HHAs would receive an achievement score as follows:

  • An HHA with performance equal to or higher than the benchmark would receive the maximum of 10 points for achievement.
  • An HHA with performance equal to or greater than the achievement threshold (but below the benchmark) would receive 1-9 points for achievement, by applying the formula above.
  • An HHA with performance less than the achievement threshold would receive 0 points for achievement.
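
A minimal Python sketch of this achievement scale, consistent with the worked example in section c below (achievement threshold 0.47, benchmark 0.87), is shown here; the half-up rounding is an assumption chosen to match the preamble's example that 5.675 rounds to 6.

    # Sketch of achievement scoring on a single measure.
    import math

    def achievement_points(performance, threshold, benchmark):
        """0-10 achievement points, rounded half up to the nearest point."""
        if performance >= benchmark:
            return 10
        if performance < threshold:
            return 0
        raw = 9 * (performance - threshold) / (benchmark - threshold) + 0.5
        return math.floor(raw + 0.5)

    print(achievement_points(0.91, 0.47, 0.87))  # 10: at or above the benchmark
    print(achievement_points(0.70, 0.47, 0.87))  # 6: 5.675 rounded to 6
    print(achievement_points(0.46, 0.47, 0.87))  # 0: below the achievement threshold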

We welcome public comment on this proposed methodology for scoring HHAs on achievement under the proposed HHVBP model.

b. Improvement Scoring

In keeping with the approach used by the HVBP program, we propose that an HHA would earn 0-10 points based on how much its performance during the performance period improved from its performance on each measure in the proposed Clinical Quality of Care, Outcome and Efficiency, and Person and Caregiver-Centered Experience classifications during the baseline period. A unique improvement range for each measure would be established for each HHA that defines the difference between the HHA's baseline period score and the same state- and size-level benchmark for the measure used in the achievement scoring calculation described previously, according to the following formula:

Improvement points = [10 * ((HHA's performance period score − HHA's baseline period score)/(benchmark − HHA's baseline period score))] − 0.5

All improvement points would be rounded to the nearest point. If an HHA's performance on the measure during the performance period was:

  • Equal to or higher than the benchmark score, the HHA would receive an improvement score of 10 points;
  • Greater than its baseline period score but below the benchmark (within the improvement range), the HHA would receive an improvement score of 0-10, based on the formula above; or
  • Equal to or lower than its baseline period score on the measure, the HHA would receive 0 points for improvement.
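
A companion Python sketch of the improvement scale, consistent with the worked example in section c below (baseline 0.21, benchmark 0.87), follows; as with achievement scoring, the rounding convention is an assumption made to match the preamble's examples.

    # Sketch of improvement scoring on a single measure.
    import math

    def improvement_points(performance, baseline, benchmark):
        """0-10 improvement points along the HHA-specific improvement range."""
        if performance >= benchmark:
            return 10
        if performance <= baseline:
            return 0
        raw = 10 * (performance - baseline) / (benchmark - baseline) - 0.5
        return max(0, math.floor(raw + 0.5))  # round half up, never below 0

    print(improvement_points(0.70, 0.21, 0.87))  # 7: 6.92 rounded to 7
    # Because the higher of the two scores is used, an HHA with achievement
    # points of 6 and improvement points of 7 would receive 7 for the measure.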

We welcome public comments on this proposed methodology for scoring HHAs on improvement under the proposed HHVBP model.

c. Examples of Calculating Achievement and Improvement Scores

For illustrative purposes, we present the following examples of how the proposed performance scoring methodology would be applied in the context of the proposed measures in the Clinical Quality of Care, Outcome and Efficiency, and Person and Caregiver-Centered Experience classifications. These HHA examples were selected from an empirical database created from 2013/2014 Home Health Compare archived data, claims data, and enrollment data to support the development of the HHVBP permutation of the Performance Assessment Model, and all performance scores are calculated for the pneumonia measure, with respect to the number of individuals assessed and administered the pneumococcal vaccine.

Figure 7 shows the scoring for HHA `A', as an example. The benchmark calculated for the pneumonia measure in this case was 0.87 (the mean value of the top decile in 2013), and the achievement threshold was 0.47 (the performance of the median or the 50th percentile among HHAs in 2013). HHA A's 2014 performance rate of 0.91 during the performance period for this measure exceeds the benchmark, so HHA A would earn 10 (the maximum) points for its achievement score. The HHA's performance rate on a measure is expressed as a decimal. In the illustration, HHA A's performance rate of 0.91 means that 91 percent of the applicable patients that were assessed were given the pneumococcal vaccine. In this case, HHA A has earned the maximum number of 10 possible achievement points for this measure and thus, its improvement score is irrelevant in the calculation.

Figure 7 also shows the scoring for HHA `B'. As referenced below, HHA B's performance on this measure went from 0.21 (which was below the achievement threshold) in the baseline period to 0.70 (which is above the achievement threshold) in the performance period. Applying the achievement scale, HHA B would earn 6 points for achievement, calculated as follows: [9 * ((0.70 − 0.47)/(0.87 − 0.47))] + 0.5 = 5.675, and then rounded to 6 points.

Checking HHA B's improvement score yields the following result: Based on HHA B's period-to-period improvement, from 0.21 in the baseline year to 0.70 in the performance year, HHA B would earn 7 points, calculated as follows: [10 * ((0.70 − 0.21)/(0.87 − 0.21))] − 0.5 = 6.92, rounded to 7 points. Because the higher of the achievement and improvement scores is used, HHA B would receive 7 points for this measure.

In Figure 8, HHA `C' yielded a decline in performance on the pneumonia measure, falling from 0.57 to 0.46 (a decline of 0.11 points). HHA C's performance during the performance period is lower than the achievement threshold of 0.47 and, as a result, receives 0 points based on achievement. It also receives 0 points for improvement, because its performance during the performance period is lower than its performance during the baseline period.

6. Proposed Scoring Methodology for New Measures

The HHVBP model provides us with the opportunity to study new quality measures. The four New Measures that we have proposed to adopt for the model for PY1 would be reported directly by the HHA and would account for 10 percent of the TPS regardless of the number of measures in the other three classifications. We are proposing that HHAs that report on these measures would receive 10 points out of a maximum of 10 points for each of the 4 measures in the New Measures classification; hence, an HHA that reports on all four measures would receive 40 points out of a maximum of 40, and an HHA would receive 0 points for each measure that it fails to report. If an HHA reports on all four measures, it would receive 40 of the 40 points available for the classification and 10 points (40/40 * 10 points) would be added to its TPS because the New Measures classification has a maximum weight of 10 percent. If an HHA reports on 3 of 4 measures, it would receive 30 of the 40 points available for the classification and 7.5 points (30/40 * 10 points) would be added to its TPS. If an HHA reports on 2 of 4 measures, it would receive 20 of the 40 points available for the classification and 5.0 points (20/40 * 10 points) would be added to its TPS. If an HHA reports on 0 of 4 measures, it would receive 0 points and have no points added to its TPS. We intend to update these measures through future rulemaking to allow us to study newer, leading-edge measures as well as retire measures that no longer require such analysis. We request comment on this proposed scoring methodology for New Measures.
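
The arithmetic described above can be summarized in the following minimal Python sketch (illustrative only; the function name is not part of this proposed rule):

    # Sketch of PY1 New Measures scoring: 10 points per reported measure,
    # contributing up to 10 percent of the TPS.
    def new_measures_contribution(measures_reported, total_new_measures=4):
        """Return (classification points earned, points added to the TPS)."""
        points = 10 * measures_reported            # 0, 10, 20, 30, or 40
        max_points = 10 * total_new_measures       # 40
        tps_points = points / max_points * 10      # the 10-percent TPS component
        return points, tps_points

    print(new_measures_contribution(4))  # (40, 10.0)
    print(new_measures_contribution(3))  # (30, 7.5)
    print(new_measures_contribution(2))  # (20, 5.0)
    print(new_measures_contribution(0))  # (0, 0.0)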

7. Minimum Number of Cases for Outcome and Clinical Quality Measures

While no HHA in a selected state would be exempt from the HHVBP model, there may be periods when an HHA does not receive a payment adjustment because there is not an adequate number of episodes of care to generate sufficient quality measure data. The minimum threshold for an HHA to receive a score on a given measure is 20 home health episodes of care per year for HHAs that have been certified for at least 6 months. If an HHA does not meet this threshold to generate scores on five or more of the Clinical Quality of Care, Outcome and Efficiency, and Person and Caregiver-Centered Experience measures, no payment adjustment will be made, and the Medicare-certified HHA would be paid for HHA services in an amount equivalent to the amount it would have been paid under section 1895 of the Act.

HHVBP would follow the Home Health Compare Web site policy not to report measures on HHAs that have less than 20 observations for statistical reasons concerning the power to detect reliable differences in the quality of care.
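
As a minimal illustration of these rules, the following Python sketch (hypothetical measure names and helper functions, assumed for this example only) determines whether an HHA would be subject to a payment adjustment:

    # Sketch of the proposed minimum-case rules: a measure applies only with at
    # least 20 episodes per year, and scores on at least five applicable measures
    # are needed before a payment adjustment is made.
    def applicable_measures(episode_counts, minimum_episodes=20):
        """episode_counts maps a measure name to the HHA's annual episode count."""
        return [m for m, n in episode_counts.items() if n >= minimum_episodes]

    def subject_to_payment_adjustment(episode_counts, minimum_measures=5):
        return len(applicable_measures(episode_counts)) >= minimum_measures

    counts = {"measure_1": 25, "measure_2": 19, "measure_3": 40,
              "measure_4": 22, "measure_5": 21, "measure_6": 30}
    print(subject_to_payment_adjustment(counts))  # True: 5 of 6 measures have 20+ episodes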

HHAs with very low volumes will either increase their volume in later performance years and become subject to future payment adjustments, or their volume will remain very low and their payments would continue to be unadjusted in future years. Based on the most recent data available at this time, a very small number of HHAs report on fewer than five of the total number of measures included in the Clinical Quality of Care, Outcome and Efficiency, and Person and Caregiver-Centered Experience classifications, and these HHAs account for less than 0.5 percent of the claims made by the more than 1,900 HHAs delivering care within the nine proposed selected states. We expect very low-volume HHAs to have little impact on the model because there are few of them and because it is unlikely that an HHA would reduce its volume of services to such a low level in order to avoid a payment adjustment. Although these HHAs would not be subject to payment adjustments, they would remain in the model, have access to the same technical assistance as all other HHAs in the model, receive quality reports on any measures for which they do have 20 episodes of care, and have a future opportunity to compete for payment adjustments.

We propose the HHA's TPS would be based on all the Clinical Quality of Care, Outcome and Efficiency, Person and Caregiver-Centered Experience measures and the New Measures that apply to the HHA. As described above, each measure in the Clinical Quality of Care, Outcome and Efficiency and Person and Caregiver-Centered Experience classifications would be weighted equally. Each measure would have an equal weight relative to the total score of the three classifications regardless of the number of measures that are applicable.

As an example, HHA “A” has at least 20 episodes of care in a 12-month period for only 9 quality measures out of a possible 25 measures from the first three classifications (that is, excluding the New Measures). Under the proposed scoring methodology outlined above, HHA A would be awarded 0, 0, 3, 4, 5, 7, 7, 9, and 10 points, respectively, for these measures. HHA A's total earned points for the three classifications would be calculated by adding together all the points awarded to HHA A, resulting in a total of 45 points. HHA A's total possible points would be calculated by multiplying the number of measures for which the HHA reported at least 20 episodes (nine) by the maximum number of points for those measures (10), yielding a total of 90 possible points. HHA A's score for the three classifications would be the total earned points (45) divided by the total possible points (90), multiplied by 90 because, as discussed in section E.7, the Clinical Quality of Care, Outcome and Efficiency, and Person and Caregiver-Centered Experience classifications account for 90 percent of the TPS and the New Measures classification accounts for 10 percent of the TPS; this yields a result of 45. In this example, HHA A also reported on all four New Measures and would receive the full 10 points for the New Measures classification. As a result, the TPS for HHA A would be 55 (45 plus 10). In addition, as specified in section E.7 (Weighting), all measures have equal weights regardless of their classification (except for the New Measures), and the total earned points for the three classifications can be calculated by adding together the points awarded for each such measure. We seek public comment on our proposal of the minimum number of cases for outcome and clinical quality measures.
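
For illustration, the following minimal Python sketch reproduces the HHA “A” arithmetic above (the helper name is an assumption for this example):

    # Sketch of the TPS calculation for the HHA "A" example.
    def total_performance_score(measure_points, new_measures_reported, total_new_measures=4):
        """90-percent component from applicable measures plus the
        10-percent New Measures component."""
        possible = 10 * len(measure_points)
        classification_score = (sum(measure_points) / possible * 90) if possible else 0
        new_measures_score = new_measures_reported / total_new_measures * 10
        return classification_score + new_measures_score

    points_awarded = [0, 0, 3, 4, 5, 7, 7, 9, 10]        # 45 of 90 possible points
    print(total_performance_score(points_awarded, 4))    # 45 + 10 = 55.0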

G. The Payment Adjustment Methodology

We propose to codify at 42 CFR 484.330 a methodology for applying value-based payment adjustments to home health services under the HHVBP model. Payment adjustments would be made to the HH PPS final claim payment amount as calculated in accordance with § 484.205, using a linear exchange function (LEF) similar to the methodology utilized by the HVBP Program. The LEF is used to translate an HHA's TPS into the percentage value-based payment adjustment earned by each HHA under the HHVBP model. The LEF was identified by the HVBP Program as the simplest and most straightforward option for providing the same marginal incentives to all hospitals, and we believe the same to be true for HHAs. We propose to set the function's intercept at zero percent, meaning that HHAs with a TPS that is average relative to other HHAs in their cohort would not receive any payment adjustment. Payment adjustments for each HHA with a score above zero percent would be determined by the slope of the LEF. In addition, we propose to set the slope of the LEF for the first performance year, CY 2016, so that the estimated aggregate value-based payment adjustments for CY 2016 are equal to 5 percent of the estimated aggregate base operating episode payment amount for CY 2018. The estimated aggregate base operating episode payment amount is the total amount of episode payments made to all the HHAs by Medicare in each individual state in the larger- and smaller-volume cohorts, respectively (we are proposing nine states, which would create 18 separate aggregate base operating episode payment amounts).

Figure 9 provides an example of how the LEF is calculated and how it is applied to calculate the percentage payment adjustment to a HHA's TPS. For this example, we applied the 8 percent payment adjustment level that is proposed for the final two years of the HHVBP model. The proposed rate for the payment adjustments for other years would be proportionally less.

Step #1 involves the calculation of the `Prior Year Aggregate HHA Payment Amount' (See C2 in Figure 9) that each HHA was paid in the prior year. From claims data, all payments are summed together for each HHA for CY 2015, the year prior to the HHVBP Model.

Step #2 involves the calculation of the `8 percent Payment Reduction Amount' (C3 of Figure 9) for each HHA. The `Prior Year Aggregate HHA Payment Amount' is multiplied by the `8 percent Payment Reduction Rate'. The aggregate of the `8-percent Payment Reduction Amount' is the numerator of the LEF.

Step #3 involves the calculation of the `Final TPS Adjusted Reduction Amount' (C4 of Figure 9) by multiplying the `8-percent Payment Reduction Amount' from Step #2 by the TPS (C1) divided by 100. The aggregate of the `TPS Adjusted Reduction Amount' is the denominator of the LEF.

Step #4 involves calculating the LEF (C5 of Figure 9) by dividing the aggregate `8 percent Payment Reduction Amount' by the aggregate `TPS Adjusted Reduction Amount'.

Step #5 involves the calculation of the `Final TPS Adjusted Payment Amount' (C6 of Figure 9) by multiplying the `TPS Adjusted Reduction Amount' (C4) by the LEF (C5). This is an intermediary value used to calculate `Quality Adjusted Payment Rate'.

Step #6 involves the calculation of the `Quality Adjusted Payment Rate' (C7 of Figure 9) that the HHA would receive instead of the 8 percent reduction in payment. This is an intermediary step to determining the payment adjustment rate. For CYs 2021 and 2022, the payment adjustment in this column would range from 0 percent to 16 percent depending on the quality of care provided.

Step #7 involves the calculation of the `Final Percent Payment Adjustment' (C8 of Figure 9) that would be applied to the HHA's payments after the performance period. It simply involves subtracting the applicable CY payment adjustment percent (in 2018, 5 percent; in 2019, 5 percent; in 2020, 6 percent; in 2021, 8 percent; and in 2022, 8 percent) from the `Quality Adjusted Payment Rate'. In this example, we use the maximum 8-percent subtraction. Note that the payment adjustment percentage is capped at no more than plus or minus 8 percent for each respective performance period, and the payment adjustment would occur on the final claim payment amount.
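
The steps above can be traced end to end with the following minimal Python sketch, which uses the example TPS values and prior-year payment amounts from Figure 9; the resulting quality adjusted payment rates and final percent payment adjustments match the figure to one decimal place (the dollar amounts in column C6 of the figure were computed with the LEF rounded to 1.93, so they differ slightly from the unrounded values here).

    # Sketch of the LEF-based payment adjustment calculation (8-percent years),
    # using the example inputs shown in Figure 9.
    hhas = {
        "HHA1": (38, 100_000), "HHA2": (55, 145_000), "HHA3": (22, 800_000),
        "HHA4": (85, 653_222), "HHA5": (50, 190_000), "HHA6": (63, 340_000),
        "HHA7": (74, 660_000), "HHA8": (25, 564_000),
    }
    rate = 0.08  # maximum payment adjustment rate for this example

    reduction = {name: payment * rate for name, (tps, payment) in hhas.items()}          # C3
    tps_adjusted = {name: hhas[name][0] / 100 * c3 for name, c3 in reduction.items()}    # C4
    lef = sum(reduction.values()) / sum(tps_adjusted.values())                           # C5, about 1.93

    for name, (tps, payment) in hhas.items():
        final_adjusted = tps_adjusted[name] * lef                  # C6
        quality_adjusted_rate = final_adjusted / payment * 100     # C7, percent
        final_adjustment = quality_adjusted_rate - rate * 100      # C8, percent
        print(name, round(quality_adjusted_rate, 1), round(final_adjustment, 1))
    # e.g., HHA1 -> 5.9 and -2.1; HHA4 -> 13.1 and 5.1, as in Figure 9.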

We invite public comments on this proposed payment adjustment methodology.

Figure 9—8-Percent Reduction Sample

Column key (Steps 1 through 7 correspond to columns C2 through C8):
(C1) TPS
(C2) Prior year aggregate HHA payment * (Step 1)
(C3) 8-percent payment reduction amount (C2*8%) (Step 2)
(C4) TPS adjusted reduction amount ((C1/100)*C3) (Step 3)
(C5) Linear exchange function (LEF) (Sum of C3/Sum of C4) (Step 4)
(C6) Final TPS adjusted payment amount (C4*C5) (Step 5)
(C7) Quality adjusted payment rate ((C6/C2)*100), percent (Step 6)
(C8) Final percent payment adjustment, +/− percent (C7−8%) (Step 7)

HHA    (C1)  (C2)      (C3)     (C4)     (C5)  (C6)     (C7)   (C8)
HHA1   38    $100,000  $8,000   $3,040   1.93  $5,867   5.9    −2.1
HHA2   55    145,000   11,600   6,380    1.93  12,313   8.5    0.5
HHA3   22    800,000   64,000   14,080   1.93  27,174   3.4    −4.6
HHA4   85    653,222   52,258   44,419   1.93  85,729   13.1   5.1
HHA5   50    190,000   15,200   7,600    1.93  14,668   7.7    −0.3
HHA6   63    340,000   27,200   17,136   1.93  33,072   9.7    1.7
HHA7   74    660,000   52,800   39,072   1.93  75,409   11.4   3.4
HHA8   25    564,000   45,120   11,280   1.93  21,770   3.9    −4.1
Sum                    276,178  143,007        276,002
* Example cases.

H. Preview and Period To Request Recalculation

We are proposing to provide HHAs two separate opportunities to review scoring information under the HHVBP model. First, HHAs will have the opportunity to review their quarterly quality reports following each quarterly posting; second, Medicare-certified HHAs will have the opportunity to review their TPS and payment adjustment calculations, and request a recalculation if a discrepancy is identified due to a CMS error as described in this section. These processes would also help educate and inform each competing Medicare-certified HHA on the direct relation between the payment adjustment and performance measure scores.

The proposed model design calls for us to inform HHAs quarterly of their performance on each of the individual quality measures used to calculate the TPS. We propose that an HHA will have 10 days after the quarterly reports are provided to request a recalculation of its measure scores if it believes there is evidence of a discrepancy. We would adjust the score if it is determined that the discrepancy in the calculated measure scores was the result of our failure to follow measurement calculation protocols.

In addition, the proposed model design also calls for us to inform each Medicare-certified HHA of the TPS and payment adjustment amount in an annual report. We propose that these annual reports be provided to Medicare-certified HHAs each August prior to the calendar year for which the payment adjustment would be applied. Similar to quarterly reports, HHAs will have 10 days to request a recalculation of their TPS and payment adjustment amount from the date information is made available. For both the quarterly reports and the annual report containing the TPS and payment adjustments, Medicare-certified HHAs will only be permitted to request scoring recalculations, and must include a specific basis for the requested recalculation. We will not be responsible for providing HHAs with the underlying source data utilized to generate performance measure scores. Each HHA has access to this data via the QIES system. The final TPS and payment adjustment would then be provided to competing Medicare-certified HHAs in a final report no later than 60 days in advance of the payment adjustment taking effect.

The TPS from the annual performance report would be calculated based on the calculation of performance measures contained in the quarterly reports that have already been provided and reviewed by the HHAs. As a result, we believe that quarterly reviews would provide substantial opportunity to identify and correct errors and resolve discrepancies, thereby minimizing the challenges to the annual performance scores linked to payment adjustment.

As described above, a quarterly performance report would be provided to all Medicare-certified HHAs within the selected states, beginning with the first quarter of CY 2016 being reported in July 2016. We propose that HHAs would submit recalculation requests for both the quarterly quality performance measure reports and the TPS and payment adjustment reports via an email link provided on the model-specific Web page. The request form would be completed by a person who has authority to sign on behalf of the HHA and would be submitted within 10 days of receipt of the quarterly data report or the annual TPS and payment adjustment report.

Requests for either quarterly report measure score recalculations or TPS and payment adjustment recalculations would contain the following information:

  • The provider's name, address associated with the services delivered, and CMS Certification Number (CCN);
  • The basis for requesting recalculation to include the specific quality measure data that the HHA believes is inaccurate or the calculation the HHA believes is incorrect;
  • Contact information for a person at the HHA with whom CMS or its agent can communicate about this request, including name, email address, telephone number, and mailing address (must include physical address, not just a post office box); and,
  • A copy of any supporting documentation the HHA wishes to submit in electronic form via the model-specific Web page.

Following receipt of a request for quarterly report measure score recalculations or a request for TPS and payment adjustment recalculation, CMS or its agent would:

+ Provide an email acknowledgement, using the contact information provided in the recalculation request, to the HHA contact notifying the HHA that the request has been received;

+ Review the request to determine validity, and determine whether the requested recalculation would result in a score change altering performance measure scores or the HHA's TPS;

+ If recalculation would result in a performance measure score or TPS change, conduct a review of quality data and if an error is found, recalculate the TPS using the corrected performance data; and,

+ Provide a formal response to the HHA contact, using the contact information provided in the recalculation request, notifying the HHA of the outcome of the review and recalculation process.

Recalculation and subsequent communication of the results of these determinations would occur as soon as administratively feasible following the submission of requests. We request comment on our proposed quarterly quality report measure review, TPS preview period, and our proposed process for requesting recalculation of the quarterly performance measure scores, and the TPS and payment adjustment. We intend to codify these processes in regulation text in future rulemaking.

Additionally, we will develop and adopt an appeals mechanism under the model through future rulemaking in advance of the application of any payment adjustments.

I. Evaluation

We propose to codify at § 484.315(c) that HHAs in selected states would be required to collect and report information to CMS necessary for the purposes of monitoring and evaluating this model as required by statute. We plan to conduct an evaluation of the proposed HHVBP model in accordance with section 1115A(b)(4) of the Act, which requires the Secretary to evaluate each model tested by CMMI. We consider an independent evaluation of the model to be necessary to understand its impacts on care quality in the home health setting. The evaluation would be focused primarily on understanding how successful the model is in achieving quality improvement, as evidenced by HHAs' performance on clinical care process measures, clinical outcome measures (for example, functional status), utilization/outcome measures (for example, hospital readmission rates, emergency room visits), access to care, patients' experience of care, and Medicare costs. We also intend to examine the likelihood of unintended consequences. We intend to select an independent evaluation contractor to perform this evaluation. However, because the procurement for the selection of the evaluation contractor is in progress and is subject to the finalization of the proposed model, we cannot provide a detailed description of the evaluation methodology here.

See 1115A(b)(4) of the Act (42 U.S.C. 1315a).

We intend to use a multilevel approach to evaluation. Here, we intend to conduct analyses at the state, HHA, and patient levels. Based on the state groupings discussed in the section on selection of Medicare certified HHAs, we believe there are several ways in which we can draw comparison groups and remain open to scientifically-sound, rigorous methods for evaluating the effect of the model intervention.

The evaluation effort may require HHAs participating in the Model to collect and submit additional data specifically for evaluation purposes. Such requirements for additional data to carry out model evaluation would be in compliance with 42 CFR 403.1105 which, as of January 1, 2015, requires entities participating in the testing of a model under section 1115A to collect and report such information, including protected health information (as defined at 45 CFR 160.103), as the Secretary determines is necessary to monitor and evaluate the model. We would consider all Medicare-certified HHAs providing services within a state selected for the Model to be participating in the testing of this model because the competing HHAs would be receiving payment from CMS under the model.

79 FR 67751 through 67755.

We invite public comments on this proposed evaluation plan.

V. Proposed Provisions of the Home Health Care Quality Reporting Program (HH QRP)

A. Background and Statutory Authority

Section 1895(b)(3)(B)(v)(II) of the Act requires that for 2007 and subsequent years, each HHA submit to the Secretary in a form and manner, and at a time, specified by the Secretary, such data that the Secretary determines are appropriate for the measurement of health care quality. To the extent that an HHA does not submit data in accordance with this clause, the Secretary is directed to reduce the home health market basket percentage increase applicable to the HHA for such year by 2 percentage points. As provided at section 1895(b)(3)(B)(vi) of the Act, depending on the market basket percentage for a particular year, the 2 percentage point reduction under section 1895(b)(3)(B)(v)(I) of the Act may result in this percentage increase, after application of the productivity adjustment under section 1895(b)(3)(B)(vi)(I) of the Act, being less than 0.0 percent for a year, and may result in payment rates under the Home Health PPS for a year being less than payment rates for the preceding year.
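To illustrate the arithmetic described above, the following is a minimal sketch using assumed figures; the market basket and productivity values below are hypothetical and are not taken from this rule:

  # Hypothetical illustration only; the market basket and productivity figures
  # below are assumed for the example and are not values from this rule.
  market_basket_pct = 2.3        # assumed HH market basket percentage increase
  productivity_adj_pct = 0.5     # assumed productivity adjustment (percentage points)
  reporting_penalty_pct = 2.0    # statutory 2 percentage point reduction for non-reporting

  update_if_compliant = market_basket_pct - productivity_adj_pct          # 1.8 percent
  update_if_noncompliant = update_if_compliant - reporting_penalty_pct    # -0.2 percent

  print(update_if_compliant, update_if_noncompliant)

Under these assumed figures, a non-compliant HHA's update would fall below 0.0 percent, illustrating how the statutory reduction can produce payment rates lower than those of the preceding year.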

Section 2(a) of the Improving Medicare Post-Acute Care Transformation Act of 2014 (the IMPACT Act) (Pub. L. 113-185, enacted on Oct. 6, 2014) amended Title XVIII of the Act, in part, by adding a new section 1899B, which imposes new data reporting requirements for certain post-acute care (PAC) providers, including HHAs. New section 1899B of the Act is titled, “Standardized Post-Acute Care (PAC) Assessment Data for Quality, Payment, and Discharge Planning”. Under section 1899B(a)(1) of the Act, certain post-acute care (PAC) providers (defined in section 1899B(a)(2)(A) of the Act to include HHAs, SNFs, IRFs, and LTCHs) must submit standardized patient assessment data in accordance with section 1899B(b) of the Act, data on quality measures required under section 1899B(c)(1) of the Act, and data on resource use, and other measures required under section 1899B(d)(1) of the Act. The Act also sets out specified application dates for each of the measures. The Secretary must specify the quality, resource use, and other measures no later than the applicable specified application date defined in section 1899B(a)(2)(E) of the Act.

Section 1899B(b) of the Act describes the standardized patient assessment data that PAC providers are required to submit in accordance with section 1899B(b)(1) of the Act; requires the Secretary, to the extent practicable, to match claims data with standardized patient assessment data in accordance with section 1899B(b)(2) of the Act; and requires the Secretary, as soon as practicable, to revise or replace existing patient assessment data to the extent that such data duplicate or overlap with standardized patient assessment data, in accordance with section 1899B(b)(3) of the Act.

Sections 1899B(c)(1) and (d)(1) of the Act direct the Secretary to specify measures that relate to at least five stated quality domains and three stated resource use and other measure domains. Section 1899B(c)(1) of the Act provides that the quality measures on which PAC providers, including HHAs, are required to submit standardized patient assessment data and other necessary data specified by the Secretary must be in accordance with, at least, the following domains:

  • Functional status, cognitive function, and changes in function and cognitive function;
  • Skin integrity and changes in skin integrity;
  • Medication reconciliation;
  • Incidence of major falls; and
  • Accurately communicating the existence of and providing for the transfer of health information and care preferences of an individual to the individual, family caregiver of the individual, and providers of services furnishing items and services to the individual when the individual transitions (1) from a hospital or Critical Access Hospital (CAH) to another applicable setting, including a PAC provider or the home of the individual, or (2) from a PAC provider to another applicable setting, including a different PAC provider, hospital, CAH, or the home of the individual.

Section 1899B(c)(2)(A) provides that, to the extent possible, the Secretary must require such reporting through the use of a PAC assessment instrument and modify the instrument as necessary to enable such use.

Section 1899B(d)(1) of the Act provides that the resource use and other measures on which PAC providers, including HHAs, are required to submit any necessary data specified by the Secretary, which may include standardized assessment data in addition to claims data, must be in accordance with, at least, the following domains:

  • Resource use measures, including total estimated Medicare spending per beneficiary;
  • Discharge to community; and
  • Measures to reflect all-condition risk-adjusted potentially preventable hospital readmission rates.

Sections 1899B(c) and (d) of the Act indicate that data satisfying the eight measure domains in the IMPACT Act constitute the minimum data reporting requirement. Therefore, the Secretary may specify additional measures and additional domains.

Section 1899B(e)(1) of the Act requires that the Secretary implement the quality, resource use, and other measures required under sections 1899B(c)(1) and (d)(1) of the Act in phases consisting of measure specification, data collection, and data analysis; the provision of feedback reports to PAC providers in accordance with section 1899B(f) of the Act; and public reporting of PAC providers' performance on such measures in accordance with section 1899B(g) of the Act. Section 1899B(e)(2) of the Act generally requires that each measure specified by the Secretary under section 1899B of the Act be NQF-endorsed, but authorizes an exception under which the Secretary may select non-NQF-endorsed quality measures in the case of specified areas or medical topics determined appropriate by the Secretary for which a feasible or practical measure has not been endorsed by the NQF, as long as due consideration is given to measures that have been endorsed or adopted by a consensus organization identified by the Secretary. Section 1899B(e)(3) of the Act provides that the pre-rulemaking process required by section 1890A of the Act applies to quality, resource use, and other measures specified under sections 1899B(c)(1) and (d)(1) of the Act, but authorizes exceptions under which the Secretary may (1) use expedited procedures, such as ad hoc reviews, as necessary in the case of a measure required with respect to data submissions during the 1-year period before the applicable specified application date, or (2) alternatively, waive section 1890A of the Act in the case of such a measure if applying section 1890A of the Act (including through the use of expedited procedures) would result in the inability of the Secretary to satisfy any deadline specified under section 1899B of the Act with respect to the measure.

Section 1899B(f)(1) of the Act requires the Secretary to provide confidential feedback reports to PAC providers on the performance of such PAC providers with respect to quality, resource use, and other measures required under sections 1899B(c)(1) and (d)(1) of the Act beginning 1 year after the applicable specified application date.

Section 1899B(g) of the Act requires the Secretary to establish procedures for making available to the public information regarding the performance of individual PAC providers with respect to quality, resource use, and other measures required under sections 1899B(c)(1) and (d)(1) beginning not later than 2 years after the applicable specified application date. The procedures must ensure, including through a process consistent with the process applied under section 1886(b)(3)(B)(viii)(VII) for similar purposes, that each PAC provider has the opportunity to review and submit corrections to the data and information that are to be made public with respect to the PAC provider prior to such data being made public.

Section 1899B(h) of the Act sets out requirements for removing, suspending, or adding quality, resource use, and other measures required under sections 1899B(c)(1) and (d)(1) of the Act. In addition, section 1899B(j) of the Act requires the Secretary to allow for stakeholder input, such as through town halls, open door forums, and mailbox submissions, before the initial rulemaking process to implement section 1899B of the Act.

Section 2(c)(1) of the IMPACT Act amended section 1895 of the Act to address the payment consequences for HHAs with respect to the additional data which HHAs are required to submit under section 1899B of the Act. These changes include the addition of a new section 1895(b)(3)(B)(v)(IV), which requires HHAs to submit the following additional data: (1) For the year beginning on the applicable specified application date and subsequent years, data on the quality, resource use, and other measures required under sections 1899B(c)(1) and (d)(1) of the Act; and (2) for 2019 and subsequent years, the standardized patient assessment data required under section 1899B(b)(1) of the Act. Such data must be submitted in the form and manner, and at the time, specified by the Secretary.

As stated above, the IMPACT Act adds a new section 1899B that imposes new data reporting requirements for certain post-acute care (PAC) providers, including HHAs. Sections 1899B(c)(1) and 1899B(d)(1) collectively require that the Secretary specify quality measures and resource use and other measures with respect to certain domains not later than the specified application date that applies to each measure domain and PAC provider setting. Section 1899B(a)(2)(E) delineates the specified application dates for each measure domain and PAC provider. The IMPACT Act also amends other sections of the Act, including section 1895(b)(3)(B)(v), to require the Secretary to reduce the otherwise applicable PPS payment to a PAC provider that does not report the new data in a form and manner, and at a time, specified by the Secretary. For HHAs, amended section 1895(b)(3)(B)(v) would require the Secretary to reduce the payment update for any HHA that does not satisfactorily submit the new required data.

Under the current HH QRP, the general timeline and sequencing of measure implementation occurs as follows: Specification of measures; proposal and finalization of measures through notice-and-comment rulemaking; HHA submission of data on the adopted measures; analysis and processing of the submitted data; notification to HHAs regarding their quality reporting compliance with respect to a particular year; consideration of any reconsideration requests; and imposition of a payment reduction in a particular year for failure to satisfactorily submit data with respect to that year. Any payment reductions that are taken with respect to a year begin approximately 1 year after the end of the data submission period for that year and approximately 2 years after we first adopt the measure.

To the extent that the IMPACT Act could be interpreted to shorten this timeline, so as to require us to reduce HH PPS payment for failure to satisfactorily submit data on a measure specified under section 1899B(c)(1) or (d)(1) of the IMPACT Act beginning with the same year as the specified application date for that measure, such a timeline would not be feasible. The current timeline discussed above reflects operational and other practical constraints, including the time needed to specify and adopt valid and reliable measures, collect the data, and determine whether a HHA has complied with our quality reporting requirements. It also takes into consideration our desire to give HHAs enough notice of new data reporting obligations so that they are prepared to timely start reporting data. Therefore, we intend to follow the same timing and sequence of events for measures specified under sections 1899B(c)(1) and (d)(1) of the Act that we currently follow for other measures specified under the HH QRP. We intend to specify each of these measures no later than the specified application dates set forth in section 1899B(a)(2)(E) of the Act and propose to adopt them consistent with the requirements in the Act and Administrative Procedure Act. To the extent that we finalize a proposal to adopt a measure for the HH QRP that satisfies an IMPACT Act measure domain, we intend to require HHAs to report data on the measure for the year that begins 2 years after the specified application date for that measure. Likewise, we intend to require HHAs to begin reporting any other data specifically required under the IMPACT Act for the year that begins 2 years after we adopt requirements that would govern the submission of that data.

Lastly, on April 1, 2014, the Congress passed the Protecting Access to Medicare Act of 2014 (PAMA) (Pub. L. 113-93), which stated that the Secretary may not adopt ICD-10 prior to October 1, 2015. On August 4, 2014, HHS published a final rule titled “Administrative Simplification: Change to the Compliance Date for the International Classification of Diseases, 10th Revision (ICD-10-CM and ICD-10-PCS) Medical Data Code Sets” (79 FR 45128), which announced October 1, 2015 as the new compliance date. The OASIS-C1 data item set had been previously approved by the Office of Management and Budget (OMB) on February 6, 2014 and scheduled for implementation on October 1, 2014. We intended the implementation of the OASIS-C1 to coincide with the original implementation date of ICD-10. The approved OASIS-C1 included changes to accommodate coding of diagnoses using the ICD-10-CM code set and addressed other important stakeholder concerns, such as updating clinical concepts and revising item wording and response categories to improve item clarity. This version included five (5) data items that required the use of ICD-10 codes.

Because OASIS-C1 was revised to incorporate ICD-10 coding, it is not feasible to implement the OASIS-C1/ICD-10 version prior to October 1, 2015, when ICD-10 is scheduled to be implemented. Due to this delay, we had to ensure that the collection and submission of OASIS data continued until ICD-10 could be implemented. Therefore, we made interim changes to the OASIS-C1 data item set to allow use with ICD-9 until ICD-10 is adopted. The OASIS-C1/ICD-9 version was submitted to OMB for approval until the OASIS-C1/ICD-10 version could be implemented. A 6-month emergency approval was granted on October 7, 2014, and CMS subsequently applied for an extension. The OASIS-C1/ICD-9 version was reapproved under OMB control number 0938-0760 with a current expiration date of March 31, 2018. It is important to note that this version of the OASIS will be discontinued once the OASIS-C1/ICD-10 version is approved and implemented. In addition, to facilitate the reporting of OASIS data as it relates to the planned implementation of ICD-10 on October 1, 2015, we submitted a new request for approval to OMB for the OASIS-C1/ICD-10 version under the Paperwork Reduction Act (PRA) process. We are requesting a new OMB control number for the proposed revised OASIS item set, as announced in the 30-day Federal Register notice (80 FR 15797). The new information collection request is currently pending OMB approval. Information regarding the OASIS-C1 can be located at http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HomeHealthQualityInits/OASIS-C1.html. Additional information regarding the adoption of ICD-10 can be located at http://www.cms.gov/Medicare/Coding/ICD10/index.html?redirect=/icd10.

B. General Considerations Used for the Selection of Quality Measures for the HH QRP

We strive to promote high quality and efficiency in the delivery of health care to the beneficiaries we serve. Performance improvement leading to the highest quality health care requires continuous evaluation to identify and address performance gaps and reduce the unintended consequences that may arise in treating a large, vulnerable, and aging population. Quality reporting programs, coupled with public reporting of quality information, are critical to the advancement of health care quality improvement efforts.

We seek to adopt measures for the HH QRP that promote better, safer, and more efficient care. Valid, reliable, and relevant quality measures are fundamental to the effectiveness of our quality reporting programs. Therefore, the selection of quality measures is a priority for CMS in all of its quality reporting programs.

The measures selected would address the measure domains as specified in the IMPACT Act and would be in alignment with the CMS Quality Strategy, which is framed using the three broad aims of the National Quality Strategy:

  • Better Care: Improve the overall quality of care by making healthcare more patient-centered, reliable, accessible, and safe.
  • Healthy People, Healthy Communities: Improve the health of the U.S. population by supporting proven interventions to address behavioral, social, and environmental determinants of health in addition to delivering higher-quality care.
  • Affordable Care: Reduce the cost of quality healthcare for individuals, families, employers, and government.

In addition, our measure selection activities for the HH QRP take into consideration input we receive from the Measure Applications Partnership (MAP), convened by the NQF, as part of the established CMS pre-rulemaking process required under section 1890A of the Act. The MAP is a public-private partnership composed of multi-stakeholder groups convened for the primary purpose of providing input to us on the selection of certain categories of quality and efficiency measures, as required by section 1890A(a)(3) of the Act. By February 1st of each year, the NQF must provide that input to us. Input from the MAP is located at http://www.qualityforum.org/Setting_Priorities/Partnership/Measure_Applications_Partnership.aspx. In addition, we take into account national priorities, such as those established by the National Priorities Partnership at http://www.qualityforum.org/npp/, and the HHS Strategic Plan at http://www.hhs.gov/secretary/about/priorities/priorities.html.

We initiated an ad hoc MAP process to review the measures under consideration for adoption into the HH QRP that we must propose through this year's rule in order to begin implementing such measures by 2017. We included these measures on the List of Measures under Consideration (MUC List), the list of measures that the Secretary must make available to the public as part of the pre-rulemaking process described in section 1890A(a)(2) of the Act. The MAP Off-Cycle Measures under Consideration for PAC-LTC Settings can be accessed on the National Quality Forum Web site at: http://www.qualityforum.org/map/. The NQF MAP met in February 2015 and provided input to us as required under section 1890A(a)(3) of the Act. The MAP issued a pre-rulemaking report on March 6, 2015, entitled MAP Off-Cycle Deliberations 2015: Measures under Consideration to Implement Provisions of the IMPACT Act—Final Report, which is available for download at: http://www.qualityforum.org/Publications/2015/03/MAP_Off-Cycle_Deliberations_2015_-_Final_Report.aspx. The MAP's input for the proposed measure is discussed in this section.

To meet the first specified application date applicable to HHAs under section 1899B(a)(2)(E) of the Act, which is January 1, 2017, we have focused on measures that:

  • Correspond to a measure domain in section 1899B(c)(1) or (d)(1) of the Act and are setting-agnostic (for example, falls with major injury and the incidence of pressure ulcers);
  • Are currently adopted for 1 or more of our PAC quality reporting programs, are already either NQF-endorsed and in use or finalized for use, or already previewed by the Measure Applications Partnership (MAP) with support;
  • Minimize added burden on HHAs;
  • Minimize or avoid, to the extent feasible, revisions to the existing items in assessment tools currently in use (for example, the OASIS); and
  • Where possible, avoid duplication of existing assessment items.

In our selection and specification of measures, we employ a transparent process in which we seek input from stakeholders and national experts and engage in a process that allows for pre-rulemaking input on each measure, as required by section 1890A of the Act. This process is based on a public-private partnership, and it occurs via the MAP. The MAP is composed of multi-stakeholder groups convened by the NQF, our current contractor under section 1890 of the Act, to provide input on the selection of quality and efficiency measures described in section 1890(b)(7)(B) of the Act. The NQF must convene these stakeholders and provide us with the stakeholders' input on the selection of such measures. We, in turn, must take this input into consideration in selecting such measures. In addition, the Secretary must make available to the public by December 1 of each year a list of such measures that the Secretary is considering under Title XVIII of the Act. As discussed in section V.A. of this proposed rule, section 1899B(e)(3) of the Act provides that the pre-rulemaking process required by section 1890A of the Act applies to the measures required under section 1899B, subject to certain exceptions for expedited procedures or, alternatively, waiver of section 1890A.

We initiated an ad hoc MAP process for the review of the quality measures under consideration for proposal, in preparation for adoption of those quality measures into the HH QRP that are required by the IMPACT Act and that must be implemented by January 1, 2017. The List of Measures under Consideration (MUC List) under the IMPACT Act was made public on February 5, 2015. Under the IMPACT Act, these measures must be standardized so they can be applied across PAC settings and must correspond to measure domains specified in sections 1899B(c)(1) and (d)(1) of the Act. The MAP reviewed each IMPACT Act-related quality measure proposed in this proposed rule for the HH QRP in light of its intended cross-setting use. We refer readers to sections V.A. and V.C. of this proposed rule for more information on the MAP's recommendations. The MAP's final report, MAP Off-Cycle Deliberations 2015: Measures under Consideration to Implement Provisions of the IMPACT Act: Final Report, is available at http://www.qualityforum.org/Setting_Priorities/Partnership/MAP_Final_Reports.aspx.

As discussed in section V.A. of this proposed rule, section 1899B(j) of the Act requires that we allow for stakeholder input, such as through town halls, open door forums, and mailbox submissions, before the initial rulemaking process to implement section 1899B. To meet this requirement, we provided the following opportunities for stakeholder input: (a) We convened a technical expert panel (TEP) that included stakeholder experts and patient representatives on February 3, 2015; (b) we provided two separate listening sessions on February 10 and March 24, 2015; (c) we sought public input during the February 2015 ad hoc MAP process regarding the measures under consideration with respect to IMPACT Act domains; (d) we sought public comment as part of our measure maintenance work; and (e) we implemented a public mailbox for the submission of comments in January 2015, located at PACQualityInitiative@cms.hhs.gov. The CMS public mailbox can be accessed on our post-acute care quality initiatives Web site: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/Post-Acute-Care-Quality-Initiatives/IMPACT-Act-of-2014-and-Cross-Setting-Measures.html. Lastly, we held a National Stakeholder Special Open Door Forum on February 25, 2015 to seek input on the measures.

In the absence of NQF-endorsed measures for the home health setting, or where measures are not fully supported by the MAP for the HH QRP, we intend to propose for adoption measures that most closely align with the national priorities discussed above and for which the MAP supports the measure concept. Further discussion of the importance and high-priority status of these measures in the HH setting is included under each quality measure proposal in this proposed rule. In addition, for measures not endorsed by the NQF, we have sought, to the extent practicable, to adopt measures that have been endorsed or adopted by a national consensus organization, recommended by multi-stakeholder organizations, and/or developed with the input of providers, purchasers/payers, and other stakeholders.

C. HH QRP Quality Measures and Measures Under Consideration for Future Years

In the CY 2014 HH PPS final rule (78 FR 72256 through 72320), we finalized a proposal to add two claims-based measures to the HH QRP and stated that we would begin reporting the data from these measures to HHAs beginning in CY 2014. These claims-based measures are: (1) Rehospitalization during the first 30 days of HH; and (2) Emergency Department Use without Hospital Readmission during the first 30 days of HH. In an effort to align with other updates to Home Health Compare, including the transition to quarterly provider preview reports, we have made the decision to delay the reporting of data from these measures until July 2015 ( http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HomeHealthQualityInits/HHQISpotlight.html ). Also in that rule, we finalized our proposal to reduce the number of process measures reported on the Certification and Survey Provider Enhanced Reporting (CASPER) reports by eliminating the stratification by episode length for nine (9) process measures. The removal of these measures from the CASPER folders occurred in October 2014. The CMS Home Health Quality Initiative Web site identifies the current HH QRP measures, located at http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HomeHealthQualityInits/HHQIQualityMeasures.html. In addition, as stated in the CY 2012 and CY 2013 HH PPS final rules (76 FR 68575 and 77 FR 67093, respectively), we finalized that we will also use measures derived from Medicare claims data to measure home health quality. This approach ensures that providers do not bear an additional burden of reporting quality of care measures through a separate mechanism, and that the costs associated with the development and testing of a new reporting mechanism are avoided.

(a) We are proposing one standardized cross-setting new measure for CY 2016 to meet the requirements of the IMPACT Act. The proposed quality measure that addresses the domain of skin integrity and changes in skin integrity is the National Quality Forum (NQF)-endorsed measure: Percent of Residents or Patients with Pressure Ulcers That Are New or Worsened (Short Stay) (NQF #0678) ( http://www.qualityforum.org/QPS/0678 ).

The IMPACT Act requires the specification of a quality measure to address skin integrity and changes in skin integrity in the home health setting by January 1, 2017. We are proposing the implementation of the quality measure NQF #0678, Percent of Residents or Patients with Pressure Ulcers that are New or Worsened (Short Stay) in the HH QRP as a cross-setting quality measure to meet the requirements of the IMPACT Act for the CY 2018 payment determination and subsequent years. This measure reports the percent of patients with Stage 2 through 4 pressure ulcers that are new or worsened since the beginning of the episode of care.

Pressure ulcers are high-volume and high-cost adverse events across post-acute care settings. According to the 2014 Prevention and Treatment Guidelines published by the National Pressure Ulcer Advisory Panel, European Pressure Ulcer Advisory Panel, and Pan Pacific Pressure Injury Alliance, pressure ulcer care is estimated to cost approximately $11 billion annually, and between $500 and $70,000 per individual pressure ulcer. Pressure ulcers are serious medical conditions that result in pain, decreased quality of life, and increased mortality in aging populations. Pressure ulcers typically result from prolonged periods of uninterrupted pressure on the skin, soft tissue, muscle, and bone. Elderly individuals are prone to a wide range of medical conditions that increase their risk of developing pressure ulcers. These include impaired mobility or sensation, malnutrition or undernutrition, obesity, stroke, diabetes, dementia, cognitive impairments, circulatory diseases, dehydration, bowel or bladder incontinence, the use of wheelchairs, the use of medical devices, polypharmacy, and a history of pressure ulcers or a pressure ulcer at admission.

National Pressure Ulcer Advisory Panel, European Pressure Ulcer Advisory Panel and Pan Pacific Pressure Injury Alliance. Prevention and Treatment of Pressure Ulcers: Clinical Practice Guideline. Emily Haesler (Ed.) Cambridge Media; Osborne Park, Western Australia; 2014.

Casey, G. (2013). “Pressure ulcers reflect quality of nursing care.” Nurs N Z 19(10): 20-24.

Gorzoni, M. L., and S. L. Pires (2011). “Deaths in nursing homes.” Rev Assoc Med Bras 57(3): 327-331.

Thomas, J. M., et al. (2013). “Systematic review: health-related characteristics of elderly hospitalized adults and nursing home residents associated with short-term mortality.” J Am Geriatr Soc 61(6): 902-911.

White-Chu, E. F., et al. (2011). “Pressure ulcers in long-term care.” Clin Geriatr Med 27(2): 241-258.

Bates-Jensen BM. Quality indicators for prevention and management of pressure ulcers in vulnerable elders. Ann Int Med. 2001;135 (8 Part 2), 744-51.

Institute for Healthcare Improvement (IHI). Relieve the pressure and reduce harm. May 21, 2007. Available from http://www.ihi.org/IHI/Topics/PatientSafety/SafetyGeneral/ImprovementStories/FSRelievethePressureandReduceHarm.htm .

Russo CA, Steiner C, Spector W. Hospitalizations related to pressure ulcers among adults 18 years and older, 2006 (Healthcare Cost and Utilization Project Statistical Brief No. 64). December 2008. Available from http://www.hcupus.ahrq.gov/reports/statbriefs/sb64.pdf.

Agency for Healthcare Research and Quality (AHRQ). Agency news and notes: pressure ulcers are increasing among hospital patients. January 2009. Available from http://www.ahrq.gov/research/jan09/0109RA22.htm.

Bates-Jensen BM. Quality indicators for prevention and management of pressure ulcers in vulnerable elders. Ann Int Med. 2001;135 (8 Part 2), 744-51.

Cai, S., et al. (2013). “Obesity and pressure ulcers among nursing home residents.” Med Care 51(6): 478-486.

Casey, G. (2013). “Pressure ulcers reflect quality of nursing care.” Nurs N Z 19(10): 20-24.

Hurd D, Moore T, Radley D, Williams C. Pressure ulcer prevalence and incidence across post-acute care settings. Home Health Quality Measures & Data Analysis Project, Report of Findings, prepared for CMS/OCSQ, Baltimore, MD, under Contract No. 500-2005-000181 TO 0002. 2010.

MacLean DS. Preventing & managing pressure sores. Caring for the Ages. March 2003;4(3):34-7. Available from http://www.amda.com/publications/caring/march2003/policies.cfm.

Michel, J. M., et al. (2012). “As of 2012, what are the key predictive risk factors for pressure ulcers? Developing French guidelines for clinical practice.” Ann Phys Rehabil Med 55(7): 454-465.

National Pressure Ulcer Advisory Panel (NPUAP) Board of Directors; Cuddigan J, Berlowitz DR, Ayello EA (Eds). Pressure ulcers in America: prevalence, incidence, and implications for the future. An executive summary of the National Pressure Ulcer Advisory Panel Monograph. Adv Skin Wound Care. 2001;14(4):208-15.

Park-Lee E, Caffrey C. Pressure ulcers among nursing home residents: United States, 2004 (NCHS Data Brief No. 14). Hyattsville, MD: National Center for Health Statistics, 2009. Available from http://www.cdc.gov/nchs/data/databriefs/db14.htm .

Reddy, M. (2011). “Pressure ulcers.” Clin Evid (Online) 2011.

Teno, J. M., et al. (2012). “Feeding tubes and the prevention or healing of pressure ulcers.” Arch Intern Med 172(9): 697-701.

The IMPACT Act requires the specification of quality measures that are harmonized across PAC settings. This requirement is consistent with the NQF Steering Committee report, which stated that to understand the impact of pressure ulcers across settings, quality measures addressing prevention, incidence, and prevalence of pressure ulcers must be harmonized and aligned. NQF #0678, Percent of Residents or Patients with Pressure Ulcers That Are New or Worsened (Short Stay), is NQF-endorsed and has been successfully implemented using a harmonized set of data elements in the IRF, LTCH, and SNF settings. A new item, M1309, was added to the OASIS-C1/ICD-9 version to collect data on new and worsened pressure ulcers in home health patients to support harmonization with NQF #0678; data collection for this item began January 1, 2015. A new measure, based on this item, was included in the 2014 MUC list and received conditional endorsement from the National Quality Forum. That measure was harmonized with NQF #0678, but differed in the consideration of unstageable pressure ulcers. In this rule, we are proposing a HH measure that is fully standardized with NQF #0678.

National Quality Forum. National voluntary consensus standards for developing a framework for measuring quality for prevention and management of pressure ulcers. April 2008. Available from http://www.qualityforum.org/Projects/Pressure_Ulcers.aspx.

A TEP convened by our measure development contractor provided input on the technical specifications of this quality measure, including the feasibility of implementing the measure across PAC settings. The TEP was supportive of the implementation of this measure across PAC settings and applauded CMS's efforts to standardize this measure for cross-setting development. Additionally, the NQF MAP met on February 9, 2015 and February 27, 2015 and provided input to CMS. The MAP supported the use of NQF #0678, Percent of Residents or Patients with Pressure Ulcers that are New or Worsened (Short Stay), in the HH QRP as a cross-setting quality measure implemented under the IMPACT Act. More information about the MAP's recommendations for this measure is available at http://www.qualityforum.org/map/.

We propose that data for the standardized quality measure would be collected using the OASIS-C1, with submission through the Quality Improvement and Evaluation System (QIES) Assessment Submission and Processing (ASAP) system. In January 2015, as part of the Home Health Quality Initiative, HHAs began submitting data for the OASIS items used to calculate NQF #0678, Percent of Residents or Patients with Pressure Ulcers That Are New or Worsened (Short Stay), to assess the number of new or worsened pressure ulcers. By building on the existing reporting and submission infrastructure for HHAs, we intend to minimize the administrative burden related to data collection and submission for this measure under the HH QRP. For more information on HH reporting using the QIES ASAP system, refer to: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HomeHealthQualityInits/HHQIOASISUserManual.html and http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/OASIS/index.html?redirect=/oasis/.

Data collected through the OASIS-C1 would be used to calculate this quality measure. Data items in the OASIS-C1 include M1308 (Current Number of Unhealed Pressure Ulcers at Each Stage or Unstageable) and M1309 (Worsening in Pressure Ulcer Status Since SOC/ROC). Data collected through the OASIS-C1 would also be used for risk adjustment of this measure. We anticipate that risk adjustment items would include, but are not limited to, M1850 (Activities of Daily Living Assistance, Transferring) and M1620 (Bowel Incontinence Frequency). OASIS-C1 items M1016 (Diagnoses Requiring Medical or Treatment Change Within Past 14 Days), M1020 (Primary Diagnoses), and M1022 (Other Diagnoses) would be used to identify patients with a diagnosis of peripheral vascular disease, diabetes, or malnutrition. More information about the OASIS items is available in the OASIS Manual at http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HomeHealthQualityInits/HHQIOASISUserManual.html.

The calculation of the proposed measure would be based on the items M1308 (Current Number of Unhealed Pressure Ulcers at Each Stage or Unstageable) and M1309 (Worsening in Pressure Ulcer Status Since SOC/ROC). The specifications and data items for NQF #0678, the Percent of Residents or Patients with Pressure Ulcers that are New or Worsened (Short Stay), are available at http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/Post-Acute-Care-Quality-Initiatives/PAC-Quality-Initiatives.html.
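For illustration only, the following is a minimal sketch of how an observed (unadjusted) rate for this type of measure could be computed from episode-level records; the field names (for example, m1309_stage2_new_or_worsened) are hypothetical, and the official NQF #0678 specifications, exclusions, and risk adjustment are not implemented here:

  # Minimal sketch only; field names are hypothetical and the official
  # NQF #0678 exclusions, covariates, and risk adjustment are not modeled.

  def has_new_or_worsened_ulcer(episode):
      # M1309 sub-items report counts of Stage 2, 3, and 4 pressure ulcers
      # that are new or have worsened since SOC/ROC.
      return (episode.get("m1309_stage2_new_or_worsened", 0) > 0
              or episode.get("m1309_stage3_new_or_worsened", 0) > 0
              or episode.get("m1309_stage4_new_or_worsened", 0) > 0)

  def observed_rate(episodes):
      # Denominator: complete quality episodes (exclusions not modeled here).
      denominator = [e for e in episodes if e.get("complete_quality_episode")]
      if not denominator:
          return None
      numerator = [e for e in denominator if has_new_or_worsened_ulcer(e)]
      return 100.0 * len(numerator) / len(denominator)

  example = [
      {"complete_quality_episode": True, "m1309_stage2_new_or_worsened": 1},
      {"complete_quality_episode": True},
  ]
  print(observed_rate(example))  # 50.0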

We invite public comment on our proposal to adopt NQF #0678, Percent of Residents or Patients with Pressure Ulcers that are New or Worsened (Short Stay), for the HH QRP to fulfill the timeline requirements for implementation under the IMPACT Act, for the CY 2018 HH payment determination and subsequent years.

As part of our ongoing measure development efforts, we are considering a future update to the numerator of the quality measure NQF #0678, Percent of Residents or Patients with Pressure Ulcers that are New or Worsened (Short Stay). This update would hold providers accountable for the development of unstageable pressure ulcers and suspected deep tissue injuries (sDTIs). Under this potential change, the numerator of the quality measure would be updated to include unstageable pressure ulcers, including sDTIs, that are new or develop while the patient is receiving home health care, as well as Stage 1 or 2 pressure ulcers that become unstageable due to slough or eschar (indicating progression to a full thickness [that is, Stage 3 or 4] pressure ulcer) after admission. This would be consistent with the specifications of the “New and Worsened Pressure Ulcer” measure for HH patients presented to the MAP on the 2014 MUC list. At this time, we are not proposing the implementation of this change (that is, including sDTIs and unstageable pressure ulcers in the numerator) in the HH QRP, but are soliciting public feedback on this potential area of measure development.

Our measure development contractor convened a cross-setting pressure ulcer TEP that strongly recommended that CMS hold providers accountable for the development of new unstageable pressure ulcers and sDTIs by including these pressure ulcers in the numerator of the quality measure. Although the TEP acknowledged that unstageable pressure ulcers and sDTIs cannot and should not be assigned a numeric stage, panel members recommended that these be included in the numerator of NQF #0678, the Percent of Residents, or Patients with Pressure Ulcers That Are New or Worsened (Short Stay), as a new pressure ulcer if developed during a home health episode. The TEP also recommended that a Stage 1 or 2 pressure ulcer that becomes unstageable due to slough or eschar should be considered worsened because the presence of slough or eschar indicates a full thickness (equivalent to Stage 3 or 4) wound. These recommendations were supported by technical and clinical advisors and the National Pressure Ulcer Advisory Panel. Additionally, exploratory data analysis conducted by our measure development contractor suggests that the addition of unstageable pressure ulcers, including sDTIs, would increase the observed incidence of new or worsened pressure ulcers at the agency level and may improve the ability of the quality measure to discriminate between poor- and high-performing facilities.

Schwartz, M., Nguyen, K.H., Swinson Evans, T.M., Ignaczak, M.K., Thaker, S., and Bernard, S.L.: Development of a Cross-Setting Quality Measure for Pressure Ulcers: OY2 Information Gathering, Final Report. Centers for Medicare & Medicaid Services, November 2013. Available: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/Post-Acute-Care-Quality-Initiatives/Downloads/Development-of-a-Cross-Setting-Quality-Measure-for-Pressure-Ulcers-Information-Gathering-Final-Report.pdf

Schwartz, M., Ignaczak, M.K., Swinson Evans, T.M., Thaker, S., and Smith, L.: The Development of a Cross-Setting Pressure Ulcer Quality Measure: Summary Report on November 15, 2013, Technical Expert Panel Follow-Up Webinar. Centers for Medicare & Medicaid Services, January 2014. Available: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/Post-Acute-Care-Quality-Initiatives/Downloads/Development-of-a-Cross-Setting-Pressure-Ulcer-Quality-Measure-Summary-Report-on-November-15-2013-Technical-Expert-Pa.pdf

Schwartz, M., Nguyen, K.H., Swinson Evans, T.M., Ignaczak, M.K., Thaker, S., and Bernard, S.L.: Development of a Cross-Setting Quality Measure for Pressure Ulcers: OY2 Information Gathering, Final Report. Centers for Medicare & Medicaid Services, November 2013. Available: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/Post-Acute-Care-Quality-Initiatives/Downloads/Development-of-a-Cross-Setting-Quality-Measure-for-Pressure-Ulcers-Information-Gathering-Final-Report.pdf

In addition, we are also considering whether body mass index (BMI) should be used as a covariate for risk-adjusting NQF #0678 in the home health setting, as is done in other post-acute care settings. We invite public feedback on the potential inclusion of unstageable pressure ulcers and sDTIs in the numerator of the quality measure NQF #0678, Percent of Residents or Patients with Pressure Ulcers that are New or Worsened (Short Stay), as well as on the possible collection of height and weight data for risk adjustment, as part of our future measure development efforts.

(b) We have also identified four future, cross-setting measure constructs to potentially meet requirements of the IMPACT Act domains of: (1) All-condition risk-adjusted potentially preventable hospital readmission rates; (2) resource use, including total estimated Medicare spending per beneficiary; (3) discharge to community; and (4) medication reconciliation. These are shown in Table 22; we solicit public feedback to inform future measure development of these constructs as they relate to meeting the IMPACT Act requirements in these areas.

Table 22—Future Cross-Setting Measure Constructs Under Consideration To Meet IMPACT Act Requirements

[Home Health Timeline for Implementation—January 1, 2017]

IMPACT Act Domain: Measures to reflect all-condition risk-adjusted potentially preventable hospital readmission rates.
  Measures:
  • Application of the Skilled Nursing Facility 30-Day All-Cause Readmission Measure (SNFRM) (NQF #2510); CMS is the steward.
  • Application of the LTCH/IRF All-Cause Unplanned Readmission Measure for 30 Days Post Discharge from LTCHs/IRFs.

IMPACT Act Domain: Resource use, including total estimated Medicare spending per beneficiary.
  Measure: Payment Standardized Medicare Spending Per Beneficiary (MSPB).

IMPACT Act Domain: Discharge to community.
  Measure: Percentage of residents/patients, at discharge assessment, who were discharged to a higher level of care versus to the community.

IMPACT Act Domain: Medication reconciliation.
  Measure: Percent of patients for whom any needed medication review actions were completed.

(c) We are working with our measure development and maintenance contractor to identify setting-specific measure concepts for future implementation in the HH QRP that align with or complement current measures and new measures to meet domains specified in the IMPACT Act. In identifying priority areas for future measure enhancement and development, we take into consideration results of environmental scans and resulting gaps analysis for relevant home health quality measure constructs, along with input from numerous stakeholders, including the Measures Application Partnership (MAP), the Medicare Payment Advisory Commission (MedPAC), Technical Expert Panels, and national priorities, such as those established by the National Priorities Partnership, the HHS Strategic Plan, the National Strategy for Quality Improvement in Healthcare, and the CMS Quality Strategy. Based on input from stakeholders, CMS has identified several high priority concept areas for future measure development in Table 23.

Table 23—Future Setting-Specific Measure Constructs Under Consideration

National Quality Strategy Domain: Safety
  • Falls risk composite process measure: Percentage of home health patients who were assessed for falls risk and whose care plan reflects the assessment, and which was implemented appropriately.

National Quality Strategy Domain: Effective Prevention and Treatment
  • Nutrition assessment composite measure: Percentage of home health patients who were assessed for nutrition risk with a validated tool and whose care plan reflects the assessment, and which was implemented appropriately.
  • Improvement in Dyspnea in Patients with a Primary Diagnosis of Congestive Heart Failure (CHF), Chronic Obstructive Pulmonary Disease (COPD), and/or Asthma: Percentage of home health episodes of care during which a patient with a primary diagnosis of CHF, asthma, and/or COPD became less short of breath or dyspneic.
  • Improvement in Patient-Reported Interference due to Pain: Percent of home health patients whose self-reported level of pain interference on the Patient-Reported Outcomes Measurement Information System (PROMIS) tool improved.
  • Improvement in Patient-Reported Pain Intensity: Percent of home health patients whose self-reported level of pain severity on the PROMIS tool improved.
  • Improvement in Patient-Reported Fatigue: Percent of home health patients whose self-reported level of fatigue on the PROMIS tool improved.
  • Stabilization in 3 or more Activities of Daily Living (ADLs): Percent of home health patients whose functional scores remain the same between admission and discharge for at least 3 ADLs.

These measure concepts are under development, and details regarding measure definitions, data sources, data collection approaches, and timeline for implementation would be communicated in future rulemaking. We invite feedback about these seven high priority concept areas for future measure development.

D. Form, Manner, and Timing of OASIS Data Submission and OASIS Data for Annual Payment Update

1. Regulatory Authority

The HH conditions of participation (CoPs) at § 484.55(d) require that the comprehensive assessment must be updated and revised (including the administration of the OASIS) no less frequently than: (1) The last 5 days of every 60 days beginning with the start of care date, unless there is a beneficiary-elected transfer, significant change in condition, or discharge and return to the same HHA during the 60-day episode; (2) within 48 hours of the patient's return to the home from a hospital admission of 24-hours or more for any reason other than diagnostic tests; and (3) at discharge.
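As a simple illustration of the update schedule in item (1) above, the following sketch computes the “last 5 days of every 60 days” windows from a start of care date; it assumes the start of care date counts as day 1 and does not model transfers, significant changes in condition, or discharges:

  from datetime import date, timedelta

  def recertification_windows(start_of_care, num_periods=2):
      # The comprehensive assessment (including the OASIS) must be updated
      # during the last 5 days of every 60-day period beginning with the
      # start of care date, that is, days 56 through 60 of each period.
      windows = []
      for n in range(1, num_periods + 1):
          period_end = start_of_care + timedelta(days=60 * n - 1)   # day 60 of period n
          window_start = period_end - timedelta(days=4)             # day 56 of period n
          windows.append((window_start, period_end))
      return windows

  print(recertification_windows(date(2016, 1, 1)))
  # [(2016-02-25, 2016-02-29), (2016-04-25, 2016-04-29)]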

It is important to note that to calculate quality measures from OASIS data, there must be a complete quality episode, which requires both a Start of Care (initial assessment) or Resumption of Care OASIS assessment and a Transfer or Discharge OASIS assessment. Failure to submit sufficient OASIS assessments to allow calculation of quality measures, including transfer and discharge assessments, is a failure to comply with the CoPs.

HHAs do not need to submit OASIS data for those patients who are excluded from the OASIS submission requirements. As described in the December 23, 2005 Medicare and Medicaid Programs: Reporting Outcome and Assessment Information Set Data as Part of the Conditions of Participation for Home Health Agencies final rule (70 FR 76202), we defined the exclusion as those patients:

  • Receiving only non-skilled services;
  • For whom neither Medicare nor Medicaid is paying for HH care (patients receiving care under a Medicare or Medicaid Managed Care Plan are not excluded from the OASIS reporting requirement);
  • Receiving pre- or post-partum services; or
  • Under the age of 18 years.

As set forth in the CY 2008 HH PPS final rule (72 FR 49863), HHAs that become Medicare certified on or after May 31 of the preceding year are not subject to the OASIS quality reporting requirement or any payment penalty for quality reporting purposes for the following year. For example, HHAs certified on or after May 31, 2014 are not subject to the 2 percentage point reduction to their market basket update for CY 2015. These exclusions only affect quality reporting requirements and do not affect the HHAs' reporting responsibilities as announced in the December 23, 2005 final rule, Medicare and Medicaid Programs; Reporting Outcome and Assessment Information Set Data as Part of the Conditions of Participation for Home Health Agencies (70 FR 76202).

2. Home Health Quality Reporting Program Requirements for CY 2016 Payment and Subsequent Years

In the CY 2014 HH PPS Final rule (78 FR 72297), we finalized a proposal to consider OASIS assessments submitted by HHAs to CMS in compliance with HH CoPs and Conditions for Payment for episodes beginning on or after July 1, 2012, and before July 1, 2013 as fulfilling one portion of the quality reporting requirement for CY 2014.

In addition, we finalized a proposal to continue this pattern for each subsequent year beyond CY 2014. OASIS assessments submitted for episodes beginning between July 1st of the calendar year 2 years prior to the calendar year of the Annual Payment Update (APU) effective date and June 30th of the calendar year 1 year prior to the calendar year of the APU effective date fulfill the OASIS portion of the HH QRP requirement. For example, assessments for episodes beginning on or after July 1, 2014 and before July 1, 2015 fulfill the OASIS portion of the requirement for the CY 2016 APU.
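As a simple illustration of this pattern, the following sketch derives the OASIS reporting window for a given APU year, assuming the APU takes effect on January 1 of that calendar year:

  from datetime import date

  def oasis_reporting_window(apu_year):
      # Episodes beginning July 1 of the year two years before the APU year
      # through June 30 of the year before the APU year fulfill the OASIS
      # portion of the HH QRP requirement for that APU year.
      start = date(apu_year - 2, 7, 1)
      end = date(apu_year - 1, 6, 30)
      return start, end

  # For the CY 2016 APU, the window covers episodes beginning
  # July 1, 2014 through June 30, 2015.
  print(oasis_reporting_window(2016))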

3. Previously Established Pay-for-Reporting Performance Requirement for Submission of OASIS Quality Data

Section 1895(b)(3)(B)(v)(I) of the Act states that for 2007 and each subsequent year, the home health market basket percentage increase applicable under such clause for such year shall be reduced by 2 percentage points if a home health agency does not submit data to the Secretary in accordance with subclause (II) with respect to such a year. This pay-for-reporting requirement was implemented on January 1, 2007. In the CY 2015 HH PPS Final rule (79 FR 38387), we finalized a proposal to define the quantity of OASIS assessments each HHA must submit to meet the pay-for-reporting requirement.

We believe that defining a more explicit performance requirement for the submission of OASIS data by HHAs would better meet section 5201(c)(2) of the Deficit Reduction Act of 2005 (DRA), which requires that each home health agency shall submit to the Secretary such data that the Secretary determines are appropriate for the measurement of health care quality. Such data shall be submitted in a form and manner, and at a time, specified by the Secretary for purposes of this clause.

In the CY 2015 HH PPS final rule (79 FR 38387), we reported information on a study performed by the Department of Health & Human Services, Office of the Inspector General (OIG), in February 2012 to determine: (1) The extent to which HHAs met federal reporting requirements for OASIS data; (2) the extent to which states met federal reporting requirements for OASIS data; and (3) the extent to which CMS was overseeing the accuracy and completeness of OASIS data submitted by HHAs. Based on the OIG report, we proposed a performance requirement for submission of OASIS quality data that would be responsive to the recommendations of the OIG.

In response to these requirements and the OIG report, we designed a pay-for-reporting performance system model that could accurately measure the level of an HHA's submission of OASIS data. The performance system is based on the principle that each HHA is expected to submit a minimum set of two matching assessments for each patient admitted to their agency. These matching assessments together create what is considered a quality episode of care, consisting ideally of a Start of Care (SOC) or Resumption of Care (ROC) assessment and a matching End of Care (EOC) assessment. However, it was determined that there are several scenarios that could meet this matching assessment requirement of the new pay-for-reporting performance requirement. These scenarios or quality assessments are defined as assessments that create a quality episode of care during the reporting period or could create a quality episode if the reporting period were expanded to an earlier reporting period or into the next reporting period.

Seven types of assessments submitted by an HHA fit this definition of a quality assessment. These are:

1. A Start of Care (SOC; M0100 = `01') or Resumption of Care (ROC; M0100 = `03') assessment that can be matched to an End of Care (EOC; M0100 = `06', `07', `08', or `09') assessment. These SOC/ROC assessments are the first assessment in the pair of assessments that creates a standard quality episode of care, as described in the previous paragraph.

2. An End of Care (EOC) assessment that can be matched to a Start of Care (SOC) or Resumption of Care (ROC) assessment. These EOC assessments are the second assessment in the pair of assessments that creates a standard quality episode of care, as described in the previous paragraph.

3. A SOC/ROC assessment that could begin an episode of care, but the assessment occurs in the last 60 days of the performance period. This is labeled as a Late SOC/ROC quality assessment. The assumption is that the EOC assessment will occur in the next reporting period.

4. An EOC assessment that could end an episode of care that began in the previous reporting period, (that is, an EOC that occurs in the first 60 days of the performance period). This is labeled as an Early EOC quality assessment. The assumption is that the matching SOC/ROC assessment occurred in the previous reporting period.

5. A SOC/ROC assessment that is followed by one or more follow-up assessments, the last of which occurs in the last 60 days of the performance period. This is labeled as an SOC/ROC Pseudo Episode quality assessment.

6. An EOC assessment that is preceded by one or more follow-up assessments, the first of which occurs in the first 60 days of the performance period. This is labeled an EOC Pseudo Episode quality assessment.

7. A SOC/ROC assessment that is part of a known one-visit episode. This is labeled as a One-Visit episode quality assessment. This determination is made by consulting HH claims data.

SOC, ROC, and EOC assessments that do not meet any of these definitions are labeled as Non-Quality assessments. Follow-up assessments (that is, where the M0100 Reason for Assessment = `04' or `05') are considered Neutral assessments and do not count toward or against the pay-for-reporting performance requirement.
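The classification described above can be summarized in the following simplified sketch; it is illustrative only, the record layout is hypothetical, and the pseudo-episode scenarios (5 and 6) and exact matching rules are not modeled:

  from datetime import timedelta

  SOC_ROC = {"01", "03"}          # M0100 Reason for Assessment: SOC, ROC
  EOC = {"06", "07", "08", "09"}  # M0100: transfer, discharge, or death reasons
  FOLLOW_UP = {"04", "05"}        # M0100: follow-up assessments (neutral)

  def classify(assessment, has_matching_partner, period_start, period_end,
               is_one_visit_episode=False):
      """Return 'quality', 'non-quality', or 'neutral' for one assessment.

      Simplified sketch of the quality-assessment scenarios; the one-visit
      determination and matching logic are assumed to be supplied by the
      caller (for example, from claims data), and pseudo-episode scenarios
      are not modeled here.
      """
      reason = assessment["m0100"]
      asmt_date = assessment["date"]

      if reason in FOLLOW_UP:
          return "neutral"                       # does not count for or against

      if has_matching_partner or is_one_visit_episode:
          return "quality"                       # scenarios 1, 2, and 7

      last_60_days = asmt_date >= period_end - timedelta(days=60)
      first_60_days = asmt_date <= period_start + timedelta(days=60)

      if reason in SOC_ROC and last_60_days:
          return "quality"                       # scenario 3: late SOC/ROC
      if reason in EOC and first_60_days:
          return "quality"                       # scenario 4: early EOC

      return "non-quality"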

Compliance with this performance requirement can be measured through the use of a simple mathematical formula. This pay-for-reporting performance requirement metric has been titled the “Quality Assessments Only” (QAO) formula because only those OASIS assessments that contribute, or could contribute, to creating a quality episode of care are included in the computation.

The formula based on this definition is as follows:

QAO percentage = [Number of Quality Assessments ÷ (Number of Quality Assessments + Number of Non-Quality Assessments)] × 100

Our ultimate goal is to require all HHAs to achieve a pay-for-reporting performance requirement compliance rate of 90 percent or more, as calculated using the QAO metric illustrated above. In the CY 2015 HH PPS final rule (79 FR 66074), we proposed implementing the pay-for-reporting performance requirement over a three-year period. After consideration of the public comments received, we adopted as final our proposal to establish a pay-for-reporting performance requirement under which, for assessments submitted on or after July 1, 2015 and before June 30, 2016 with appropriate start of care dates, HHAs must score at least 70 percent on the QAO metric or be subject to a 2 percentage point reduction to their market basket update for CY 2017.
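The following is a minimal sketch of the QAO computation and compliance check, assuming assessments have already been classified as described above; the threshold parameter reflects the 70 percent requirement for the initial performance period, with higher thresholds proposed later in this section:

  def qao_percentage(classified):
      # classified: list of labels such as "quality", "non-quality", "neutral".
      quality = classified.count("quality")
      non_quality = classified.count("non-quality")
      if quality + non_quality == 0:
          return None
      # Neutral (follow-up) assessments are excluded from the computation.
      return 100.0 * quality / (quality + non_quality)

  def meets_requirement(classified, threshold_pct=70.0):
      # threshold_pct: the applicable QAO threshold for the reporting period
      # (70 percent initially; higher thresholds are discussed below).
      rate = qao_percentage(classified)
      return rate is not None and rate >= threshold_pct

  print(qao_percentage(["quality"] * 8 + ["non-quality"] * 2 + ["neutral"]))  # 80.0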

HHAs have been statutorily required to report OASIS for a number of years and therefore should have many years of experience with the collection of OASIS data and transmission of this data to CMS. Given the length of time that HHAs have been mandated to report OASIS data and based on preliminary analyses that indicate that the majority of HHAs are already achieving the target goal of 90 percent on the QAO metric, we believe that HHAs would adapt quickly to the implementation of the pay-for-reporting performance requirement, if phased in over a three-year period.

In the CY 2015 rule, we did not finalize a proposal to increase the reporting requirement in 10 percentage point increments over a two-year period until the maximum rate of 90 percent is reached; instead, we proposed to analyze historical data to set the reporting requirements. To set the threshold for the second year, we analyzed the most recently available data, from 2013 and 2014, to make a determination about what the pay-for-reporting performance requirement should be. Specifically, we reviewed OASIS data from this time period, simulating the 70 percent pay-for-reporting submission requirement, to determine the hypothetical performance of each HHA as if the pay-for-reporting performance requirement had been in effect during the reporting period preceding its implementation. This analysis indicated that a nominal increase of 10 percentage points each year would provide the greatest opportunity for successful implementation, compared to an increase of 20 percentage points from year 1 to year 2.

Based on this analysis, we propose to set the performance threshold at 80 percent for the reporting period from July 1, 2016 through June 30, 2017. For the reporting period from July 1, 2017 through June 30, 2018 and thereafter, we propose the performance threshold would be 90 percent.
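To make the phase-in schedule concrete, the following sketch (hypothetical code; the period labels, function names, and example counts are illustrative and not part of the rule) maps each reporting period to its proposed QAO threshold and checks whether an agency's score meets it:

```python
# Illustrative sketch of the pay-for-reporting QAO thresholds; period labels,
# function names, and example counts are hypothetical.

QAO_THRESHOLDS = {
    "2015-07-01/2016-06-30": 70.0,  # year 1 (finalized in the CY 2015 rule)
    "2016-07-01/2017-06-30": 80.0,  # year 2 (proposed in this rule)
    "2017-07-01/2018-06-30": 90.0,  # year 3 and thereafter (proposed)
}

def qao_score(quality: int, non_quality: int) -> float:
    """QAO = quality / (quality + non-quality) x 100; neutral assessments are excluded."""
    total = quality + non_quality
    return 100.0 * quality / total if total else 0.0

def meets_requirement(score: float, period: str) -> bool:
    """Return True if the QAO score meets the threshold for the given reporting period."""
    return score >= QAO_THRESHOLDS[period]

# Example: 850 quality and 150 non-quality assessments give a QAO score of 85.0,
# which meets the 80 percent year-2 threshold but not the 90 percent threshold.
score = qao_score(850, 150)
print(score, meets_requirement(score, "2016-07-01/2017-06-30"))
```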

In June 2015, we provided a report to each HHA of its hypothetical performance under the pay-for-reporting performance requirement during the 2014-2015 pre-implementation reporting period. On January 1, 2015, the data submission process for OASIS converted from the prior state-based OASIS submission system to a new national OASIS submission system known as the Assessment Submission and Processing (ASAP) System. On July 1, 2015, when the 70 percent pay-for-reporting performance requirement goes into effect, providers would be required to submit their OASIS assessment data into the ASAP system. Successful submission of an OASIS assessment would consist of submission of the data into the ASAP system with no fatal error messages received. Error messages received during submission can indicate a problem that occurred during the submission process and can also indicate that the OASIS assessment was rejected. Successful submission can be verified by ascertaining that the submitted assessment data resides in the national database after the assessment has met all of the quality standards for completeness and accuracy during the submission process. Should one or more OASIS assessments submitted by a HHA be rejected due to an IT/server issue caused by CMS, we may, at our discretion, excuse the non-submission of OASIS data. We anticipate that such a scenario would rarely, if ever, occur. In the event that a HHA believes it was unable to submit OASIS assessments due to an IT/server issue on the part of CMS, the HHA should be prepared to provide any available documentation or proof that demonstrates that no fault on its part contributed to the failure of the OASIS records to transmit to CMS.

The initial performance period for the pay-for-reporting performance requirement would be July 1, 2015 through June 30, 2016. Prior to and during this performance period, we have scheduled Open Door Forums and webinars to educate HHA personnel as needed about the pay-for-reporting performance requirement program and the pay-for-reporting QAO metric, and we have distributed individual provider preview reports. Additionally, OASIS Education Coordinators (OECs) would be trained to provide state-level instruction on this program and metric. We have already posted a report that provides a detailed explanation of this pay-for-reporting QAO methodology. To view this report, go to: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HomeHealthQualityInits/Home-Health-Quality-Reporting-Requirements.html. Training announcements and additional educational information related to the pay-for-reporting performance requirement would be provided on the HH Quality Initiatives Web page. We invite public comment on our proposal to implement an 80 percent pay-for-reporting performance requirement for the submission of OASIS quality data for the Year 2 reporting period (July 1, 2016 through June 30, 2017), as described previously, for the HH QRP.

E. Home Health Care CAHPS Survey (HHCAHPS)

In the CY 2015 HH PPS final rule (79 FR 66031), we stated that the home health quality measures reporting requirements for Medicare-certified agencies include the Home Health Care CAHPS® (HHCAHPS) Survey for the CY 2015 Annual Payment Update (APU). We maintained the HHCAHPS data requirements for CY 2015 set out in previous rules: continuous monthly data collection and quarterly data submission of HHCAHPS data.

1. Background and Description of HHCAHPS

As part of the HHS Transparency Initiative, we implemented a process to measure and publicly report patient experiences with home health care, using a survey developed by the Agency for Healthcare Research and Quality's (AHRQ's) Consumer Assessment of Healthcare Providers and Systems (CAHPS®) program, originally endorsed by the NQF in March 2009 (NQF Number 0517) and re-endorsed by the NQF in 2015. The HHCAHPS survey is part of a family of CAHPS® surveys that ask patients to report on and rate their experiences with health care. The HHCAHPS Survey is approved under OMB Control Number 0938-1066 through May 31, 2017. The Home Health Care CAHPS® (HHCAHPS) survey presents home health patients with a set of standardized questions about their home health care providers and about the quality of their home health care.

Prior to the HHCAHPS survey, there was no national standard for collecting information about patient experiences that enabled valid comparisons across all HHAs. The history and development process for HHCAHPS has been described in previous rules and is also available on the official HHCAHPS Web site at https://homehealthcahps.org and in the annually-updated HHCAHPS Protocols and Guidelines Manual, which is downloadable from https://homehealthcahps.org.

For public reporting purposes, we report five measures from the HHCAHPS Survey—three composite measures and two global ratings of care that are derived from the questions on the HHCAHPS survey. The publicly reported data are adjusted for differences in patient mix across HHAs. We update the HHCAHPS data on Home Health Compare on www.medicare.gov quarterly. HHCAHPS data was first publicly reported in April 2012 on Home Health Compare. Each HHCAHPS composite measure consists of four or more individual survey items regarding one of the following related topics:

  • Patient care (Q9, Q16, Q19, and Q24);
  • Communications between providers and patients (Q2, Q15, Q17, Q18, Q22, and Q23); and
  • Specific care issues on medications, home safety, and pain (Q3, Q4, Q5, Q10, Q12, Q13, and Q14).

The two global ratings are the overall rating of care given by the HHA's care providers (Q20), and the patient's willingness to recommend the HHA to family and friends (Q25).

The HHCAHPS survey is currently available in English, Spanish, Chinese, Russian, and Vietnamese. The OMB number on these surveys is the same (0938-1066). All of these surveys are on the Home Health Care CAHPS® Web site, https://homehealthcahps.org. If you need additional language translations of the HHCAHPS Survey, please contact us at HHCAHPS@rti.org.

The requirements regarding which home health patients are eligible for the HHCAHPS survey and, conversely, which home health patients are ineligible for the HHCAHPS survey are delineated in the HHCAHPS Protocols and Guidelines Manual, which is downloadable at https://homehealthcahps.org. We update the HHCAHPS Protocols and Guidelines Manual annually, and the current version is 7.0. Home health patients are eligible for HHCAHPS if they received at least two skilled home health visits, paid for by Medicare or Medicaid, in the past 2 months.

Home health patients are ineligible for inclusion in HHCAHPS surveys if any of the following conditions applies to them (a schematic summary of these eligibility and ineligibility criteria follows the list):

  • Are under the age of 18;
  • Are deceased prior to the date the sample is pulled;
  • Receive hospice care;
  • Receive routine maternity care only;
  • Are not considered survey eligible because the state in which the patient lives restricts release of patient information for a specific condition or illness that the patient has; or
  • No Publicity patients, defined as patients who on their own initiative at their first encounter with the HHAs make it very clear that no one outside of the agencies can be advised of their patient status, and no one outside of the HHAs can contact them for any reason.
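The sketch below is a hypothetical, simplified restatement of the eligibility rule and ineligibility conditions above; field and function names are illustrative, and the HHCAHPS Protocols and Guidelines Manual remains the authoritative source.

```python
from dataclasses import dataclass

@dataclass
class Patient:
    """Hypothetical patient record; field names are illustrative, not from the Manual."""
    age: int
    skilled_visits_past_2_months: int
    payer: str                    # e.g., "Medicare", "Medicaid", "Private"
    deceased_before_sample: bool
    in_hospice: bool
    routine_maternity_only: bool
    state_restricts_release: bool
    no_publicity_request: bool

def hhcahps_eligible(p: Patient) -> bool:
    """Simplified HHCAHPS sampling eligibility check based on the criteria above."""
    if p.age < 18 or p.deceased_before_sample or p.in_hospice:
        return False
    if p.routine_maternity_only or p.state_restricts_release or p.no_publicity_request:
        return False
    # Eligibility rule: at least two skilled home health visits in the past
    # 2 months, paid for by Medicare or Medicaid.
    return p.skilled_visits_past_2_months >= 2 and p.payer in {"Medicare", "Medicaid"}
```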

We stated in previous rules that Medicare-certified HHAs are required to contract with an approved HHCAHPS survey vendor. This requirement continues, and Medicare-certified agencies also must provide on a monthly basis a list of all their survey-eligible home health care patients served to their respective HHCAHPS survey vendors. Agencies are not allowed to influence at all how their patients respond to the HHCAHPS survey.

As previously required, HHCAHPS survey vendors are required to attend introductory and all update trainings conducted by CMS and the HHCAHPS Survey Coordination Team, as well as to pass a post-training certification test. Update training is required annually for all approved HHCAHPS survey vendors. We have approximately 30 approved HHCAHPS survey vendors. The most current list of approved HHCAHPS survey vendors is available at https://homehealthcahps.org.

2. HHCAHPS Oversight Activities

We stated in prior final rules that all approved HHCAHPS survey vendors are required to participate in HHCAHPS oversight activities to ensure compliance with HHCAHPS protocols, guidelines, and survey requirements. The purpose of the oversight activities is to ensure that approved HHCAHPS survey vendors follow the HHCAHPS Protocols and Guidelines Manual. As stated previously in the six prior final rules to this proposed rule, all HHCAHPS approved survey vendors must develop a Quality Assurance Plan (QAP) for survey administration in accordance with the HHCAHPS Protocols and Guidelines Manual. An HHCAHPS survey vendor's first QAP must be submitted within 6 weeks of the data submission deadline date after the vendor's first quarterly data submission. The QAP must be updated and submitted annually thereafter and at any time that changes occur in staff or vendor capabilities or systems. A model QAP is included in the HHCAHPS Protocols and Guidelines Manual. The QAP must include the following:

  • Organizational Background and Staff Experience;
  • Work Plan;
  • Sampling Plan;
  • Survey Implementation Plan;
  • Data Security, Confidentiality and Privacy Plan; and
  • Questionnaire Attachments.

As part of the oversight activities, the HHCAHPS Survey Coordination Team conducts on-site visits to all approved HHCAHPS survey vendors. The purpose of the site visits is to allow the HHCAHPS Coordination Team to observe the entire HHCAHPS Survey implementation process, from the sampling stage through file preparation and submission, as well as to assess data security and storage. The HHCAHPS Survey Coordination Team reviews the HHCAHPS survey vendor's survey systems, and assesses administration protocols based on the HHCAHPS Protocols and Guidelines Manual posted at https://homehealthcahps.org. The systems and program site visit review includes, but is not limited to the following:

  • Survey management and data systems;
  • Printing and mailing materials and facilities;
  • Telephone call center facilities;
  • Data receipt, entry and storage facilities; and
  • Written documentation of survey processes.

After the site visits, HHCAHPS survey vendors are given a defined time period in which to correct any identified issues and provide follow-up documentation of corrections for review. HHCAHPS survey vendors are subject to follow-up site visits on an as-needed basis.

In the CY 2013 HH PPS final rule (77 FR 67094, 67164), we codified the current guideline that all approved HHCAHPS survey vendors fully comply with all HHCAHPS oversight activities. We included this survey requirement at § 484.250(c)(3).

3. HHCAHPS Requirements for the CY 2016 APU

In the CY 2015 HH PPS final rule (79 FR 66031), we stated that for the CY 2016 APU, we would require continued monthly HHCAHPS data collection and reporting for four quarters. The data collection period for the CY 2016 APU includes the second quarter 2014 through the first quarter 2015 (the months of April 2014 through March 2015). Although these dates are past, we state them again in this proposed rule so that HHAs are reminded of what months constituted the requirements for the CY 2016 APU. HHAs are required to submit their HHCAHPS data files for the first quarter of 2015 to the HHCAHPS Data Center by 11:59 p.m., EST on July 16, 2015. This deadline is firm; no exceptions are permitted.

For the CY 2016 APU, we required that all HHAs that had fewer than 60 HHCAHPS-eligible unduplicated or unique patients in the period of April 1, 2013 through March 31, 2014 are exempted from the HHCAHPS data collection and submission requirements for the CY 2016 APU, upon completion of the CY 2016 HHCAHPS Participation Exemption Request form, and upon CMS verification of the HHA patient counts. Agencies with fewer than 60 HHCAHPS-eligible, unduplicated or unique patients in the period of April 1, 2013, through March 31, 2014, were required to submit their patient counts on the HHCAHPS Participation Exemption Request form for the CY 2016 APU posted on https://homehealthcahps.org by 11:59 p.m., EST on March 31, 2015. This deadline was firm, as are all of the quarterly data submission deadlines for the HHAs that participate in HHCAHPS.

We automatically exempt HHAs receiving Medicare certification after the period in which HHAs do their patient counts. HHAs receiving Medicare certification on or after April 1, 2014 are exempt from the HHCAHPS reporting requirement for the CY 2016 APU. These newly-certified HHAs did not need to complete a HHCAHPS Participation Exemption Request form for the CY 2016 APU.

4. HHCAHPS Requirements for the CY 2017 APU

For the CY 2017 APU, we require continued monthly HHCAHPS data collection and reporting for four quarters. The data collection period for the CY 2017 APU includes the second quarter 2015 through the first quarter 2016 (the months of April 2015 through March 2016). HHAs would be required to submit their HHCAHPS data files to the HHCAHPS Data Center for the second quarter 2015 by 11:59 p.m., EST on October 15, 2015; for the third quarter 2015 by 11:59 p.m., EST on January 21, 2016; for the fourth quarter 2015 by 11:59 p.m., EST on April 21, 2016; and for the first quarter 2016 by 11:59 p.m., EST on July 21, 2016. These deadlines will be firm; no exceptions will be permitted.

For the CY 2017 APU, we require that all HHAs that have fewer than 60 HHCAHPS-eligible unduplicated or unique patients in the period of April 1, 2014 through March 31, 2015 are exempted from the HHCAHPS data collection and submission requirements for the CY 2017 APU, upon completion of the CY 2017 HHCAHPS Participation Exemption Request form, and upon CMS verification of the HHA patient counts. Agencies with fewer than 60 HHCAHPS-eligible, unduplicated or unique patients in the period of April 1, 2014 through March 31, 2015, are required to submit their patient counts on the HHCAHPS Participation Exemption Request form for the CY 2017 APU posted on https://homehealthcahps.org by 11:59 p.m., EST on March 31, 2016. This deadline is firm, as are all of the quarterly data submission deadlines for the HHAs that participate in HHCAHPS.

We automatically exempt HHAs receiving Medicare certification after the period in which HHAs do their patient counts. HHAs receiving Medicare certification on or after April 1, 2015 are exempt from the HHCAHPS reporting requirement for the CY 2017 APU. These newly-certified HHAs do not need to complete a HHCAHPS Participation Exemption Request form for the CY 2017 APU.

5. HHCAHPS Requirements for the CY 2018 APU

For the CY 2018 APU, we require continued monthly HHCAHPS data collection and reporting for four quarters. The data collection period for the CY 2018 APU includes the second quarter 2016 through the first quarter 2017 (the months of April 2016 through March 2017). HHAs would be required to submit their HHCAHPS data files to the HHCAHPS Data Center for the second quarter 2016 by 11:59 p.m., EST on October 20, 2016; for the third quarter 2016 by 11:59 p.m., EST on January 19, 2017; for the fourth quarter 2016 by 11:59 p.m., EST on April 20, 2017; and for the first quarter 2017 by 11:59 p.m., EST on July 20, 2017. These deadlines will be firm; no exceptions will be permitted.

For the CY 2018 APU, we require that all HHAs that have fewer than 60 HHCAHPS-eligible unduplicated or unique patients in the period of April 1, 2015 through March 31, 2016 are exempted from the HHCAHPS data collection and submission requirements for the CY 2018 APU, upon completion of the CY 2018 HHCAHPS Participation Exemption Request form, and upon CMS verification of the HHA patient counts. Agencies with fewer than 60 HHCAHPS-eligible, unduplicated or unique patients in the period of April 1, 2015 through March 31, 2016, are required to submit their patient counts on the HHCAHPS Participation Exemption Request form for the CY 2018 APU posted on https://homehealthcahps.org by 11:59 p.m., EST on March 31, 2017. This deadline is firm, as are all of the quarterly data submission deadlines for the HHAs that participate in HHCAHPS.

We automatically exempt HHAs receiving Medicare certification after the period in which HHAs do their patient counts. HHAs receiving Medicare certification on or after April 1, 2016 are exempt from the HHCAHPS reporting requirement for the CY 2018 APU. These newly-certified HHAs do not need to complete a HHCAHPS Participation Exemption Request form for the CY 2018 APU.

6. HHCAHPS Reconsiderations and Appeals Process

HHAs should monitor their respective HHCAHPS survey vendors to ensure that the vendors submit their HHCAHPS data on time, by accessing the HHCAHPS Data Submission Reports on https://homehealthcahps.org. This would help HHAs ensure that their data are submitted to the HHCAHPS Data Center in the proper format for data processing.

We will continue HHCAHPS oversight activities as finalized in the CY 2014 rule. In the CY 2013 HH PPS final rule (77 FR 67094, 67164), we codified the current guideline that all approved HHCAHPS survey vendors must fully comply with all HHCAHPS oversight activities. We included this survey requirement at § 484.250(c)(3).

We propose to continue the OASIS and HHCAHPS reconsiderations and appeals process that we have finalized and used for the CY 2012, CY 2013, CY 2014, and CY 2015 APU determinations. We have described the reconsiderations process requirements in the CMS Technical Direction Letter that we sent to the affected HHAs in or around late September. HHAs have 30 days from their receipt of the Technical Direction Letter informing them that they did not meet the OASIS and HHCAHPS requirements for the CY period to send all documentation that supports their requests for reconsideration to CMS. It is important that the affected HHAs send comprehensive information in their reconsideration letter/package because we would not contact the affected HHAs to request additional information or to clarify incomplete or inconclusive information. If clear evidence to support a finding of compliance is not present, the 2 percent reduction in the APU would be upheld. If clear evidence of compliance is present, the 2 percent reduction for the APU would be reversed. We notify affected HHAs by December 31st annually for the APU period that begins on January 1st. If we determine to uphold the 2 percent reduction, the HHA may further appeal the reduction via the Provider Reimbursement Review Board (PRRB) appeals process. The PRRB contact information is provided to the HHAs receiving letters in December about the CMS reconsideration decisions.

Providers who wish to submit a reconsideration request should continue to follow the reconsideration and appeals process as finalized in the CY 2012, CY 2013, CY 2014, and CY 2015 Home Health Prospective Payment System Rate Update Final Rules.

7. Summary

We are not proposing any changes to the participation requirements or to the requirements pertaining to the implementation of the Home Health CAHPS® Survey (HHCAHPS). We have only updated the information to reflect the dates for future APU years. We again strongly encourage HHAs to keep up to date on HHCAHPS by regularly viewing the official HHCAHPS Web site at https://homehealthcahps.org. HHAs can also send an email to the HHCAHPS Survey Coordination Team at HHCAHPS@rti.org, or telephone toll-free (1-866-354-0985), for more information about HHCAHPS.

F. Public Display of Home Health Quality Data for the HH QRP

Section 1895(b)(3)(B)(v)(III) of the Act and section 1899B(f) of the IMPACT Act state that the Secretary shall establish procedures for making data submitted under subclause (II) available to the public. Such procedures shall ensure that a home health agency has the opportunity to review the data that are to be made public with respect to the agency prior to such data being made public. We recognize that public reporting of quality data is a vital component of a robust quality reporting program. We are fully committed to ensuring that the data made available to the public are meaningful, and we recognize that comparing performance across home health agencies requires measures constructed from data collected in a standardized and uniform manner. We also recognize the need to ensure that each home health agency has the opportunity to review its data before publication. Medicare home health regulations, as codified at § 484.250(a), require HHAs to submit OASIS assessments and Home Health Care Consumer Assessment of Healthcare Providers and Systems Survey® (HHCAHPS) data to meet the quality reporting requirements of section 1895(b)(3)(B)(v) of the Act.

In addition, beginning April 1, 2015, HHAs began receiving Provider Preview Reports (for all Process Measures and Outcome Measures) on a quarterly, rather than annual, basis. The opportunity for providers to review their data and to submit corrections prior to public reporting aligns with the other quality reporting programs and the requirement for provider review under the IMPACT Act. We provide quality measure data to HHAs via the Certification and Survey Provider Enhanced Reports (CASPER reports), which are available through the CMS Health Care Quality Improvement and Evaluation System (QIES).

As part of our ongoing efforts to make healthcare more transparent, affordable, and accountable, the HH QRP has developed a CMS Compare Web site for home health agencies, which identifies home health providers based on the areas they serve. Consumers can search for all Medicare-certified home health providers that serve their city or ZIP code and then find the agencies offering the types of services they need. A subset of the HH quality measures has been publicly reported on the Home Health Compare (HH Compare) Web site since 2003. The selected measures that are made available to the public can be viewed on the HH Compare Web site located at http://www.medicare.gov/HHCompare/Home.asp.

The Affordable Care Act calls for transparent, easily understood information on provider quality to be publicly reported and made widely available. To provide home health care consumers with a summary of existing quality measures in an accessible format, we plan to publish a star rating based on the quality of care measures for home health agencies on Home Health Compare starting in July 2015. This is part of our plan to adopt star ratings across all Medicare.gov Compare Web sites. Star ratings are currently publicly displayed on Nursing Home Compare, Physician Compare, the Medicare Advantage Plan Finder, and Dialysis Facility Compare, and they are scheduled to be displayed on Hospital Compare in 2015.

The Quality of Patient Care star rating methodology assigns each home health agency a rating between one (1) and five (5) stars, using half stars for adjustment and reporting. All Medicare-certified home health agencies are eligible to receive a Quality of Patient Care star rating provided that they have quality data reported on at least 5 out of the 9 quality measures that are included in the calculation.
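As a rough illustration of the eligibility condition and the half-star reporting convention described above (the rounding shown is an assumption for illustration; the actual computation is described in the methodology report referenced below):

```python
def star_rating_eligible(measures_reported: int) -> bool:
    """An HHA is eligible for a Quality of Patient Care star rating if it has
    quality data reported on at least 5 of the 9 measures in the calculation."""
    return measures_reported >= 5

def to_half_star(raw_rating: float) -> float:
    """Clamp to the 1-to-5 star range and report in half-star increments.
    The specific rounding used here is illustrative only."""
    return min(5.0, max(1.0, round(raw_rating * 2) / 2))

print(star_rating_eligible(6), to_half_star(3.8))  # True 4.0
```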

Home health agencies would continue to have prepublication access to their agency's quality data, which enables each agency to know how it is performing before public posting of the data on the Compare Web site. Starting in April 2015, HHAs are receiving quarterly preview reports showing their Quality of Patient Care star rating and how it was derived well before public posting, and they have several weeks to review and provide feedback.

The Quality of Patient Care star ratings methodology was developed through a transparent process that included multiple opportunities for stakeholder input, which was subsequently the basis for refinements to the methodology. An initial proposed methodology for calculating the Quality of Patient Care star ratings was posted on the CMS.gov Web site in December 2014. CMS then held two Special Open Door Forums (SODFs) on December 17, 2014 and February 5, 2015 to present the proposed methodology and solicit input. At each SODF, stakeholders provided immediate input and were invited to submit additional comments via the Quality of Patient Care star ratings Help Desk mailbox: HHC_Star_Ratings_Helpdesk@cms.hhs.gov. CMS refined the methodology based on comments received and additional analysis. The final methodology report is posted on the new star ratings Web page: http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HomeHealthQualityInits/HHQIHomeHealthStarRatings.html. A Frequently-Asked-Questions (FAQ) document addressing the issues raised in the comments received is also posted on the same Web page. We tested the Web site language used to present the Quality of Patient Care star ratings with Medicare beneficiaries to assure that it allowed them to accurately understand the significance of the various star ratings.

Additional information regarding the Quality of Patient Care star rating would be posted on the star ratings Web page at http://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/HomeHealthQualityInits/HHQIHomeHealthStarRatings.html. Additional communications regarding the Quality of Patient Care star ratings would be announced via regular HH QRP communication channels.

VI. Collection of Information Requirements

While this proposed rule contains information collection requirements, it does not add new information collection requirements, revise any of the existing information collection requirements, or change the associated burden estimate. The information collection requirements discussed in this rule for the OASIS-C1 data item set were previously approved by the Office of Management and Budget (OMB) on February 6, 2014 and scheduled for implementation on October 1, 2014. The extension of the OASIS-C1/ICD-9 version was reapproved under OMB control number 0938-0760 with a current expiration date of March 31, 2018. This version of the OASIS will be discontinued once the OASIS-C1/ICD-10 version is approved and implemented. In addition, to facilitate the reporting of OASIS data as it relates to the implementation of ICD-10 on October 1, 2015, CMS submitted a new request for approval to OMB for the OASIS-C1/ICD-10 version under the Paperwork Reduction Act (PRA) process. CMS is requesting a new OMB control number for the proposed revised OASIS item set, as announced in the 30-day Federal Register notice (80 FR 15797). The new information collection request is currently pending OMB approval.

VII. Response to Comments

Because of the large number of public comments we normally receive on Federal Register documents, we are not able to acknowledge or respond to them individually. We will consider all comments we receive by the date and time specified in the DATES section of this preamble, and, when we proceed with a subsequent document, we will respond to the comments in the preamble to that document.

VIII. Regulatory Impact Analysis

A. Statement of Need

Section 1895(b)(1) of the Act requires the Secretary to establish a HH PPS for all costs of HH services paid under Medicare. In addition, section 1895(b)(3)(A) of the Act requires that (1) the computation of a standard prospective payment amount include all costs for HH services covered and paid for on a reasonable cost basis and that such amounts be initially based on the most recent audited cost report data available to the Secretary, and (2) the standardized prospective payment amount be adjusted to account for the effects of case-mix and wage levels among HHAs. Section 1895(b)(3)(B) of the Act addresses the annual update to the standard prospective payment amounts by the HH applicable percentage increase. Section 1895(b)(4) of the Act governs the payment computation. Sections 1895(b)(4)(A)(i) and (b)(4)(A)(ii) of the Act require the standard prospective payment amount to be adjusted for case-mix and geographic differences in wage levels. Section 1895(b)(4)(B) of the Act requires the establishment of appropriate case-mix adjustment factors for significant variation in costs among different units of services. Lastly, section 1895(b)(4)(C) of the Act requires the establishment of wage adjustment factors that reflect the relative level of wages, and wage-related costs applicable to HH services furnished in a geographic area compared to the applicable national average level.

Section 1895(b)(3)(B)(iv) of the Act provides the Secretary with the authority to implement adjustments to the standard prospective payment amount (or amounts) for subsequent years to eliminate the effect of changes in aggregate payments during a previous year or years that was the result of changes in the coding or classification of different units of services that do not reflect real changes in case-mix. Section 1895(b)(5) of the Act provides the Secretary with the option to make changes to the payment amount otherwise paid in the case of outliers because of unusual variations in the type or amount of medically necessary care. Section 1895(b)(3)(B)(v) of the Act requires HHAs to submit data for purposes of measuring health care quality, and links the quality data submission to the annual applicable percentage increase.

Section 421(a) of the MMA requires that HH services furnished in a rural area, for episodes and visits ending on or after April 1, 2010, and before January 1, 2016, receive an increase of 3 percent of the payment amount otherwise made under section 1895 of the Act. Section 210 of the MACRA amended section 421(a) of the MMA to extend the 3 percent increase to the payment amounts for services furnished in rural areas for episodes and visits ending before January 1, 2018.

Section 3131(a) of the Affordable Care Act mandates that starting in CY 2014, the Secretary must apply an adjustment to the national, standardized 60-day episode payment rate and other amounts applicable under section 1895(b)(3)(A)(i)(III) of the Act to reflect factors such as changes in the number of visits in an episode, the mix of services in an episode, the level of intensity of services in an episode, the average cost of providing care per episode, and other relevant factors. In addition, section 3131(a) of the Affordable Care Act mandates that rebasing must be phased-in over a 4-year period in equal increments, not to exceed 3.5 percent of the amount (or amounts) as of the date of enactment (2010) under section 1895(b)(3)(A)(i)(III) of the Act, and be fully implemented in CY 2017.

The proposed HHVBP model would apply a payment adjustment based on an HHA's performance on quality measures to test the effects on quality and costs of care. This proposed HHVBP model was developed based on the experiences we gained from the implementation of the Home Health Pay-for-Performance (HHPP) demonstration as well as the successful implementation of the HVBP program. The model design was also developed from the public comments received on the discussion of a HHVBP model being considered in the CY 2015 HH PPS proposed and final rules. Value-based purchasing programs have also been included in the President's budget for most provider types, including home health.

B. Overall Impact

We have examined the impacts of this rule as required by Executive Order 12866 on Regulatory Planning and Review (September 30, 1993), Executive Order 13563 on Improving Regulation and Regulatory Review (January 18, 2011), the Regulatory Flexibility Act (RFA) (September 19, 1980, Pub. L. 96-354), section 1102(b) of the Act, section 202 of the Unfunded Mandates Reform Act of 1995 (UMRA, March 22, 1995; Pub. L. 104-4), Executive Order 13132 on Federalism (August 4, 1999), and the Congressional Review Act (5 U.S.C. 804(2)).

Executive Orders 12866 and 13563 direct agencies to assess all costs and benefits of available regulatory alternatives and, if regulation is necessary, to select regulatory approaches that maximize net benefits (including potential economic, environmental, public health and safety effects, distributive impacts, and equity). Executive Order 13563 emphasizes the importance of quantifying both costs and benefits, of reducing costs, of harmonizing rules, and of promoting flexibility. The net transfer impacts related to the proposed changes in payments under the HH PPS for CY 2016 are estimated to be −$350 million. The savings impacts related to the proposed HHVBP model are estimated at a total projected 5-year gross savings of $380 million assuming a very conservative savings estimate of a 6 percent annual reduction in hospitalizations and a 1.0 percent annual reduction in SNF admissions. In accordance with the provisions of Executive Order 12866, this regulation was reviewed by the Office of Management and Budget.

1. HH PPS

The update set forth in this rule applies to Medicare payments under HH PPS in CY 2016. Accordingly, the following analysis describes the impact in CY 2016 only. We estimate that the net impact of the proposals in this rule is approximately $350 million in decreased payments to HHAs in CY 2016. We applied a wage index budget neutrality factor and a case-mix weights budget neutrality factor to the rates as discussed in section III.C.3 of this proposed rule; therefore, the estimated impact of the 2016 wage index proposed in section III.C.3 of this proposed rule and the recalibration of the case-mix weights for 2016 proposed in section III.B. of this proposed rule is zero. The −$350 million impact reflects the distributional effects of the 2.3 percent HH payment update percentage ($420 million increase), the effects of the third year of the four-year phase-in of the rebasing adjustments to the national, standardized 60-day episode payment amount, the national per-visit payment rates, and the NRS conversion factor for an impact of −2.5 percent ($470 million decrease), and the effects of the −1.72 percent adjustment for nominal case-mix growth ($300 million decrease). The $350 million in decreased payments is reflected in the last column of the first row in Table 24 as a 1.8 percent decrease in expenditures when comparing CY 2015 payments to estimated CY 2016 payments.
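The net estimate above can be checked with a rough arithmetic summary using the rounded amounts stated in this paragraph (the precise estimate reflects unrounded calculations):

```latex
% Rounded components of the estimated CY 2016 net impact
\[
\underbrace{+\$420\text{M}}_{2.3\%\ \text{payment update}}
- \underbrace{\$470\text{M}}_{\text{rebasing},\ -2.5\%}
- \underbrace{\$300\text{M}}_{\text{nominal case-mix},\ -1.72\%}
\approx -\$350\text{M}
\]
% The 2.3 percent payment update is the 2.9 percent market basket increase less
% the 0.6 percentage point multifactor productivity adjustment: 2.9 - 0.6 = 2.3.
```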

The RFA requires agencies to analyze options for regulatory relief of small entities, if a rule has a significant impact on a substantial number of small entities. For purposes of the RFA, small entities include small businesses, nonprofit organizations, and small governmental jurisdictions. Most hospitals and most other providers and suppliers are small entities, either by nonprofit status or by having revenues of less than $7.5 million to $38.5 million in any one year. For the purposes of the RFA, we estimate that almost all HHAs are small entities as that term is used in the RFA. Individuals and states are not included in the definition of a small entity. The economic impact assessment is based on estimated Medicare payments (revenues) and HHS's practice in interpreting the RFA is to consider effects economically “significant” only if greater than 5 percent of providers reach a threshold of 3 to 5 percent or more of total revenue or total costs. The majority of HHAs' visits are Medicare-paid visits and therefore the majority of HHAs' revenue consists of Medicare payments. Based on our analysis, we conclude that the policies proposed in this rule will result in an estimated total impact of 3 to 5 percent or more on Medicare revenue for greater than 5 percent of HHAs. Therefore, the Secretary has determined that this HH PPS proposed rule will have a significant economic impact on a substantial number of small entities. Further detail is presented in Table 24, by HHA type and location.

With regard to options for regulatory relief, we note that in the CY 2014 HH PPS final rule we finalized rebasing adjustments to the national, standardized 60-day episode rate, the non-routine supplies (NRS) conversion factor, and the national per-visit payment rates for each year, 2014 through 2017, as described in sections II.C and III.C.3 of this proposed rule. Since the rebasing adjustments are mandated by section 3131(a) of the Affordable Care Act, we cannot offer HHAs relief from the rebasing adjustments for CY 2016. For the proposed reduction to the national, standardized 60-day episode payment amount of 1.72 percent for CY 2016 described in section III.B.2 of this proposed rule, we believe it is appropriate to reduce the national, standardized 60-day episode payment amount to account for the estimated increase in nominal case-mix in order to move towards more accurate payment for the delivery of home health services, where payments better align with the costs of providing such services. In the alternatives considered section below, we note that we considered proposing the full 3.41 percent reduction to the 60-day episode rate in CY 2016 to account for nominal case-mix growth between CY 2012 and CY 2014. However, we instead proposed to reduce the 60-day episode rate by 1.72 percent in CY 2016 and 1.72 percent in CY 2017 to account for estimated nominal case-mix growth between CY 2012 and CY 2014.

Executive Order 13563 specifies that, to the extent practicable, agencies should assess the costs of cumulative regulations. However, given potential utilization pattern changes, wage index changes, changes to the market basket forecasts, and unknowns regarding future policy changes, we believe it is neither practicable nor appropriate to forecast the cumulative impact of the rebasing adjustments on Medicare payments to HHAs for future years at this time. Changes to the Medicare program may continue to be made as a result of the Affordable Care Act, or new statutory provisions. Although these changes may not be specific to the HH PPS, the nature of the Medicare program is such that the changes may interact, and the complexity of the interaction of these changes would make it difficult to predict accurately the full scope of the impact upon HHAs for future years beyond CY 2016. We note that the rebasing adjustments to the national, standardized 60-day episode payment rate and the national per-visit rates are capped at the statutory limit of 3.5 percent of the CY 2010 amounts (as described in the preamble in section II.C. of this proposed rule) for each year, 2014 through 2017. The NRS rebasing adjustment will be −2.82 percent in each year, 2014 through 2017.

In addition, section 1102(b) of the Act requires us to prepare a RIA if a rule may have a significant impact on the operations of a substantial number of small rural hospitals. This analysis must conform to the provisions of section 603 of RFA. For purposes of section 1102(b) of the Act, we define a small rural hospital as a hospital that is located outside of a metropolitan statistical area and has fewer than 100 beds. This proposed rule applies to HHAs. Therefore, the Secretary has determined that the HH PPS proposed rule will not have a significant economic impact on the operations of small rural hospitals.

2. Proposed HHVBP Model

To test the impact of upside and downside value-based payment adjustments, beginning in calendar year 2018 and in each succeeding calendar year through calendar year 2022, the proposed model would adjust the final claim payment amount for a home health agency for each episode in a calendar year by an amount equal to the applicable percent. For purposes of this proposed rule, we have limited our analysis of the economic impacts to the value-based incentive payment adjustments. Under the proposed model design, the incentive payment adjustments would be limited to the total payment reductions to home health agencies included in the model and would be no less than the total amount available for value-based incentive payment adjustment. Overall, the distributive impact of this proposed rule is estimated at $380 million for CY 2018-2022. Therefore, this proposed rule is economically significant and thus a major rule under the Congressional Review Act. The proposed model would test the effect on quality and costs of care by applying payment adjustments based on HHAs' performance on quality measures. This proposed rule was developed based on extensive research and experience with value-based purchasing models.
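A minimal sketch of how an adjustment of this kind could be applied to a final claim payment amount, assuming the adjustment is a simple percentage of the otherwise-payable amount; the function and its handling of the annual caps are illustrative and do not describe the model's actual payment mechanics:

```python
# Proposed maximum payment adjustment percentages by model performance year
# (CY 2018 through CY 2022), as described in this section.
MAX_ADJUSTMENT_BY_YEAR = {2018: 5.0, 2019: 5.0, 2020: 6.0, 2021: 8.0, 2022: 8.0}

def apply_hhvbp_adjustment(final_claim_payment: float,
                           applicable_percent: float,
                           calendar_year: int) -> float:
    """Apply an upward or downward value-based adjustment to a final claim payment.

    This is a simplified illustration: the applicable percent is clamped to the
    proposed cap for the calendar year and applied multiplicatively.
    """
    cap = MAX_ADJUSTMENT_BY_YEAR[calendar_year]
    pct = max(-cap, min(cap, applicable_percent))
    return final_claim_payment * (1 + pct / 100.0)

# Example: a $2,500 episode payment with a -3 percent adjustment in CY 2018 pays $2,425.
print(apply_hhvbp_adjustment(2500.00, -3.0, 2018))
```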

Guidance issued by the Department of Health and Human Services interpreting the Regulatory Flexibility Act considers effects economically "significant" only if greater than 5 percent of providers reach a threshold of 3 to 5 percent or more of total revenue or total costs. Among the over 1,900 HHAs in the selected states that would be expected to be included in the proposed HHVBP model, we estimate that the payment adjustment resulting from this proposed rule would be a downward adjustment of greater than 5 percent for only 10 percent of the HHAs included in the model (using the 8 percent maximum payment adjustment threshold applied in CY 2021 and CY 2022). As a result, only 2 percent of all HHA providers nationally would be significantly impacted, falling well below the RFA threshold. In addition, the only HHAs that would be impacted with lower payments are those providers that provide the poorest quality of care, which is the main tenet of the model. This falls well below the threshold for economic significance established by HHS for requiring a more detailed impact assessment under the RFA. Thus, we are not preparing an analysis under the RFA because the Secretary has determined that this proposed rule would not have a significant economic impact on a substantial number of small entities.

In addition, section 1102(b) of the Act requires us to prepare a regulatory impact analysis if a rule may have a significant impact on the operations of a substantial number of small rural HHAs. This analysis must conform to the provisions of section 603 of the RFA. For purposes of section 1102(b) of the Act, we have identified less than 5 percent of HHAs included in the proposed selected states that primarily serve beneficiaries that reside in rural areas (greater than 50 percent of beneficiaries served). We are not preparing an analysis under section 1102(b) of the Act because the Secretary has determined that the proposed HHVBP model would not have a significant impact on the operations of a substantial number of small rural HHAs.

Section 202 of the Unfunded Mandates Reform Act of 1995 also requires that agencies assess anticipated costs and benefits before issuing any rule whose mandates require spending in any 1 year of $100 million in 1995 dollars, updated annually for inflation. In 2015, that threshold is approximately $144 million. This rule will have no consequential effect on state, local, or tribal governments or on the private sector.

Executive Order 13132 establishes certain requirements that an agency must meet when it promulgates a proposed rule (and subsequent final rule) that imposes substantial direct requirement costs on State and local governments, preempts state law, or otherwise has Federalism implications. Since this regulation does not impose any costs on state or local governments, the requirements of Executive Order 13132 are not applicable.

In accordance with the provisions of Executive Order 12866, this regulation was reviewed by the Office of Management and Budget.

C. Detailed Economic Analysis

1. HH PPS

This proposed rule sets forth updates for CY 2016 to the HH PPS rates contained in the CY 2015 HH PPS final rule (79 FR 66032 through 66118). The impact analysis of this proposed rule presents the estimated expenditure effects of policy changes proposed in this rule. We use the latest data and best analysis available, but we do not make adjustments for future changes in such variables as number of visits or case-mix.

This analysis incorporates the latest estimates of growth in service use and payments under the Medicare HH benefit, based primarily on preliminary Medicare claims data from 2014. We note that certain events may combine to limit the scope or accuracy of our impact analysis, because such an analysis is future-oriented and, thus, susceptible to errors resulting from other changes in the impact time period assessed. Some examples of such possible events are newly-legislated general Medicare program funding changes made by the Congress, or changes specifically related to HHAs. In addition, changes to the Medicare program may continue to be made as a result of the Affordable Care Act, or new statutory provisions. Although these changes may not be specific to the HH PPS, the nature of the Medicare program is such that the changes may interact, and the complexity of the interaction of these changes could make it difficult to predict accurately the full scope of the impact upon HHAs.

Table 24 shows how HHA revenues are likely to be affected by the policy changes proposed in this rule. For this analysis, we used an analytic file with linked CY 2014 HH claims data (as of December 31, 2014) for dates of service that ended on or before December 31, 2014, and OASIS assessments. The first column of Table 24 classifies HHAs according to a number of characteristics including provider type, geographic region, and urban and rural locations. The second column shows the number of facilities in the impact analysis. The third column shows the payment effects of the proposed CY 2016 wage index. The fourth column shows the payment effects of the proposed CY 2016 case-mix weights. The fifth column shows the effects of the proposed reduction of 1.72 percent to the national, standardized 60-day episode payment amount to account for nominal case-mix growth. The sixth column shows the effects of the rebasing adjustments to the national, standardized 60-day episode payment rate, the national per-visit payment rates, and the NRS conversion factor. For CY 2016, the average impact for all HHAs due to the effects of rebasing is an estimated 2.5 percent decrease in payments. The seventh column shows the effects of the CY 2016 home health payment update percentage (the home health market basket update adjusted for multifactor productivity as discussed in section III.C.1. of this proposed rule).

The last column shows the combined effects of all the proposed policies for HH PPS. Overall, it is projected that aggregate payments in CY 2016 will decrease by 1.8 percent. As illustrated in Table 24, the combined effects of all of the changes vary by specific types of providers and by location. We note that some individual HHAs within the same group may experience different impacts on payments than others due to the distributional impact of the CY 2016 wage index, the extent to which HHAs had episodes in case-mix groups where the case-mix weight decreased for CY 2016 relative to CY 2015, the percentage of total HH PPS payments that were subject to the low-utilization payment adjustment (LUPA) or paid as outlier payments, and the degree of Medicare utilization.
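As a rough cross-check of Table 24 (an observation about the rounded table values, not a statement of the estimation methodology, which is based on simulated claims), the Total column is consistent with summing the individual column effects. For the All Agencies row:

```latex
% All Agencies row of Table 24 (rounded percent changes)
\[
0.0 + 0.0 + (-1.6) + (-2.5) + 2.3 = -1.8
\]
```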

TABLE 24—Estimated Home Health Agency Impacts by Facility Type and Area of the Country, CY 2016

Number of agencies CY 2016 wage index (percent) CY 2016 case-mix weights (percent) 60-day episode rate nominal case-mix reduction (percent) Rebasing (percent) HH payment update percentage (percent) Total (percent)
All Agencies 11,432 0.0 0.0 −1.6 −2.5 2.3 −1.8
Facility Type and Control
Free-Standing/Other Vol/NP 1,054 0.2 −0.2 −1.6 −2.5 2.3 −1.8
Free-Standing/Other Proprietary 8,917 0.0 0.0 −1.6 −2.5 2.3 −1.8
Free-Standing/Other Government 379 −0.2 −0.1 −1.6 −2.5 2.3 −2.1
Facility-Based Vol/NP 741 0.1 −0.2 −1.6 −2.5 2.3 −1.9
Facility-Based Proprietary 116 −0.3 −0.1 −1.6 −2.5 2.3 −2.2
Facility-Based Government 225 −0.2 −0.2 −1.6 −2.5 2.3 −2.2
Subtotal: Freestanding 10,350 0.0 0.0 −1.6 −2.5 2.3 −1.8
Subtotal: Facility-based 1,082 0.0 −0.2 −1.6 −2.5 2.3 −2.0
Subtotal: Vol/NP 1,795 0.1 −0.2 −1.6 −2.5 2.3 −1.9
Subtotal: Proprietary 9,033 0.0 0.0 −1.6 −2.5 2.3 −1.8
Subtotal: Government 604 −0.2 −0.1 −1.6 −2.5 2.3 −2.1
Facility Type and Control: Rural
Free-Standing/Other Vol/NP 188 −0.8 −0.2 −1.6 −2.4 2.3 −2.7
Free-Standing/Other Proprietary 143 −0.2 −0.1 −1.6 −2.5 2.3 −2.1
Free-Standing/Other Government 448 −0.5 −0.1 −1.6 −2.5 2.3 −2.4
Facility-Based Vol/NP 231 −0.6 −0.2 −1.6 −2.5 2.3 −2.6
Facility-Based Proprietary 25 0.0 −0.2 −1.6 −2.5 2.3 −2.0
Facility-Based Government 136 −0.4 −0.1 −1.6 −2.5 2.3 −2.3
Facility Type and Control: Urban
Free-Standing/Other Vol/NP 912 0.2 −0.2 −1.6 −2.5 2.3 −1.8
Free-Standing/Other Proprietary 8,604 0.0 0.0 −1.6 −2.5 2.3 −1.8
Free-Standing/Other Government 152 −0.4 −0.1 −1.6 −2.5 2.3 −2.3
Facility-Based Vol/NP 510 0.2 −0.2 −1.6 −2.5 2.3 −1.8
Facility-Based Proprietary 91 −0.3 −0.1 −1.6 −2.4 2.3 −2.1
Facility-Based Government 89 −0.1 −0.2 −1.6 −2.5 2.3 −2.1
Facility Location: Urban or Rural
Rural 1,074 −0.5 −0.1 −1.6 −2.5 2.3 −2.4
Urban 10,358 0.1 0.0 −1.6 −2.5 2.3 −1.7
Facility Location: Region of the Country
Northeast 837 0.2 −0.1 −1.6 −2.4 2.3 −1.6
Midwest 3,044 −0.1 0.0 −1.6 −2.5 2.3 −1.9
South 5,623 −0.1 0.0 −1.6 −2.5 2.3 −1.9
West 1,837 0.4 −0.1 −1.6 −2.5 2.3 −1.5
Other 91 0.4 0.1 −1.6 −2.5 2.3 −1.3
Facility Location: Region of the Country (Census Region)
New England 296 0.2 −0.1 −1.6 −2.4 2.3 −1.6
Mid Atlantic 541 0.3 −0.1 −1.6 −2.5 2.3 −1.6
East North Central 2,407 −0.1 0.0 −1.6 −2.6 2.3 −2.0
West North Central 637 0.0 0.0 −1.6 −2.5 2.3 −1.8
South Atlantic 1,826 0.2 0.1 −1.6 −2.5 2.3 −1.5
East South Central 444 −0.4 0.0 −1.6 −2.6 2.3 −2.3
West South Central 3,353 −0.2 −0.1 −1.6 −2.5 2.3 −2.1
Mountain 602 0.2 0.0 −1.6 −2.5 2.3 −1.6
Pacific 1,235 0.5 −0.2 −1.6 −2.5 2.3 −1.5
Facility Size (Number of 1st Episodes)
< 100 episodes 3,171 0.1 −0.1 −1.6 −2.5 2.3 −1.8
100 to 249 2,861 0.1 0.0 −1.6 −2.5 2.3 −1.7
250 to 499 2,425 0.1 0.0 −1.6 −2.5 2.3 −1.7
500 to 999 1,679 0.0 0.0 −1.6 −2.5 2.3 −1.8
1,000 or More 1,296 0.0 −0.1 −1.6 −2.5 2.3 −1.9
Source: CY 2014 Medicare claims data for episodes ending on or before December 31, 2014 (as of December 31, 2014) for which we had a linked OASIS assessment.
The impact of the proposed CY 2016 home health wage index is offset by the wage index budget neutrality factor described in section III.C.3 of this proposed rule.
The impact of the proposed CY 2016 home health case-mix weights reflects the recalibration of the case-mix weights as outlined in section III.B.1 of this proposed rule offset by the case-mix weights budget neutrality factor described in section III.C.3 of this proposed rule.
The impact of rebasing includes the rebasing adjustments to the national, standardized 60-day episode payment rate (−2.74 percent after the CY 2016 payment rate was adjusted for the wage index and case-mix weight budget neutrality factors and the nominal case-mix reduction), the national per-visit rates (+2.9 percent), and the NRS conversion factor (−2.82 percent). The estimated impact of the NRS conversion factor rebasing adjustment is an overall −0.01 percent decrease in estimated payments to HHAs.
The CY 2016 home health payment update percentage reflects the home health market basket update of 2.9 percent, reduced by a 0.6 percentage point multifactor productivity (MFP) adjustment as required under section 1895(b)(3)(B)(vi)(I) of the Act, as described in section III.C.1 of this proposed rule.
Region Key:
New England=Connecticut, Maine, Massachusetts, New Hampshire, Rhode Island, Vermont;
Middle Atlantic=Pennsylvania, New Jersey, New York;
South Atlantic=Delaware, District of Columbia, Florida, Georgia, Maryland, North Carolina, South Carolina, Virginia, West Virginia;
East North Central=Illinois, Indiana, Michigan, Ohio, Wisconsin;
East South Central=Alabama, Kentucky, Mississippi, Tennessee;
West North Central=Iowa, Kansas, Minnesota, Missouri, Nebraska, North Dakota, South Dakota;
West South Central=Arkansas, Louisiana, Oklahoma, Texas;
Mountain=Arizona, Colorado, Idaho, Montana, Nevada, New Mexico, Utah, Wyoming;
Pacific=Alaska, California, Hawaii, Oregon, Washington;
Other=Guam, Puerto Rico, Virgin Islands

2. Proposed HHVBP Model

Table 25 displays our analysis of the distribution of possible payment adjustments at the 5 percent, 6 percent and 8 percent rates that are being proposed in the model based on 2013-2014 data, providing information on the estimated impact of this proposed rule. We note that this impact analysis is based on the aggregate value of all 9 states identified in section IV.C.2. of this proposed rule by applying the proposed state selection methodology.

Table 26 displays our analysis of the distribution of possible payment adjustments based on 2013-2014 data, providing information on the estimated impact of this proposed rule. We note that this impact analysis is based on the aggregate value of all nine states (identified in section IV.C.2. of this proposed rule) by applying the proposed state selection methodology.

If our methodology is finalized as proposed, all Medicare-certified HHAs that provide services in Massachusetts, Maryland, North Carolina, Florida, Washington, Arizona, Iowa, Nebraska, and Tennessee will be required to compete in this model. However, should the methodology we propose in this rule change as a result of comments received during the rulemaking process, it could result in different states being selected for the model. In such an event, we would apply the final methodology and announce the selected states in the final rule. The estimates presented here may also change accordingly.

Value-based incentive payment adjustments for the estimated 1,900-plus HHAs in the proposed selected states that would compete in the HHVBP model are stratified by size, as defined in section F. For example, Arizona has 31 HHAs that do not provide services to enough beneficiaries to be required to complete CAHPS surveys and are therefore considered lower-volume under the proposed model. Using 2013-2014 data and the 5 percent maximum payment adjustment (which we propose to be applied in CYs 2018 and 2019), based on 10 process and outcome measures currently available on Home Health Compare, the smaller-volume HHAs in Arizona would have a mean payment adjustment of positive 0.64 percent. Only 10 percent of these home health agencies would be subject to downward payment adjustments of more than 3.3 percent.

The next columns provide the distribution of scores by percentile; we see that the value-based incentive payment percentages for smaller-volume home health agencies in Arizona range from −3.3 percent at the 10th percentile to +5.0 percent at the 90th percentile, while the value-based incentive payment adjustment at the 50th percentile is 0.56 percent.
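The percentile columns in Tables 25 through 27 summarize distributions of simulated payment adjustments across HHAs. A generic sketch of how such a distribution could be summarized is shown below; the data values are invented and the use of numpy is an assumption about tooling, not a description of the actual analysis:

```python
import numpy as np

# Invented payment adjustments (percent) for a hypothetical cohort of HHAs.
adjustments = np.array([-3.3, -2.7, -2.2, -0.8, 0.0, 0.6, 1.3, 3.4, 4.8, 5.0])

# Deciles of the distribution, analogous to the 10th through 90th percentile columns.
deciles = np.percentile(adjustments, range(10, 100, 10))
print(np.round(deciles, 2))
print("mean adjustment:", round(float(adjustments.mean()), 2))
```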

The smaller-volume HHA cohort table shows that some consideration will have to be made for MD, WA, and TN, where there are too few HHAs in the smaller-volume cohort; these HHAs would be included in the larger-volume cohort without being measured on HHCAHPS.

Table 27 provides the payment adjustment distribution based on the proportion of dual-eligible beneficiaries, average case mix (using HCC scores), the proportion of beneficiaries residing in rural areas, and HHA organizational status. Aside from the observation that serving a higher proportion of dually-eligible beneficiaries is related to better performance, the payment adjustment distribution is consistent across these four categories.

The TPS and the payment methodology at the state and size level were applied so that each home health agency's payment adjustment was calculated as it would be under the model. Hence, the values in each separate analysis in the tables are representative of what they would be if the baseline year were 2013 and the performance year were 2014.

Of the 1,991 HHAs found in the HHA data sources for the nine selected states, 1,931 yielded sufficient measures to be included in the model. It is expected that a certain number of HHAs will not be subject to the payment adjustment because they may serve too small a population to report on an adequate number of measures to calculate a TPS.

Table 25—Adjustment Distribution by Percentile Level of Quality Total Performance Score at Different Model Payment Adjustment Rates

Payment adjustment distribution                        Range    10th     20th     30th     40th     50th     60th     70th     80th     90th
                                                                pctile*  pctile*  pctile*  pctile*  pctile*  pctile*  pctile*  pctile*  pctile*
5% Payment Adjustment for Year 1 and Year 2 of Model    7.69    −2.98    −2.04    −1.23    −0.54     0.15     0.83     1.74     3.08     4.71
6% Payment Adjustment for Year 3 of Model               9.24    −3.60    −2.46    −1.50    −0.66     0.18     1.02     2.10     3.72     5.64
8% Payment Adjustment for Year 4 and Year 5 of Model   12.31    −4.77    −3.27    −1.97    −0.86     0.25     1.33     2.78     4.92     7.54
*pctile = percentile. Percentile columns run from the lowest-quality providers (10th percentile) to the highest-quality providers (90th percentile).

Table 26—HHA Cohort Payment Adjustment Distributions by State

[Based on a 5 percent payment adjustment]

State   Number     Average payment      10%      20%      30%      40%      50%      60%      70%      80%      90%
        of HHAs    adjustment (%)

Smaller-Volume HHA Cohort by State
AZ         31         0.64            −3.33    −2.72    −2.17    −0.82     0.56     1.31     3.36     4.75     5.00
FL        353         0.44            −3.01    −1.76    −1.00    −0.39     0.21     0.94     1.84     3.04     4.38
IA         23         0.17            −3.14    −2.53    −2.01    −1.41    −0.97     0.31     2.74     3.25     5.00
MA         29         0.39            −3.68    −1.75    −0.70    −0.10     0.39     0.79     1.33     2.46     4.68
MD          2        −0.47            −2.71    −2.71    −2.71    −2.71    −0.47     1.78     1.78     1.78     1.78
NC          9         0.72            −2.38    −1.84    −1.41    −1.23    −0.68     0.34     3.67     5.00     5.00
NE         16        −0.51            −2.26    −1.80    −1.64    −1.43    −1.13    −0.44     0.40     0.42     1.46
TN          2         2.48            −0.05    −0.05    −0.05    −0.05     2.48     5.00     5.00     5.00     5.00
WA          1         0.00             0.00     0.00     0.00     0.00     0.00     0.00     0.00     0.00     0.00

Larger-Volume HHA Cohort by State
AZ         82         0.39            −3.31    −2.75    −2.19    −0.81     0.56     1.31     3.38     4.75     5.00
FL        672         0.41            −3.00    −1.75    −1.60    −0.38     0.19     0.94     1.81     3.06     4.38
IA        129        −0.31            −3.13    −2.31    −2.70    −1.13    −0.56     0.13     0.56     1.19     3.50
MA        101         0.64            −2.88    −2.19    −1.50    −0.38     0.63     1.25     2.06     3.81     4.88
MD         50         0.41            −2.75    −2.06    −2.30    −0.88     0.00     0.81     2.38     2.94     4.13
NC        163         0.65            −2.75    −1.56    −1.30    −0.06     0.38     0.94     1.88     3.06     4.88
NE         48         0.37            −2.63    −2.19    −1.40    −0.56    −0.19     0.50     1.31     2.31     5.00
TN        134         0.39            −2.56    −1.81    −2.00    −0.63    −0.06     0.81     1.44     2.50     4.69
WA         55         0.39            −2.75    −1.63    −2.00    −0.94    −0.19     0.69     1.94     3.31     4.06

Table 27—Payment Adjustment Distributions by Characteristics

[Based on a 5 percent payment adjustment]

Characteristic                     Number      10%      20%      30%      40%      50%      60%      70%      80%      90%
                                   of HHAs

Percentage Dually-Eligible:
  Low % Dually-eligible               498    −3.21    −2.57    −1.86    −1.29    −0.60     0.12     0.78     2.13     3.97
  Medium % Dually-eligible            995    −2.91    −2.10    −1.33    −0.63     0.01     0.67     1.39     2.47     4.12
  High % Dually-eligible              498    −2.46    −1.04    −0.24     0.59     1.29     2.34     3.38     4.53     5.00
Acuity (HCC):
  Low acuity                          499    −2.83    −1.76    −0.94    −0.23     0.46     1.16     2.03     3.40     5.00
  Middle acuity                       993    −3.05    −2.08    −1.24    −0.50     0.19     0.90     1.71     2.81     4.51
  High acuity                         499    −3.04    −2.04    −1.29    −0.51     0.26     1.06     2.00     3.16     4.91
% Rural Beneficiaries:
  All non-rural                       800    −2.81    −1.51    −0.66     0.08     0.78     1.54     2.64     3.94     5.00
  Up to 35% rural                     925    −3.12    −2.37    −1.71    −1.01    −0.42     0.32     1.18     2.24     3.97
  Over 35% rural                      250    −2.91    −2.01    −1.17    −0.62    −0.11     0.56     1.32     2.86     4.58
Organizational Type:
  Church                               62    −2.92    −2.04    −1.33    −0.46     0.12     0.64     1.30     2.58     4.22
  Private Not-For-Profit              194    −2.78    −1.74    −0.97    −0.42     0.27     0.85     1.77     2.89     4.55
  Other                                93    −2.62    −1.68    −0.95    −0.38     0.36     1.08     1.86     3.09     4.63
  Private For-Profit                1,538    −3.09    −2.08    −1.27    −0.53     0.24     1.02     1.88     3.02     4.83
  Federal                              83    −2.44    −1.61    −0.67     0.01     0.53     1.13     1.80     3.09     4.58
  State                                 5    −3.03    −1.11    −0.37    −0.01     0.24     0.42     1.66     2.96     3.24
  Local                                61    −2.30    −1.28    −0.48     0.16     0.98     1.91     2.88     4.11     5.00

D. Alternatives Considered

As described in section III.B.2 of this proposed rule, we considered proposing to reduce the national, standardized 60-day episode payment rate by 3.41 percent in CY 2016 to account for nominal case-mix growth between CY 2012 and CY 2014. If we were to reduce the national, standardized 60-day episode payment rate by 3.41 percent in CY 2016, we estimate that the aggregate impact would be a net decrease of $650 million in payments to HHAs, reflecting a $470 million decrease (−2.5 percent) from the third year of the rebasing adjustments mandated by the Affordable Care Act, a $420 million increase (2.3 percent) from the home health payment update percentage, and a $600 million decrease from the 3.41 percent reduction to the national, standardized 60-day episode payment rate. However, instead of proposing a one-time 3.41 percent reduction in CY 2016, we are proposing to reduce the national, standardized 60-day episode payment rate by 1.72 percent in CY 2016 and by 1.72 percent in CY 2017 to account for nominal case-mix growth from CY 2012 through CY 2014, as outlined in section III.B.2 of this proposed rule.
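
As a rough arithmetic check on the alternative described above, the following minimal sketch adds the component estimates cited in this section; the figures are the rounded estimates from this proposed rule, expressed in millions of dollars.

    # Rough check of the alternative's estimated aggregate CY 2016 impact
    # (rounded component estimates from this section, in millions of dollars).
    rebasing_adjustment = -470       # third year of rebasing adjustments (-2.5 percent)
    payment_update = 420             # CY 2016 HH payment update percentage (2.3 percent)
    one_time_case_mix_cut = -600     # one-time 3.41 percent case-mix reduction (alternative)

    net_impact = rebasing_adjustment + payment_update + one_time_case_mix_cut
    print(f"Estimated net CY 2016 impact of the alternative (millions): {net_impact}")
    # Prints -650, consistent with the estimated $650 million net decrease above.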

Section 3131(a) of the Affordable Care Act mandates that starting in CY 2014, the Secretary must apply an adjustment to the national, standardized 60-day episode payment rate and other amounts applicable under section 1895(b)(3)(A)(i)(III) of the Act to reflect factors such as changes in the number of visits in an episode, the mix of services in an episode, the level of intensity of services in an episode, the average cost of providing care per episode, and other relevant factors. In addition, section 3131(a) of the Affordable Care Act mandates that rebasing must be phased in over a 4-year period in equal increments, not to exceed 3.5 percent of the amount (or amounts) as of the date of enactment (2010) under section 1895(b)(3)(A)(i)(III) of the Act, and be fully implemented in CY 2017. Therefore, in the CY 2014 HH PPS final rule (78 FR 72256), we finalized rebasing adjustments to the national, standardized 60-day episode payment amount, the national per-visit rates, and the NRS conversion factor. As we noted in the CY 2014 HH PPS final rule, because section 3131(a) of the Affordable Care Act requires a 4-year phase-in of rebasing, in equal increments, to start in CY 2014 and be fully implemented in CY 2017, we do not have the discretion to delay, change, or eliminate the rebasing adjustments once we have determined that rebasing is necessary (78 FR 72283).

Section 1895(b)(3)(B) of the Act requires that the standard prospective payment amounts for CY 2016 be increased by a factor equal to the applicable HH market basket update for those HHAs that submit quality data as required by the Secretary. Section 3401(e) of the Affordable Care Act requires that, in CY 2015 and in subsequent calendar years, the market basket update under the HHA prospective payment system, as described in section 1895(b)(3)(B) of the Act, be annually adjusted by changes in economy-wide productivity. Specifically, section 1895(b)(3)(B)(vi)(I) of the Act, as amended by section 3401(e) of the Affordable Care Act, requires the application of the productivity adjustment described in section 1886(b)(3)(B)(xi)(II) of the Act to the HHA PPS for CY 2015 and each subsequent calendar year. The −0.6 percentage point productivity adjustment to the proposed CY 2016 home health market basket update of 2.9 percent is discussed in the preamble of this rule and is not discretionary, as it is required by section 1895(b)(3)(B)(vi)(I) of the Act (as amended by the Affordable Care Act).
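
The proposed CY 2016 HH payment update percentage used throughout this analysis follows directly from these provisions; the brief sketch below shows that arithmetic using the proposed estimates in this rule, which remain subject to revision if more recent data become available.

    # Proposed CY 2016 home health payment update percentage.
    market_basket_update = 2.9        # proposed CY 2016 HH market basket update, in percent
    productivity_adjustment = 0.6     # statutory productivity adjustment, in percentage points

    payment_update = market_basket_update - productivity_adjustment
    print(f"Proposed CY 2016 HH payment update: {payment_update:.1f} percent")  # 2.3 percent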

We invite comments on the alternatives discussed in this analysis.

E. Accounting Statement and Table

As required by OMB Circular A-4 (available at http://www.whitehouse.gov/omb/circulars_a004_a-4), we have prepared an accounting statement in Table 28 showing the classification of the transfers and costs associated with the HH PPS provisions of this proposed rule. Table 28 provides our best estimate of the decrease in Medicare payments under the HH PPS as a result of the HH PPS changes presented in this proposed rule.

Table 28—Accounting Statement: HH PPS Classification of Estimated Transfers and Costs, From CY 2015 to CY 2016 *

Category                           Transfers
Annualized Monetized Transfers     −$350 million.
From Whom to Whom?                 Federal Government to HHAs.
* The estimates reflect 2016 dollars.

Table 29 provides our best estimate of the decrease in Medicare payments under the proposed HHVBP model.

Table 29—Accounting Statement: HHVBP Model Classification of Estimated Transfers and Costs for CYs 2018-2022

Category                           Transfers
Annualized Monetized Transfers     −$380 million.
From Whom to Whom?                 Federal Government to Hospitals and SNFs.

F. Conclusion

1. HH PPS

In conclusion, we estimate that the net impact of the HH PPS proposals in this rule is a decrease in Medicare payments to HHAs of $350 million for CY 2016. The $350 million decrease in estimated payments for CY 2016 reflects the effects of the 2.3 percent CY 2016 HH payment update percentage ($420 million increase), the proposed 1.72 percent reduction to the national, standardized 60-day episode payment rate in CY 2016 to account for nominal case-mix growth ($300 million decrease), and the −2.5 percent adjustment for the third year of the 4-year phase-in of the rebasing adjustments required by section 3131(a) of the Affordable Care Act ($470 million decrease). This analysis, together with the remainder of this preamble, provides an initial Regulatory Flexibility Analysis.

2. Proposed HHVBP Model

In conclusion, we estimate that the proposed HHVBP model will have no net impact on Medicare payments to HHAs for CY 2016. However, the overall economic impact of the HHVBP model provision is an estimated $380 million in total savings from a reduction in unnecessary hospitalizations and SNF usage as a result of greater quality improvements in the HH industry over the life of the proposed model.

IX. Federalism Analysis

Executive Order 13132 on Federalism (August 4, 1999) establishes certain requirements that an agency must meet when it promulgates a final rule that imposes substantial direct requirement costs on state and local governments, preempts state law, or otherwise has Federalism implications. We have reviewed this proposed rule under the threshold criteria of Executive Order 13132, Federalism, and have determined that it will not have substantial direct effects on the rights, roles, and responsibilities of state, local, or tribal governments.

List of Subjects

42 CFR Part 409

  • Health facilities
  • Medicare

42 CFR Part 424

  • Emergency medical services
  • Health facilities
  • Health professions
  • Medicare
  • Reporting and recordkeeping requirements

42 CFR Part 484

  • Health facilities
  • Health professions
  • Medicare
  • Reporting and recordkeeping requirements

For the reasons set forth in the preamble, the Centers for Medicare & Medicaid Services proposes to amend 42 CFR chapter IV as set forth below:

PART 409—HOSPITAL INSURANCE BENEFITS

1. The authority citation for part 409 continues to read as follows:

Authority: Secs. 1102 and 1871 of the Social Security Act (42 U.S.C. 1302 and 1395hh).

2. Section 409.43 is amended by revising paragraph (e)(1)(iii) to read as follows:

§ 409.43
Plan of care requirements.

(e) * * *

(1) * * *

(iii) Discharge with goals met and/or no expectation of a return to home health care and the patient returns to home health care during the 60-day episode.

PART 424—CONDITIONS FOR MEDICARE PAYMENT

3. The authority citation for part 424 continues to read as follows:

Authority: Secs. 1102 and 1871 of the Social Security Act (42 U.S.C. 1302 and 1395hh).

§ 424.22
[Amended]

4. Section 424.22 is amended by redesignating paragraph (a)(1)(v)(B)(1) as paragraph (a)(2) and by removing reserved paragraph (a)(1)(v)(B)(2).

PART 484—HOME HEALTH SERVICES

5. The authority citation for part 484 continues to read as follows:

Authority: Secs. 1102 and 1871 of the Social Security Act (42 U.S.C. 1302 and 1395hh), unless otherwise indicated.

6. Section 484.205 is amended by revising paragraphs (d) and (e) to read as follows:

§ 484.205
Basis of payment.

(d) Partial episode payment adjustment. (1) An HHA receives a national 60-day episode payment of a predetermined rate for home health services unless CMS determines an intervening event, defined as a beneficiary elected transfer or discharge with goals met or no expectation of return to home health and the beneficiary returned to home health during the 60-day episode, warrants a new 60-day episode for purposes of payment. A start of care OASIS assessment and physician certification of the new plan of care are required.

(2) The PEP adjustment will not apply in situations of transfers among HHAs of common ownership. Those situations will be considered services provided under arrangement on behalf of the originating HHA by the receiving HHA with the common ownership interest for the balance of the 60-day episode. The common ownership exception to the transfer PEP adjustment does not apply if the beneficiary moves to a different MSA or Non-MSA during the 60-day episode before the transfer to the receiving HHA. The transferring HHA in situations of common ownership not only serves as a billing agent, but must also exercise professional responsibility over the arranged-for services in order for services provided under arrangements to be paid.

(3) If the intervening event warrants a new 60-day episode payment and a new physician certification and a new plan of care, the initial HHA receives a partial episode payment adjustment reflecting the length of time the patient remained under its care. A partial episode payment adjustment is determined in accordance with § 484.235.

(e) Outlier payment. An HHA receives a national 60-day episode payment of a predetermined rate for a home health service, unless the imputed cost of the 60-day episode exceeds a threshold amount. The outlier payment is defined to be a proportion of the imputed costs beyond the threshold. An outlier payment is a payment in addition to the national 60-day episode payment. The total of all outlier payments is limited to no more than 2.5 percent of total outlays under the HHA PPS. An outlier payment is determined in accordance with § 484.240.

7. Section 484.220 is amended by revising paragraph (a)(3) and adding paragraphs (a)(4) through (6) to read as follows:

§ 484.220
Calculation of the adjusted national prospective 60-day episode payment rate for case-mix and area wage levels.

(a) * * *

(3) For CY 2011, the adjustment is 3.79 percent.

(4) For CY 2012, the adjustment is 3.79 percent.

(5) For CY 2013, the adjustment is 1.32 percent.

(6) For CY 2016 and CY 2017, the adjustment is 1.72 percent in each year.

8. Section 484.225 is revised to read as follows:

§ 484.225
Annual update of the unadjusted national prospective 60-day episode payment rate.

(a) CMS updates the unadjusted national 60-day episode payment rate on a fiscal year basis (as defined in section 1895(b)(1)(B) of the Act).

(b) For 2007 and subsequent calendar years, in accordance with section 1895(b)(3)(B)(v) of the Act, in the case of a home health agency that submits home health quality data, as specified by the Secretary, the unadjusted national prospective 60-day episode rate is equal to the rate for the previous calendar year increased by the applicable home health market basket index amount.

(c) For 2007 and subsequent calendar years, in accordance with section 1895(b)(3)(B)(v) of the Act, in the case of a home health agency that does not submit home health quality data, as specified by the Secretary, the unadjusted national prospective 60-day episode rate is equal to the rate for the previous calendar year increased by the applicable home health market basket index amount minus 2 percentage points. Any reduction of the percentage change will apply only to the calendar year involved and will not be taken into account in computing the prospective payment amount for a subsequent calendar year.

§ 484.230
[Amended]

9. Section 484.230 is amended by removing the last sentence.

10. Section 484.240 is amended by revising paragraphs (b) and (e) and adding paragraph (f) to read as follows:

§ 484.240
Methodology used for the calculation of the outlier payment.

(b) The outlier threshold for each case-mix group is the episode payment amount for that group, or the PEP adjustment amount for the episode, plus a fixed dollar loss amount that is the same for all case-mix groups.

(e) The fixed dollar loss amount and the loss sharing proportion are chosen so that the estimated total outlier payment is no more than 2.5 percent of total payment under home health PPS.

(f) The total amount of outlier payments to a specific home health agency for a year may not exceed an amount equal to 10 percent of the total payments to the specific agency under home health PPS for the year.
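
As an illustration of the methodology described in paragraphs (b) through (f) of this section, the following minimal sketch computes an outlier payment for a single hypothetical episode. The fixed dollar loss amount, loss-sharing proportion, episode payment, and imputed cost shown are placeholder values chosen only for illustration; the actual amounts are established through rulemaking and are not reproduced here.

    # Illustrative sketch of the outlier calculation in § 484.240 (hypothetical values).
    episode_payment = 2500.00        # case-mix and wage-adjusted 60-day episode payment ($)
    fixed_dollar_loss = 1800.00      # hypothetical fixed dollar loss amount ($)
    loss_sharing_proportion = 0.80   # hypothetical loss-sharing proportion
    imputed_cost = 5200.00           # hypothetical imputed cost of the episode ($)

    # Paragraph (b): the outlier threshold is the episode payment amount (or PEP
    # adjustment amount) plus the fixed dollar loss amount.
    outlier_threshold = episode_payment + fixed_dollar_loss

    # The outlier payment is a proportion of the imputed costs beyond the threshold.
    outlier_payment = 0.0
    if imputed_cost > outlier_threshold:
        outlier_payment = loss_sharing_proportion * (imputed_cost - outlier_threshold)

    print(f"Outlier threshold: ${outlier_threshold:,.2f}")   # $4,300.00
    print(f"Outlier payment:   ${outlier_payment:,.2f}")     # $720.00
    # Paragraph (f): an agency's total outlier payments for a year may not exceed
    # 10 percent of its total HH PPS payments for that year; paragraph (e) limits
    # aggregate outlier payments to 2.5 percent of total HH PPS payments.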

§ 484.245
[Removed and Reserved]

11. Section 484.245 is removed and reserved.

§ 484.250
[Amended]

12. Section 484.250(a)(2) is amended by removing the reference "§ 484.225(i)" and adding in its place the reference "§ 484.225(c)".

13. Subpart F is added to read as follows:

Subpart F—Home Health Value-Based Purchasing (HHVBP) Model Components for Medicare-Certified Home Health Agencies Within State Boundaries
484.300 Basis and scope of subpart.
484.305 Definitions.
484.310 Applicability of the Home Health Value-Based Purchasing (HHVBP) model.
484.315 Data reporting for measures and evaluation under the Home Health Value-Based Purchasing (HHVBP) model.
484.320 Calculation of the Total Performance Score.
484.325 Payments for home health services under Home Health Value-Based Purchasing (HHVBP) model.
484.330 Process for determining and applying the value-based payment adjustment under the Home Health Value-Based Purchasing (HHVBP) model.

Subpart F—Home Health Value-Based Purchasing (HHVBP) Model Components for Medicare-Certified Home Health Agencies Within State Boundaries

§ 484.300
Basis and scope of subpart.

This subpart is established under section 1115A(a)(1) of the Act (42 U.S.C. 1315a), which authorizes the Secretary to test innovative payment and service delivery models to improve coordination, quality, and efficiency of health care services furnished under Title XVIII.

§ 484.305
Definitions.

As used in this subpart—

Applicable measure means a measure for which the Medicare-certified HHA has provided a minimum of 20 home health episodes of care per year.

Applicable percent means a maximum upward or downward adjustment for a given performance year, not to exceed the following:

(1) For CY 2018 and 2019, 5 percent.

(2) For CY 2020, 6 percent.

(3) For CY 2021 and 2022, 8 percent.

Benchmark refers to the mean of the top decile of Medicare-certified HHA performance on the specified quality measure during the baseline period, calculated separately for the larger-volume and smaller-volume cohorts within each state.

Home health prospective payment system (HH PPS) refers to the basis of payment for home health agencies as set forth in §§ 484.200 through 484.245.

Larger-volume cohort means the group of Medicare-certified home health agencies within the boundaries of selected states that are participating in HHCAHPS in accordance with § 484.250.

Linear exchange function is the means to translate a Medicare-certified HHA's Total Performance Score into a value-based payment adjustment percentage.

Medicare-certified home health agency means an agency:

(1) That has a current Medicare certification; and

(2) Is being reimbursed by CMS for home health care delivered within any of the states specified in accordance with CMS's selection methodology.

New measures means those measures to be reported by Medicare-certified HHAs under the HHVBP model that are not otherwise reported by Medicare-certified HHAs to CMS and were identified to fill gaps to cover National Quality Strategy Domains not completely covered by existing measures in the home health setting.

Payment adjustment means the amount by which a Medicare-certified HHA's final claim payment amount under the HH PPS is changed in accordance with the methodology described in § 484.325.

Performance period means the time period during which data are collected for the purpose of calculating a Medicare-certified HHA's performance on measures.

Selected state(s) means those nine states that were randomly selected, via a computer algorithm designed for random selection, to compete in the HHVBP model.

Smaller-volume cohort means the group of Medicare-certified home health agencies within the boundaries of selected states that are exempt from participation in HHCAHPS in accordance with § 484.250.

Starter set means the quality measures selected for the first year of this model.

Total Performance Score means the numeric score ranging from 0 to 100 awarded to each Medicare-certified HHA based on its performance under the HHVBP model.

Value-based purchasing means measuring, reporting, and rewarding excellence in health care delivery that takes into consideration quality, efficiency, and alignment of incentives. Effective health care services and high performing health care providers may be rewarded with improved reputations through public reporting, enhanced payments through differential reimbursements, and increased market share through purchaser, payer, and/or consumer selection.

§ 484.310
Applicability of the Home Health Value-Based Purchasing (HHVBP) model.

(a) General rule. The HHVBP model applies to all Medicare-certified home health agencies (HHAs) in selected states.

(b) Nine states are selected in accordance with CMS's selection methodology. All Medicare-certified HHAs that provide services in Massachusetts, Maryland, North Carolina, Florida, Washington, Arizona, Iowa, Nebraska, and Tennessee will be required to compete in this model.

§ 484.315
Data reporting for measures and evaluation under the Home Health Value-Based Purchasing (HHVBP) model.

(a) Medicare-certified home health agencies will be evaluated using a starter set of quality measures.

(b) Medicare-certified home health agencies in selected states will be required to report information on New Measures, as determined appropriate by the Secretary, to CMS in the form, manner, and at a time specified by the Secretary.

(c) Medicare-certified home health agencies in selected states will be required to collect and report such information as the Secretary determines is necessary for purposes of monitoring and evaluating the HHVBP model under section 1115A(b)(4) of the Act (42 U.S.C. 1315a).

§ 484.320
Calculation of the Total Performance Score.

A Medicare-certified home health agency's Total Performance Score for a model year is calculated as follows:

(a) CMS will award points to the Medicare-certified home health agency for performance on each of the applicable measures in the starter set, other than New Measures.

(b) CMS will award points to the Medicare-certified home health agency for reporting on each of the New Measures in the starter set, worth up to ten percent of the Total Performance Score.

(c) CMS will sum all points awarded for each applicable measure in the starter set, weighted equally at the individual measure level, to calculate a value worth up to 90 percent of the Total Performance Score.

(d) The sum of the points awarded to a Medicare-certified HHA for each applicable measure in the starter set and the points awarded to a Medicare-certified HHA for reporting data on each New Measure is the Medicare-certified HHA's Total Performance Score for the calendar year.
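
The following is a minimal sketch, under stated assumptions, of the scoring arithmetic described in this section: points earned on applicable starter-set measures (other than New Measures) are weighted equally and contribute up to 90 percent of the Total Performance Score, and reporting on New Measures contributes up to 10 percent. The per-measure point values and New Measure counts shown are hypothetical, and the method for awarding points on each individual measure is not reproduced here.

    # Illustrative sketch of § 484.320 (hypothetical inputs).
    measure_points = [8.0, 6.5, 9.0, 4.0, 7.5]   # hypothetical points per applicable measure
    max_points_per_measure = 10.0                # hypothetical per-measure maximum

    new_measures_reported = 3                    # hypothetical count of New Measures reported
    new_measures_required = 4                    # hypothetical count of New Measures expected

    # Applicable measures are weighted equally and are worth up to 90 percent of the TPS.
    measure_score = 90.0 * (sum(measure_points) / (max_points_per_measure * len(measure_points)))

    # Reporting on New Measures is worth up to 10 percent of the TPS.
    new_measure_score = 10.0 * (new_measures_reported / new_measures_required)

    total_performance_score = measure_score + new_measure_score   # ranges from 0 to 100
    print(f"Total Performance Score: {total_performance_score:.1f}")   # 70.5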

§ 484.325
Payments for home health services under Home Health Value-Based Purchasing (HHVBP) model.

CMS will determine a payment adjustment up to the maximum applicable percentage, upward or downward, under the HHVBP model for each Medicare-certified home health agency based on the agency's Total Performance Score using a linear exchange function. Payment adjustments made under the HHVBP model will be calculated as a percentage of otherwise-applicable payments for home health services provided under section 1895 of the Act (42 U.S.C. 1395fff).

§ 484.330
Process for determining and applying the payment adjustment under the Home Health Value-Based Purchasing (HHVBP) model.

(a) General. Medicare-certified home health agencies will be ranked within the larger-volume and smaller-volume cohorts in selected states based on the performance standards that apply to the HHVBP model for the baseline year, and CMS will make value-based payment adjustments to the Medicare-certified HHAs as specified in this section.

(b) Calculation of the value-based payment adjustment amount. The value-based payment adjustment amount is calculated by multiplying the Home Health Prospective Payment final claim payment amount as calculated in accordance with § 484.205 by the payment adjustment percentage.

(c) Calculation of the payment adjustment percentage. The payment adjustment percentage is calculated as the product of: The applicable percent as defined in § 484.305, the Medicare-certified HHA's Total Performance Score divided by 100, and the linear exchange function slope.
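
To make the computation described in paragraphs (b) and (c) of this section concrete, the following minimal sketch applies the stated product to a hypothetical claim. The linear exchange function slope and the claim amount shown are placeholder values; how the slope is determined, and how the resulting adjustment is bounded by the applicable percent and applied as an upward or downward adjustment, are beyond the scope of this sketch and are not reproduced here.

    # Illustrative sketch of § 484.330(b) and (c) (hypothetical inputs).
    applicable_percent = 5.0          # maximum adjustment for the payment year, in percent
    total_performance_score = 70.5    # hypothetical Total Performance Score (0 to 100)
    lef_slope = 0.8                   # hypothetical linear exchange function slope

    # Paragraph (c): the payment adjustment percentage is the product of the applicable
    # percent, the Total Performance Score divided by 100, and the linear exchange
    # function slope.
    payment_adjustment_percentage = applicable_percent * (total_performance_score / 100.0) * lef_slope

    # Paragraph (b): the value-based payment adjustment amount is the HH PPS final
    # claim payment amount multiplied by the payment adjustment percentage.
    final_claim_payment = 2500.00     # hypothetical HH PPS final claim payment amount ($)
    adjustment_amount = final_claim_payment * (payment_adjustment_percentage / 100.0)

    print(f"Payment adjustment percentage: {payment_adjustment_percentage:.2f} percent")  # 2.82 percent
    print(f"Payment adjustment amount: ${adjustment_amount:,.2f}")                        # $70.50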

Dated: June 25, 2015.

Andrew M. Slavitt,

Administrator, Centers for Medicare & Medicaid Services.

Dated: June 26, 2015.

Sylvia M. Burwell,

Secretary.

[FR Doc. 2015-16790 Filed 7-6-15; 4:15 pm]

BILLING CODE 4120-01-P